Using Deep Neural Networks to Detect and Report Child Sexual Abuse Material Online
Deep neural networks for image processing can now assist reviewers sorting through large volumes of images by prioritizing the content most likely to be child sexual abuse material (CSAM) for review.
Using the internet as a means to spread content that sexually exploits children is one of the worst abuses imaginable. That’s why, since the early 2000s, Google has been investing in technology and teams, and working closely with expert organizations such as the Internet Watch Foundation, to fight the spread of child sexual abuse material (CSAM) online.
Identifying and fighting the spread of CSAM is an ongoing challenge, and governments, law enforcement, NGOs, and industry all have a critically important role in protecting children from this horrific crime. While technology alone is not a panacea for this societal challenge, this work marks a big step forward in helping more organizations do this challenging work at scale.
Today Google is introducing the next step in this fight: cutting-edge artificial intelligence (AI) that significantly advances its existing technologies to dramatically improve how service providers, NGOs, and other technology companies review this content at scale.
By using deep neural networks for image processing, the system can now assist reviewers sorting through large volumes of images by prioritizing the content most likely to be CSAM for review. While historical approaches to finding this content have relied exclusively on matching against hashes of known CSAM, the classifier keeps up with offenders by also targeting content that has not been previously confirmed as CSAM. Quick identification of new images means that children who are being sexually abused today are much more likely to be identified and protected from further abuse.
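To make the triage idea concrete, the sketch below shows how classifier scores could be used to order a review queue so that the highest-risk images are seen first. This is a minimal illustration only: the `ReviewItem`, `prioritize`, and `classify` names are assumptions for this example and do not reflect Google's actual Content Safety API or model internals.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ReviewItem:
    image_id: str
    score: float  # estimated probability that the image violates policy (assumed output of a DNN classifier)

def prioritize(image_ids: List[str], classify: Callable[[str], float]) -> List[ReviewItem]:
    """Score each image with the (hypothetical) classifier and return a review
    queue sorted so the highest-priority content is reviewed first."""
    items = [ReviewItem(image_id=i, score=classify(i)) for i in image_ids]
    return sorted(items, key=lambda item: item.score, reverse=True)

if __name__ == "__main__":
    # Stand-in scores for illustration only; a real system would run a deep
    # neural network over the decoded image content to produce these values.
    fake_scores = {"img_001": 0.12, "img_002": 0.97, "img_003": 0.54}
    queue = prioritize(list(fake_scores), classify=lambda i: fake_scores[i])
    for item in queue:
        print(f"{item.image_id}: {item.score:.2f}")
```

The point of ordering by score rather than filtering by a hash list is that previously unseen material still surfaces near the top of the queue, which is what allows reviewers to act on new content quickly.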
Google is making this available for free to NGOs and industry partners via its Content Safety API, a toolkit that increases the capacity to review content in a way that requires fewer people to be exposed to it.
This system can help a reviewer find and take action on 700% more CSAM content over the same time period.
If you’re interested in using the Content Safety API service at your organization, you can learn more about it here.