Thorn, a nonprofit that builds technology to combat child sexual exploitation, launched a tool on Wednesday to help small- and medium-size companies find, remove and report child sexual abuse material.
The tool, called Safer, is designed for companies that lack the resources to build their own systems.
“To eliminate the trade of this material and stop the revictimization of children in these images and videos, you need to have blanket coverage of tech companies,” said Julie Cordua, CEO of Thorn, which was co-founded by the actor and investor Ashton Kutcher.
Cordua noted that while large platforms like Facebook, YouTube and Twitter have the employees and motivation to build their own tools for detecting this material, smaller companies don’t.
“It’s expensive, it’s a heavy lift operationally and it requires specialist knowledge,” she said. “We saw this gap and thought we could build a shared system to get the rest of the technology ecosystem on board to detect and remove child sexual exploitation material.”
At its launch on Wednesday, Safer is in use by 10 customers, including the image-sharing sites Flickr, Imgur and VSCO; the workplace communication tool Slack; the blogging platform Medium; the video-sharing site Vimeo; and the web-hosting company GoDaddy. According to Thorn, Safer has already detected 100,000 child sexual abuse images on these platforms.
The launch of Safer comes as record amounts of child sexual abuse material circulate online. In 2019, more than 69 million images and videos were reported to the National Center for Missing and Exploited Children.
Safer supplies customers with a large dataset of child sexual abuse images that have been reviewed by law enforcement and converted into digital fingerprints called “hashes.” These hashes can be used to search online platforms for copies of the original images. When the system finds a copy, or when someone tries to upload one, it automatically deletes the file and reports it to the National Center for Missing and Exploited Children.
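In rough outline, that matching step is a set-membership check on fingerprints. The Python sketch below is purely illustrative, not Thorn’s actual code: the `known_hashes` set and the print-statement actions are hypothetical stand-ins, and it uses a SHA-256 digest for simplicity, where a production system would use perceptual hashes designed to survive resizing and re-encoding.

```python
import hashlib

# Hypothetical in-memory stand-in for the vendor-supplied hash database.
# A real system would use perceptual hashes; SHA-256 is used here only to
# keep the sketch self-contained.
known_hashes: set[str] = set()

def fingerprint(image_bytes: bytes) -> str:
    """Reduce an image to a fixed-length digital fingerprint ("hash")."""
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(image_bytes: bytes) -> bool:
    """Check an upload against known hashes; return True if it was blocked."""
    digest = fingerprint(image_bytes)
    if digest in known_hashes:
        # Stand-ins for the real actions: removing the file via the
        # platform's storage API and filing a report with NCMEC.
        print(f"match on {digest[:12]}...: deleting and reporting to NCMEC")
        return True
    return False
```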
The tool also uses machine learning to proactively search for suspected child sexual exploitation material that hasn’t yet been reviewed by law enforcement and flag it for human review. If the human reviewer agrees the image was flagged correctly, the reviewer can report it, delete it and prevent anyone else from uploading it.
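A minimal sketch of that flag-and-review loop, again with hypothetical names: the `classify` stub, the 0.9 threshold and the queue are illustrative assumptions, not Safer’s real pipeline.

```python
from queue import SimpleQueue

FLAG_THRESHOLD = 0.9                       # hypothetical confidence cutoff
review_queue: SimpleQueue = SimpleQueue()  # uploads awaiting a human decision

def classify(image_bytes: bytes) -> float:
    """Stand-in for a trained model returning a 0-to-1 suspicion score."""
    return 0.0  # a real deployment would run an ML classifier here

def maybe_flag(digest: str, image_bytes: bytes) -> None:
    """Queue suspicious, never-before-seen uploads for human review."""
    if classify(image_bytes) >= FLAG_THRESHOLD:
        review_queue.put(digest)  # nothing is removed until a human agrees

def resolve_review(digest: str, confirmed: bool, blocklist: set[str]) -> None:
    """Apply the reviewer's decision: report, delete and block re-uploads."""
    if confirmed:
        blocklist.add(digest)  # future copies of this image now match
        print(f"{digest[:12]}...: reported and deleted")
```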
Any newly reported material is added to a shared list of hashes called the “Saferlist,” which all other Safer customers can use to prevent an image deleted from one platform from being reuploaded to theirs.
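That sharing step is what turns one platform’s takedown into network-wide blocking. Sketched under the same assumptions, with an in-memory set standing in for what would really be a synced service:

```python
# Hypothetical stand-in for the shared "Saferlist"; in practice it would be
# a service each customer syncs against. Only hashes travel between
# platforms, never the images themselves.
saferlist: set[str] = set()

def publish_hash(digest: str) -> None:
    """Contribute the hash of newly confirmed material to the shared list."""
    saferlist.add(digest)

def refresh_blocklist(local_blocklist: set[str]) -> None:
    """Pull the shared list so a deletion on one platform blocks re-uploads
    on every platform that syncs."""
    local_blocklist |= saferlist
```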
Imgur was the first company to pilot a prototype of Safer, in January 2019. The tool found its first image within 20 minutes, and when human reviewers investigated the account further, they discovered an additional 231 files to be reported and deleted.
“People are really afraid to talk about it, but this is a problem that all tech companies are facing,” said Sarah Schaaf, co-founder of Imgur. “We all know this is an issue worth combating, but when you are a smaller or mid-sized company, it’s tough. It requires a large financial investment and experts who know what to do.”
Safer provides access to the expertise and infrastructure to deal with these illegal images without having to build and maintain the technology in house, Schaaf said.
Imgur had been reluctant to talk about the issue publicly over fears that people would think it was a problem specific to the company rather than an industry-wide one.
“It’s an evil topic and you hate to even imagine that it exists on your platform, but that’s why we need to make the tools to fight it and work towards fixing it,” she said. “But you need to get past worrying what people might think and realize this is bad for your platform and humanity.”