AI offers endless possibilities for working more efficiently and uncovering new ideas. It has also made it much easier to create child sexual abuse material (CSAM).
In a disturbing new campaign titled #TakeItDown, the nonprofit ChildFund International illustrates just how easy it is for predators to “hide in plain sight” thanks to the internet and tech companies’ limited responsibilities when it comes to reporting and taking down CSAM.
A video spot, created in partnership with social impact agency Wrthy, features a predator whose unassuming face, shown as he speaks with his kids or colleagues, quickly transforms into that of a pale monster when he sits down at his computer.
“If you ask me, there’s never been a better time to be a monster,” he says, after describing how technology has made it easier than ever for predators to review, rate and recommend CSAM.
The video is part of a multimedia campaign that also includes a widget allowing viewers to add their voices to demand action from policymakers, as well as a mini-documentary. The short film features Sonya Ryan, whose daughter was murdered by an online predator, and Jim Cole, a retired Homeland Security Investigations special agent with knowledge of the technology tools that exist but aren’t used.
“Instead of being a place for learning, playing and connecting with friends and family, the internet has become a place rife with ways to exploit and abuse children,” said Erin Kennedy, ChildFund International’s vice president of external affairs and partnerships, in a press release.
Tech companies are not legally required to search for CSAM shared or hosted on their platforms. While they are required to report it once made aware of its existence, these companies typically face no penalty for failing to remove it quickly.
The National Center for Missing and Exploited Children noted in a report earlier this year that it receives around 80,000 reports to its Cyber TipLine each day, with a majority of the CSAM reported living on the open web (as opposed to the dark web, which is much harder to access).
This campaign puts the onus on tech companies, whose platforms serve as hosts for this media. “We want technology companies to recognize their responsibility,” noted Kennedy. “Profit should not come before the protection and well-being of children.”