Bailey Calfee
Sep 12, 2023

ChildFund PSA calls out AI’s role in the proliferation of child sexual abuse material online

The campaign features a predator who describes how “there’s never been a better time to be a monster.”

(Photo credit: ChildFund International / YouTube)
AI offers endless possibilities for working more efficiently and uncovering new ideas. It has also made it much easier to create child sexual abuse material (CSAM).
In a disturbing new campaign titled #TakeItDown, the nonprofit ChildFund International illustrates just how easy it is for predators to “hide in plain sight” thanks to the internet and tech companies’ limited responsibilities when it comes to reporting and taking down CSAM. 
As reported by the Washington Post in June, AI-generated CSAM has increased month over month since AI tools became widely available to the public in the fall of 2022. And while every U.S. state attorney general has signed a petition calling on Congress to study the issue and create safeguards against the proliferation of this content, few barriers currently prevent this material from being generated or distributed.
A video spot, created in partnership with social impact agency Wrthy, features a predator whose face, unassuming when he speaks to his kids or colleagues, quickly transforms into that of a pale monster when he uses his computer.
“If you ask me, there’s never been a better time to be a monster,” he says after describing how technology has allowed predators the increased ability to review, rate and recommend CSAM. 
The video is part of a multimedia campaign that also includes a widget that lets viewers add their voices to the demand for action from policymakers, as well as a mini-documentary. Featured in the short film are Sonya Ryan, whose daughter was murdered by an online predator, and Jim Cole, a retired Homeland Security Investigations special agent familiar with the technology tools that exist to combat CSAM but go unused.
“Instead of being a place for learning, playing and connecting with friends and family, the internet has become a place rife with ways to exploit and abuse children,” said Erin Kennedy, ChildFund International’s vice president of external affairs and partnerships, in a press release.
Tech companies are not legally required to search for CSAM shared or stored on their platforms. While they are required to report it once they have been made aware of its existence, these companies are rarely punished for failing to remove it quickly.
The National Center for Missing and Exploited Children noted in a report earlier this year that it receives around 80,000 reports to its CyberTipline each day, with the majority of reported CSAM residing on the open web (as opposed to the dark web, which is much harder to access).
This campaign puts the onus on tech companies, whose platforms serve as hosts for this media. “We want technology companies to recognize their responsibility,” noted Kennedy. “Profit should not come before the protection and well-being of children.”


Campaign US
