Bailey Calfee
Sep 12, 2023

ChildFund PSA calls out AI’s role in the proliferation of child sexual abuse material online

The campaign features a predator who describes how “there’s never been a better time to be a monster.”

(Photo credit: ChildFund International / YouTube)
AI offers endless possibilities for working more efficiently and uncovering new ideas. It has also made it much easier to create child sexual abuse material (CSAM).
 
In a disturbing new campaign titled #TakeItDown, the nonprofit ChildFund International illustrates just how easy it is for predators to “hide in plain sight” thanks to the internet and tech companies’ limited responsibilities when it comes to reporting and taking down CSAM. 
 
As reported by the Washington Post in June, AI-generated CSAM has increased month-over-month since AI tools became more widely available to the public in fall 2022. And while every U.S. state’s attorney general has signed a petition urging Congress to further study the issue and create guards against the proliferation of this content, there are still few barriers preventing it from being generated or distributed.
 
A video spot, created in partnership with social impact agency Wrthy, features a predator whose face, unassuming as he speaks to his kids or colleagues, quickly transforms into that of a pale monster when he sits down at his computer. 
 
 
“If you ask me, there’s never been a better time to be a monster,” he says, after describing how technology has given predators ever-greater ability to review, rate and recommend CSAM. 
 
The video is part of a multimedia campaign that also includes a widget allowing viewers to add their voices to demands for action from policymakers, as well as a mini-documentary. The short film features Sonya Ryan, whose daughter was murdered by an online predator, and Jim Cole, a retired Homeland Security Investigations special agent familiar with the technology tools that exist but go unused.
 
“Instead of being a place for learning, playing and connecting with friends and family, the internet has become a place rife with ways to exploit and abuse children,” said Erin Kennedy, ChildFund International’s vice president of external affairs and partnerships, in a press release.
 
Tech companies are not legally required to search for CSAM shared or hosted on their platforms. While they are required to report it once they have been made aware of its existence, they typically face no penalty for failing to remove it quickly.
 
The National Center for Missing and Exploited Children noted in a report earlier this year that it receives around 80,000 reports to its CyberTipline each day, with the majority of the reported CSAM residing on the open web (as opposed to the dark web, which is much harder to access). 
 
This campaign puts the onus on tech companies, whose platforms serve as hosts for this media. “We want technology companies to recognize their responsibility,” noted Kennedy. “Profit should not come before the protection and well-being of children.”

 

Source:
Campaign US