Rahul Sachitanand
Aug 14, 2020

Audit assesses social-media platforms' progress toward responsibility

From policy enforcement to misinformation, a study conducted by IPG Mediabrands agency Reprise highlights the progress (or lack thereof) social-media platforms are making on brand safety.


In an era of rapid social-media growth and a similarly rapid uptick in misinformation, an IPG Mediabrands study examines the progress social platforms are making, or failing to make, in providing a safe brand environment.

Led by Mediabrands’ performance-marketing agency, Reprise, the study reveals that although social platforms have made some progress on improving their performance and perception, they have a long way to go.

“What this audit shows is that there is work to be done across all platforms from a media responsibility perspective, and that the different platforms each need to earn their place on a brand’s marketing plan,” says Elijah Harris, global head of social at Mediabrands’ agency Reprise. “The audit is a tool to hold platforms accountable for improving their media responsibility policies and enforcement and to ensure we can track progress over time.”

The Media Responsibility Audit, based on the Media Responsibility Principles Mediabrands recently released, assesses social-media platforms (Facebook, LinkedIn, Pinterest, Reddit, Snapchat, TikTok, Twitch, Twitter, and YouTube) against the 10 principles to establish each platform's current status and accountability. The audit consisted of 250 questions and focused on establishing a benchmark for what a responsible platform looks like.

Here's what we learned from this audit: 

  • YouTube tops the overall rankings and performs best against several principles, a testament to the changes it made in response to advertiser brand-safety concerns raised three years ago. In contrast, the embattled TikTok has the most work to do. 
  • Platforms fall short by not backing up their policies with consistent enforcement. Most platforms offer some level of enforcement reporting, but it is inconsistent and limited in scope. 
  • Misinformation is a challenge across most platforms. While some platforms work with many organisations to combat misinformation, others work with none at all. The audit showed that even minor instances of misinformation can lead to unsafe ad placements for advertisers.
  • Marketers are wrestling with the challenge of dealing with ad placements for non-registered users, since this experience varies across platforms. 
  • There is an urgent need for third-party verification, given the massive amount of disinformation on these platforms. Worryingly, few platforms have controls for protecting advertisers from adjacency to content in objectionable or harmful categories (as in GARM’s brand-safety framework). The audit shows that the industry needs to promote and use third-party verification partners more widely.
