Emily Tan
Aug 2, 2017

YouTube: Machine learning doubles removals of extremist videos

More than 75 percent of videos removed over the past month were taken down before getting flagged by a human.


In an update on its efforts to create a brand-safe environment, YouTube reported that in the past month its use of machine learning has doubled the number of videos removed for violent extremism.

A little over a month ago, Google announced several steps it would be taking to combat extremist content on YouTube. 

According to its latest update, these efforts are paying off. For starters, its machine learning systems are getting better at identifying extremist videos. More than 75 percent of videos removed over the past month were taken down before getting flagged by a human. 

This speed is necessary considering more than 400 hours of content are uploaded to YouTube every minute. 

And accuracy is improving: "While these tools aren’t perfect, and aren’t right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed," the YouTube team wrote in a blog post. 

The technology is reinforced by the involvement of more human experts. So far, Google has on-boarded 15 of the 50 expert NGOs and institutions promised in its June announcement to its Trusted Flagger programme. 

YouTube also plans to be stricter in the coming weeks with videos that "aren’t illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism".

Videos that don't directly violate policies but do contain controversial religious or supremacist content will be placed in a "limited state". 

"The videos will remain on YouTube behind an interstitial, won’t be recommended, won’t be monetised, and won’t have key features including comments, suggested videos, and likes," the blog post said. 

This new treatment will be implemented in the coming weeks on the desktop version of YouTube and will reach mobile soon after. 

YouTube has also progressed in its plans to take an active stance in countering terrorism. The platform has started rolling out features from Jigsaw’s Redirect Method, so that users who search for sensitive keywords on YouTube are redirected to a playlist of curated YouTube videos that "directly confront and debunk violent extremist messages".

It's also taking this activism on the road. Last week, the UK chapter of its YouTube Creators for Change programme hosted a two-day workshop for 13-18 year-olds aimed at helping them learn how to participate safely and responsibly on the internet.

YouTube has pledged to expand the programme’s reach to 20,000 more teens across the UK.

Today's announcement from YouTube coincides with UK home secretary Amber Rudd's visit to Silicon Valley this week. Rudd is visiting California to warn tech giants that the UK could introduce new laws to clamp down on extremist content. 

Campaign UK
