Simon Gwynn
Oct 12, 2021

Facebook to encourage teens to ‘take a break’ from Instagram, Nick Clegg says

Social media giant’s VP of global affairs also endorsed greater regulation of tech platforms.

Nick Clegg: interviewed on CNN

Facebook will introduce three new tools to tackle harmful use of Instagram by teens, its vice-president of global affairs Nick Clegg said in an interview on Sunday (10 October).

Speaking on CNN’s State of the Union, Clegg said the company would introduce a feature called “take a break”, prompting young users to log off Instagram where evidence suggests they could be using it in a potentially harmful way.

Alongside this, it will prompt teen users who are looking at the same content repeatedly to look at something different, and introduce new controls for their parents.

“We're going to introduce new controls for adults of teens, on an optional basis obviously, so that adults can supervise what their teens are doing online,” he said. 

“Secondly, we're going to introduce something which I think will make a considerable difference, which is where our systems see that a teenager is looking at the same content over and over again and it’s content which may not be conducive to their wellbeing, we will nudge them to look at other content.”

In other circumstances, “we will be prompting teens to simply just take a break from using Instagram, and I think these are exactly the kinds of things which are in line with ongoing work we've been doing in co-operation with experts for many years". He added: "We clearly want to redouble our efforts going forward.”

The moves, which come alongside a decision to pause development on the planned teen-focused platform Instagram For Kids, follow Facebook whistleblower Frances Haugen giving evidence to the US Senate last week, in which she accused the company of prioritising “profit over safety”.

On 25 October, Haugen will give evidence to the UK Parliament’s Joint Committee on the draft Online Safety Bill, through which the government plans to create a new regulatory framework to tackle harmful online content.

Conservative MP Damian Collins, who chairs the Committee, said: “Frances Haugen's evidence has so far strengthened the case for an independent regulator with the power to audit and inspect the big tech companies.

“Mark Zuckerberg and Nick Clegg have complained that she has only presented a partial view of the company, whereas they want to stop any insights into how the company manages harmful content to get into the public domain, unless they have personally approved them.

“There needs to be greater transparency on the decisions companies like Facebook take when they trade off user safety for user engagement. We look forward to discussing these issues with Frances Haugen.”

CNN host Dana Bash used the interview to ask about Clegg’s views on greater regulation of tech companies—something Facebook has long said it supports.

On possible legislation requiring parental consent for children under 16 to use social media, Clegg said: “Of course, if lawmakers want to set down rules for us and for TikTok, YouTube and Twitter about exactly how young people should operate online, we, of course, will abide by the law and I think it's right that this is a subject of great bipartisan interest and discussion, because there's nothing more important to any of us than our kids, and I think, by the way, regulation would be very useful in many ways.”

He also endorsed the idea of providing access to Facebook’s algorithms for regulators. “Yes, we need greater transparency so the systems that we have in place… including not only the 40,000 people we employ on this but also the multibillion-dollar investments we've made into algorithmic [and] machine learning systems, should be held to account—if necessary by regulation—so that people can match what our systems say they're supposed to do from what actually happens.”

But he spoke in defence of employing algorithms to order the content users see, after Haugen claimed that Facebook’s use of them was dangerous.

“If you remove the algorithms, which is I think one of [Haugen’s] central recommendations, the first thing that would happen is that people would see more, not less, hate speech, more, not less, misinformation, because these algorithms are designed precisely to work almost like giant spam filters to identify and deprecate bad content, and you know I really do think we should remember that technology, of course it has downsides, but also has very powerful, positive effects.”

Campaign UK
