
In its latest step to battle health misinformation, YouTube announced this week it would be revising its approach to eating disorder-related content.
In a blog post, the video platform said it would update how it moderates such content, informed by third-party experts, in a way that “creates space for community, recovery and resources, while continuing to protect viewers.”
As part of that effort, YouTube worked with organizations such as the National Eating Disorders Association (NEDA) to develop a framework around community guidelines, age restrictions and crisis resource panels on its videos.
The company also noted that while it has always had policies to take down content that encourages eating disorders, the updated rules will be more specific about preventing content from triggering vulnerable viewers.
In particular, YouTube will remove content that shows or describes imitable disordered eating behaviors, such as purging or severe calorie restriction, as well as “weight-based bullying.” It will also restrict certain videos so they cannot be viewed by people under the age of 18.
The overall approach aims to balance allowing content that details someone’s personal journey of recovering from an eating disorder against limiting content that could prompt impressionable young viewers to engage in harmful behavior.
“We’re thinking about how to thread the needle in terms of essential conversations and information that people might have,” Garth Graham, YouTube’s global head of healthcare, told CNN. “Allowing people to hear stories about recovery and allowing people to hear educational information but also realizing that the display of that information … can serve as a trigger as well.”
In 2021, YouTube expanded its misinformation policies beyond COVID-19 to cover all vaccines and said it would ban anti-vaccine content. It also moved to bolster authoritative health content on the site by prioritizing credible sources such as universities, hospitals and experts.
“[W]e wanted to move forward in terms of how the tech industry — not just YouTube, but certainly YouTube being a forward-leaning part of it — elevates credible health information,” Graham told MM+M in a previous interview. “We need to define what credible, authoritative health information is.”
The proliferation of health misinformation during the COVID-19 pandemic has also made the issue a priority for lawmakers, with some Democrats crafting proposals that would crack down on tech giants like Facebook or Twitter if they allow misinformation to spread.
While tangible legislation has yet to pass, tech companies such as Facebook, TikTok and YouTube have all announced efforts in recent years to address the issue.
However, under Elon Musk’s leadership, Twitter has rescinded some of its misinformation policies, including its COVID-19 policy.
Despite the recent end of the COVID-19 public health emergency, health misinformation remains a massive problem. Growing numbers of Gen Z and millennial users are turning to TikTok and other social media platforms instead of their doctors; one recent study found that 18% of the U.S. population seeks health information and guidance from social media influencers.