YouTube tightens approach to eating disorder-related content


YouTube said it will implement a revised approach to prevent eating disorder-related content on the site from fueling harmful misinformation.


In its latest step to battle health misinformation, YouTube said this month that it will revise its approach to eating disorder-related content.

In a blog post, the video platform said it would be updating its monitoring of such content informed by third-party experts, in a way that “creates space for community, recovery and resources, while continuing to protect viewers.”

As part of that effort, YouTube worked with organizations such as the National Eating Disorders Association to develop a framework for community guidelines, age restrictions and crisis resource panels on its videos.

The company also noted that while it has always had policies requiring the removal of content that encourages eating disorders, the updated policies will be more specific about preventing content from becoming a trigger for vulnerable viewers.

In particular, YouTube will ban content depicting imitable disordered eating behaviors, such as purging or severe calorie restriction, as well as “weight-based bullying.” It will also restrict certain videos to viewers 18 and older.

The overall approach aims to balance content that details someone’s personal journey in recovering from an eating disorder against content that could trigger impressionable young people into harmful behavior.

“We’re thinking about how to thread the needle in terms of essential conversations and information that people might have,” Garth Graham, YouTube’s global head of healthcare, told CNN. “Allowing people to hear stories about recovery and allowing people to hear educational information but also realizing that the display of that information … can serve as a trigger as well.”

The move builds upon YouTube’s recent efforts to filter misinformation on its site related to health content. 

In 2021, YouTube expanded its misinformation policies to all vaccines, not just COVID-19 ones, and said it would ban anti-vaccine content. It also made moves to bolster authoritative health content on the site by prioritizing credible resources like universities, hospitals or experts.

“[W]e wanted to move forward in terms of how the tech industry — not just YouTube, but certainly YouTube being a forward-leaning part of it — elevates credible health information,” Graham told MM+M in a previous interview. “We need to define what credible, authoritative health information is.”

The proliferation of health misinformation during COVID-19 has made the issue a priority for lawmakers as well, with some Democrats crafting proposals that would crack down on tech giants like Facebook or Twitter if they allowed misinformation to spread.

While tangible legislation has yet to pass, tech companies such as Facebook, TikTok and YouTube have all announced efforts in recent years to address the issue.

However, under Elon Musk’s leadership, Twitter has rescinded some of its misinformation policies, including its COVID-19 policy.

Despite the recent end of the public health emergency for COVID-19, health misinformation remains a massive issue. Growing numbers of Gen Z and millennials are turning to TikTok and other social media platforms instead of their doctors. One recent study found that 18% of the U.S. population was seeking health information and guidance from social media influencers.

This story first appeared on MM+M.
