
YouTube Implements Comprehensive Strategy to Combat Cancer Treatment Misinformation

Wesam Abo Marq
Technology
21st August 2023
YouTube will delete content opposing health expert advice (Getty)

On August 15, YouTube introduced a new strategy to combat the spread of false medical information on its video-sharing platform. The company's approach involves removing content that contradicts established advice from health authorities on subjects like COVID-19, reproductive health, cancer, hazardous substances, and more.

YouTube Unveils New Policy to Counter Medical Misinformation

In an official blog post on Tuesday, YouTube revealed its latest strategy to combat medical misinformation on its video-sharing platform. The company's plan involves restructuring its current regulations into three main categories: prevention, treatment, and denial.

"Moving forward, YouTube will streamline dozens of our existing medical misinformation guidelines to fall under three categories – Prevention, Treatment, and Denial. These policies will apply to specific health conditions, treatments, and substances where content contradicts local health authorities or the World Health Organization (WHO)," the blog post stated.

Photo Description: A screenshot of YouTube's blog post.

Under the new policy, YouTube will remove content that contradicts well-established advice from health authorities on subjects such as COVID-19, reproductive health, cancer, and harmful substances, among others.

The company stated, "While specific medical guidance can change over time as we learn more, our goal is to ensure that when it comes to areas of well-studied scientific consensus, YouTube is not a platform for distributing information that could harm people."

The Misinformation Policy’s Framework

The framework comprises three categories: prevention misinformation, treatment misinformation, and denial misinformation. Under prevention misinformation, YouTube will remove content that contradicts health authority guidance on the prevention and transmission of specific health conditions, as well as on the safety and efficacy of approved vaccines. Treatment misinformation covers content that contradicts health authority guidance on treatments for specific health conditions, including the promotion of harmful substances or practices. Denial misinformation covers content that denies the existence of particular health conditions, such as content disputing COVID-19-related deaths.

Photo Description: A screenshot of YouTube's blog post.

YouTube Takes Action Against Cancer Treatment Misinformation

YouTube plans to evaluate whether a health condition falls under its revised medical policy by considering whether it poses a significant public health risk and frequently attracts misinformation. Using cancer as an example, the company highlighted the tendency for individuals to seek guidance from platforms like YouTube following a diagnosis.

“When cancer patients and their loved ones are faced with a diagnosis, they often turn to online spaces to research symptoms, learn about treatment journeys, and find community. Our mission is to make sure that when they turn to YouTube, they can easily find high-quality content from credible health sources,” the blog stated.

In practice, this means that content discouraging proven treatments or promoting unverified remedies will be removed, according to the blog post.

YouTube to Allow Certain Content Despite Policy Violations

However, YouTube clarified that content deemed to be of public interest might still be accessible, even if it goes against the updated policy. For example, if a political candidate challenges official health advice or if a public hearing presents inaccurate information, YouTube might choose not to take it down.

In such cases, the company intends to augment videos with supplementary context to assist viewers in understanding the situation better.

YouTube as a Source of Misinformation During COVID-19

YouTube, which is owned by Google, has faced historical challenges in effectively moderating the content uploaded to its platform. In 2020, a former YouTube moderator filed a lawsuit against the company, alleging that content moderators often held their positions for less than a year due to chronic understaffing.

Photo Description: A YouTube logo displayed on a smartphone, with a computer model of the COVID-19 coronavirus in the background (Getty)

According to a published research paper, around 11% of YouTube's most-watched videos related to COVID-19 vaccines, totaling 18 million views, presented information contradicting guidance from the World Health Organization (WHO) or the Centers for Disease Control and Prevention (CDC).

Moreover, videos containing non-factual content received approximately 14 times more likes than dislikes on average.
