YouTube is set to remove content that promotes ineffective or harmful cancer treatments or discourages viewers from seeking professional medical care. The decision is part of YouTube’s effort to refine its medical moderation guidelines, drawing on its experience combating misinformation on subjects like COVID-19, vaccines, and reproductive health.

Going forward, Google’s video platform will apply its medical misinformation policies to topics that pose a high public health risk, have publicly available guidance from established health authorities, and are prone to misinformation. This approach aims to cover a broad range of medical subjects while balancing harm reduction with open debate.

YouTube clarified its stance in a recent blog post, emphasising that it will act against both harmful treatments and unproven alternatives presented as substitutes for established care. For example, promoting vitamin C supplements as a replacement for radiation therapy would be prohibited.

These policy updates come more than three years after YouTube collaborated with major tech platforms to combat COVID-19 misinformation. While the platform had previously acted against vaccine misinformation, it intensified those efforts during the pandemic and had banned all vaccine misinformation by late 2021.

YouTube also took steps against other videos violating its medical misinformation policy, including those endorsing “unsafe abortion methods” or spreading “false claims about abortion safety.”

Major tech platforms’ approaches to COVID-19 misinformation diverged after their initial united front in early 2020. Twitter stopped enforcing its COVID-19 misinformation policy in late 2022, following its acquisition by Elon Musk. Meta has also recently relaxed its moderation approach, particularly in countries like the US, where COVID-19 is no longer a national emergency.