The online video sharing and social media platform YouTube announced Thursday that it is expanding its medical misinformation policies with new guidelines on currently administered vaccines that have been approved and confirmed to be safe and effective by local health authorities and the World Health Organization (WHO).
In an official blog post, YouTube, which is owned by Google, said that creating policies around medical misinformation comes with unique difficulties and trade-offs.
It noted that scientific knowledge develops as new data emerges, and that firsthand personal experience often plays a significant part in online debate. Vaccines, in particular, have been the subject of heated controversy over the years, despite consistent advice from health authorities about their efficacy.
“Our Community Guidelines already prohibit certain types of medical misinformation. We’ve long removed content that promotes harmful remedies, such as saying drinking turpentine can cure diseases. At the onset of COVID-19, we built on these policies when the pandemic hit, and worked with experts to develop 10 new policies around COVID-19 and medical misinformation. Since last year, we’ve removed over 130,000 videos for violating our COVID-19 vaccine policies,” YouTube’s official blog post said.
YouTube said that throughout this work, it gained valuable insights into how to create and enforce complex medical misinformation rules at scale. Working with health officials, it sought to strike a balance between its commitment to an open platform and the need to remove extremely hazardous material.
It went on to explain that it has seen erroneous claims about coronavirus vaccines spill over into vaccine misinformation more broadly, making it more essential than ever to extend the work it began with COVID-19 to other vaccines.
“Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of the disease, or contains misinformation on the substances contained in vaccines will be removed,” YouTube said.
“This would include content that falsely says that approved vaccines cause autism, cancer, or infertility, or that substances in vaccines can track those who receive them. Our policies not only cover specific routine immunizations like for measles or Hepatitis B but also apply to general statements about vaccines,” it added.
In developing the new rules, YouTube said it reviewed its COVID-19 standards and consulted local and international health organizations.
Its new guidelines on vaccine side effects map to public vaccination resources provided by health authorities and backed by medical consensus. YouTube said the policy changes take effect immediately, though, as with any major change, it will take some time for its systems to fully ramp up enforcement.
“There are important exceptions to our new guidelines. Given the importance of public discussion and debate to the scientific process, we will continue to allow content about vaccine policies, new vaccine trials, and historical vaccine successes or failures on YouTube,” it said.
“Personal testimonials relating to vaccines will also be allowed, so long as the video doesn’t violate other Community Guidelines, or the channel doesn’t show a pattern of promoting vaccine hesitancy,” it added.
YouTube said all of this adds to its continuing efforts to elevate authoritative health information on its platform and connect viewers with trustworthy, high-quality health content and sources.
“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community,” the post concluded.