YouTube will start banning videos that spread disinformation and conspiracy theories about vaccines for Covid-19 and other ailments.
It said this was part of an effort to increase safety for viewers and cut out dangerous influences.
In a blog post titled “Managing harmful vaccine content on YouTube,” the Google-owned platform said it is banning misinformation about any vaccine that has been approved by reputable health authorities or is in global distribution.
“We’ve long removed content that promotes harmful remedies, such as saying drinking turpentine can cure diseases,” YouTube wrote. “We built on these policies when the pandemic hit, and worked with experts to develop 10 new policies around Covid-19 and medical misinformation.”
“We’re now at a point where it’s more important than ever to expand the work we started with Covid-19 to other vaccines,” the platform added. “Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of the disease, or contains misinformation on the substances contained in vaccines will be removed.”
YouTube said it has removed more than 130,000 videos for spreading vaccine misinformation since last year.
The platform has faced mounting pressure and criticism since the pandemic escalated last year and vaccines entered development. Many critics have argued that misinformation about coronavirus vaccines has been widely amplified by social media sites such as YouTube, Facebook and Twitter.
“Developing robust policies takes time,” Matt Halprin, YouTube vice president of global trust and safety, told The Washington Post. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”
As part of the ban, YouTube said it is also terminating accounts belonging to prominent spreaders of anti-vaccine material, such as Robert F. Kennedy Jr. and Joseph Mercola, neither of whom has any formal training or education in vaccinology.