Video-sharing platform YouTube has announced sweeping and immediate bans on false claims that vaccines are dangerous and cause health issues like autism, cancer or infertility.
YouTube announced a sweeping crackdown on vaccine misinformation Wednesday, booting popular anti-vaccine influencers from its site and deleting false claims made about a range of immunizations.
YouTube’s latest attempt to stem a tide of vaccine misinformation comes as countries around the globe struggle to convince a vaccine-hesitant public to accept the free immunizations that scientists say will end the COVID-19 pandemic, which began 20 months ago. The tech platform, which is owned by Google, had already moved to ban COVID-19 vaccine misinformation last year, at the height of the pandemic.
In an emailed statement to The Associated Press, anti-vaccine activist Robert F. Kennedy Jr., one of the influencers removed in the crackdown, criticized the ban: “There is no instance in history when censorship and secrecy have advanced either democracy or public health.”
YouTube declined to provide details on how many accounts were removed in the crackdown.
Under its new policy, YouTube says it will remove misinformation about any vaccine that has been approved by health authorities, such as the World Health Organization, and is currently being administered. False claims that those vaccines are dangerous or cause health issues, like cancer, infertility or autism — theories that scientists have discredited for decades but that have endured on the internet — will also be removed.
“The concept that vaccines harm — instead of help — is at the foundation of a lot of misinformation,” said Jeanine Guidry, a media and public health professor at Virginia Commonwealth University School of Medicine.
She added that, if enforced properly, the new rules could stop bad information from influencing, for example, a new parent who is using the internet to research whether to vaccinate their child.
But, as is common when tech platforms announce stricter rules, loopholes remain for anti-vaccine misinformation to spread on YouTube.
Claims about vaccines that are being tested will still be allowed. Personal stories about reactions to the vaccine will also be permitted, as long as they do not come from an account that has a history of promoting vaccine misinformation.
Despite tech companies announcing a string of new rules around COVID-19 and vaccine misinformation during the pandemic, falsehoods have still found big audiences on the platforms.
In March, Twitter began labelling content that made misleading claims about COVID-19 vaccines and said it would ban accounts that repeatedly share such posts. Facebook, which also owns Instagram, had already prohibited posts claiming COVID-19 vaccines cause infertility or contain tracking microchips, and in February announced it would similarly remove claims that vaccines are toxic or can cause health problems such as autism.
Yet popular anti-vaccine influencers remain active on Facebook, Instagram and Twitter, where they use the platforms to sell books or videos. On Facebook and Instagram alone, a handful of anti-vaccine influencers still have a combined 6.4 million followers, according to the social media watchdog group the Center for Countering Digital Hate. And COVID-19 vaccine misinformation has been so pervasive on Facebook that President Joe Biden in July accused influencers on the platform of “killing people” with falsehoods about the COVID-19 vaccine.
Other platforms have taken a harder line. Pinterest, for example, prohibited any kind of vaccine misinformation even before the pandemic began. Now, if users search for content about vaccines on the site, they are directed to authoritative websites operated by the Centers for Disease Control and Prevention and the WHO.

———
Associated Press writers David Klepper in Providence, Rhode Island, and Barbara Ortutay in Oakland, California, contributed to this report.