YouTube bans all anti-vax videos
YouTube is to ban all anti-vax videos and other misinformation about vaccines, it has announced.
The move is the latest by the video service as it works out its policy on misinformation amid the coronavirus pandemic.
YouTube is just one of many services that have gradually tightened their policies on coronavirus misinformation, as anti-vax and other misleading content spreads across the internet.
The Google-owned video platform said its ban on Covid-19 vaccine misinformation, introduced last year, had resulted in the removal of 130,000 videos so far, but that broader rules were needed to clamp down on false claims about other vaccines appearing online.
Under the new rules, any content which falsely alleges that any approved vaccine is dangerous and causes chronic health problems will be removed, as will videos that include misinformation about the content of vaccines.
Social media and internet platforms have been repeatedly urged to do more to tackle the spread of online misinformation. Although millions of posts have been blocked or taken down, and most platforms have introduced new rules and prompts pointing to official health information, critics have suggested not enough has been done to slow the spread of harmful content since the start of the pandemic.
YouTube said it was taking its latest action in response to seeing vaccine misinformation begin to branch out into other false claims.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube said in a blog post announcing the rule update.
“Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines will be removed.
“This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them. Our policies not only cover specific routine immunisations like for measles or Hepatitis B, but also apply to general statements about vaccines.”
YouTube added that there would be “important exceptions” to the new guidelines, including content about “vaccine policies, new vaccine trials and historical vaccine successes or failures”, as well as personal testimonies relating to vaccines, which the company said were important parts of public discussion around the scientific process.
“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community,” the company said.
Additional reporting by Press Association