Meta asks if it can let people post coronavirus misinformation on Facebook and Instagram

The company now wants to address misinformation by leaving it up but ‘labeling or demoting it’

Adam Smith
Wednesday 27 July 2022 13:04 BST


Meta has asked its oversight board whether its measures against coronavirus misinformation should stay in place.

The company, which owns Facebook, Instagram, and WhatsApp, initially removed misinformation only when local partners with relevant expertise told it that a particular piece of content (like a specific post on Facebook) could contribute to a risk of imminent physical harm.

Eventually, its policies were expanded to remove entire categories of false claims worldwide.

Now, however, the company has asked the board (which has 20 members, including politicians, lawyers, and academics, and is funded by a $130m trust from the social media giant) whether it should “address this misinformation through other means, like labeling or demoting it either directly or through our third-party fact-checking program.”

Meta’s policy of removing content has had mixed results, and its effectiveness has been questioned.

Researchers running experiments on the platform found that two brand-new accounts they had set up were recommended 109 pages containing anti-vaccine information in just two days.

But Meta’s president of global affairs, former UK deputy prime minister Nick Clegg, says that “life is increasingly returning to normal” in some countries.

“This isn’t the case everywhere and the course of the pandemic will continue to vary significantly around the globe — especially in countries with low vaccination rates and less developed healthcare systems. It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in.”

Meta is asking for guidance because “resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic”, he wrote.

During the pandemic, Meta’s head of virtual reality Andrew Bosworth said that “individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,” adding that he did not “feel comfortable at all saying they don’t have a voice because I don’t like what they said.”

He went on: “If your democracy can’t tolerate the speech of people, I’m not sure what kind of democracy it is. [Facebook is] a fundamentally democratic technology”.

A study conducted by the non-profit Center for Countering Digital Hate and Anti-Vax Watch suggested that close to 65 per cent of the vaccine-related misinformation on Facebook was coming from just 12 people. Researchers also said that recommendation algorithms, which are still generally designed to boost whatever content engages the most people, even conspiracy theories, were at the heart of the problem.

“For a long time the companies tolerated that because they were like, ‘Who cares if the Earth is flat, who cares if you believe in chemtrails?’ It seemed harmless,” said Hany Farid, a misinformation researcher and professor at the University of California at Berkeley.

“The problem with these conspiracy theories that maybe seemed goofy and harmless is they have led to a general mistrust of governments, institutions, scientists and media, and that has set the stage of what we are seeing now.”

In a statement, the Center for Countering Digital Hate said that Meta’s request to its oversight board was “designed to distract from Meta’s failure to act on a flood of anti-vaccine conspiracy theories spread by opportunistic liars” during the coronavirus pandemic.

“CCDH’s research, as well as Meta’s own internal analysis, shows that the majority of anti-vaccine misinformation originates from a tiny number of highly prolific bad actors. But Meta has failed to act on key figures who are still reaching millions of followers on Facebook and Instagram”, Callum Hood, head of research at the CCDH, said.

“Platforms like Meta should not have absolute power over life-and-death issues like this that affect billions of people. It’s time people in the UK and elsewhere are given democratic oversight of life-changing decisions made thousands of miles away in Silicon Valley.”
