Facebook’s ‘Supreme Court’ tackles nudity, Nazi quotes, and Covid misinformation in first cases
The cases focus on Facebook’s hate speech, adult nudity, and dangerous organisations policies
Facebook’s Oversight Board, which was established to review the social media giant’s moderation decisions, has accepted its first cases.
“More than 20,000 cases were referred to the Oversight Board following the opening of user appeals in October 2020”, the board said in its announcement.
“As the Board cannot hear every appeal, we are prioritising cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook's policies.”
The six appeals, five of which were referred by users, focus on the company’s policies on hate speech, adult nudity, and dangerous individuals and organisations.
These include images of dead children posted alongside a criticism of China for its treatment of Uyghur Muslims, an image posted on Instagram of female breasts to raise awareness of the signs of breast cancer, and a quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, that was used to criticise the Trump administration.
Each of the cases will be assigned to a five-member panel that includes one person from the region from which the content came. The board will deliberate on the case, and Facebook will act on its decision within 90 days.
The case that Facebook submitted to the board was a video criticising French health officials for not authorising hydroxychloroquine as a cure for the coronavirus, which was viewed 50,000 times and shared 1,000 times.
Facebook removed the video for violating its policy on violence and incitement, and referred it to the board as “an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.”
The Oversight Board, which CEO Mark Zuckerberg has compared to a Supreme Court for the social media site, has the power to overrule decisions made by Facebook about content moderation, as well as influence new policy.
Board members will serve terms of no longer than three years; the board currently includes journalists, federal judges, law professors, and the former Prime Minister of Denmark, Helle Thorning-Schmidt.
The development of Facebook’s Oversight Board comes as the company has been repeatedly criticised for its moderation policies.
The company’s algorithm was found to be “actively recommending” Holocaust denial and fascism, according to research from the Institute for Strategic Dialogue (ISD), and misinformation from President Donald Trump was the most popular post on the social media site despite its attempt to move users towards more reputable sources of information.
One former Facebook employee, Sophie Zhang, also said the company had been ignoring evidence that fake accounts on its platform have been disrupting political events across the world.
“In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” wrote Zhang. “I know that I have blood on my hands by now.”
Another Facebook engineer previously resigned, claiming that the company was “profiting off hate in the US and globally” because of inaction against violent hate groups and far-right militias using Facebook to recruit members.
“We don’t benefit from hate," a Facebook spokesperson told The Independent in a statement at the time.
“We invest billions of dollars each year to keep our community safe and are in deep partnership with outside experts to review and update our policies. This summer we launched an industry leading policy to go after QAnon, grew our fact-checking program, and removed millions of posts tied to hate organizations — over 96 per cent of which we found before anyone reported them to us.”