Oversight Board to examine Facebook posts about summer riots
The board has confirmed it is looking at Facebook’s handling of the reporting of three posts linked to the summer riots.
The Oversight Board, which examines content moderation decisions made by Meta’s social platforms, is to look at three cases linked to posts shared during the summer riots in the UK.
Violence erupted across the country after a knife attack in Southport which killed three girls and injured eight others, fuelled by misinformation spreading rapidly on social media about the attacker’s identity, including false claims that he was an asylum seeker who had arrived in the UK on a small boat.
There have since been calls to tighten online safety laws to better respond to misinformation and disinformation because of the real-world impact they can have.
The Oversight Board has now confirmed it will look at cases involving three posts from that time which were reported to Facebook for violating either its hate speech or violence and incitement policies.
The first post expressed agreement with the riots, called for mosques to be attacked and for buildings housing migrants to be set on fire.
The second piece of content was a reshare of another post. It showed what appeared to be an AI-generated image of a giant man in a Union flag T-shirt chasing several Muslim men, with overlay text giving details of when and where to meet for one of the protests.
The third post was another AI-generated image, showing four Muslim men running after a crying blond-haired toddler in a Union flag T-shirt in front of the Houses of Parliament, with the caption “wake up”.
All three posts were originally kept on Facebook after being assessed by Meta’s automated tools – none was reviewed by a human – and the users who had reported them then appealed to the Oversight Board over the decisions.
The board said it had selected these cases to examine Meta’s policy preparedness and crisis response to violent riots targeting migrant and Muslim communities.
It said that as a result of selecting these cases, Meta has now determined that its previous decision to leave the first post on Facebook was an error and has removed it.
The social media giant confirmed to the board it still believes its decisions to leave the second and third posts on Facebook were correct.
The Oversight Board said it would now accept public comments on the issue, including the role social media played in the UK riots and the spreading of misinformation.
It is expected to issue decisions on the cases in the coming weeks, and can make policy recommendations to Meta which, although not binding, must be responded to by the tech giant within 60 days.