Facebook is auto-generating videos to thank terrorists for using it, investigation finds
Videos including executions and severed heads were able to get through the social network's algorithms
Facebook is still allowing terrorist content onto its platform – and even automatically generating videos thanking people for posting it, according to a new investigation.
The company claims to catch 99 per cent of some kinds of terrorist propaganda on the site, using automated systems that spot and flag it.
But a new report claims that horrifying content – including executions and severed heads – is still making it through those automated systems and is readily available on the platform.
What's more, the company is inadvertently repackaging those propaganda posts into automatically generated videos and posts that encourage the hate groups who posted them to share more.
Those auto-generated videos even thanked those responsible for using the site, according to an investigation by the Associated Press. Such automatically created videos are intended to increase engagement by collating people's posts, but appear to be produced without checking what those posts might contain.
Amid intense criticism, Mark Zuckerberg has praised the success his company has had in finding and removing banned content.
"In areas like terrorism, for al Qaida and Isis-related content, now 99% of the content that we take down in the category our systems flag proactively before anyone sees it," he said last month.
"That's what really good looks like."
Facebook admitted that its systems were not perfect but stressed that its ability to catch the posts was improving.
"After making heavy investments, we are detecting and removing terrorism content at a far higher success rate than even two years ago," the company said in a statement to the Associated Press.
"We don't claim to find everything and we remain vigilant in our efforts against terrorist groups around the world."