Facebook asks if you know someone ‘becoming an extremist’ in new prompt test

‘Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others’, one version of the message reads

Adam Smith
Friday 02 July 2021 10:55 BST
Facebook Antitrust Lawsuits (Copyright 2020 The Associated Press. All rights reserved)

Facebook is testing a prompt that asks users whether they are “concerned that someone you know is becoming an extremist”.

The new message says: “We care about preventing extremism on Facebook. Others in your situation have received confidential support.

“Hear stories and get advice from people who escaped violent extremist groups”. Underneath that message is a blue “Get Support” button.

Another version of the message reads: “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.”

Facebook told CNN that the prompt is part of a test the company is running under its Redirect Initiative, which is aimed at fighting extremism.

"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,"Facebook spokesperson Andy Stone said.

"We are partnering with NGOs and academic experts in this space and hope to have more to share in the future." Facebook shared the same statement with The Independent but attributed it to an unnamed “Facebook company spokesperson”.

Facebook has often been criticised over claims of facilitating extremism on its platforms. A report by Avaaz, a nonprofit advocacy group that says it seeks to protect democracies from misinformation, claimed that Facebook allowed groups to glorify violence during the 2020 election and in the weeks leading up to the Capitol Hill insurrection attempt on 6 January.

Facebook’s algorithm also exacerbated divisiveness, according to leaked internal research reported by the Wall Street Journal. Facebook reportedly ended research into making the platform less polarising over fears that any changes would disproportionately affect right-wing users. “Our recommendation systems grow the problem,” one presentation said.

In response to that report, Facebook published a blog post saying that the newspaper “wilfully ignored critical facts that undermined its narrative”, which, the company says, include changes to the News Feed, limiting the reach of Pages and Groups that breach Facebook’s standards or share fake news, combating hate speech and misinformation, and “building a robust Integrity Team”.
