Facebook trial asks users to deactivate their accounts for $20 before US election polls
Social media giant says it aims 'to amplify all that is good for democracy, and mitigate against that which is not'
Facebook is paying users to deactivate their accounts ahead of the US election in an attempt to stop the spread of misinformation and protect democracy.
Screenshots posted by Elizabeth Dwoskin, a technology correspondent for the Washington Post, show that the social media company is offering users a range of payment plans to deactivate their Facebook and Instagram accounts.
“Facebook is now going to pay people to deactivate their IG and FB accounts before Election Day. It’s part of the research experiment announced Monday but WOW. This notice went out this week,” she tweeted.
The screenshots posted by Dwoskin show an Instagram survey, telling users that their account would be deactivated in late September for either one or six weeks.
Users can receive between $10 and $20 per week for deactivating their accounts.
“Please note, your responses below are for research purposes only,” the survey says. “[It will] not affect how much you are offered.”
A Facebook spokesperson confirmed the test on Twitter.
“Anyone who chooses to opt in – whether it’s completing surveys or deactivating FB or IG for a period of time – will be compensated,” said Facebook spokesperson Liz Bourgeois in a tweet.
“This is fairly standard for this type of academic research.”
Earlier this week, Facebook said it was launching a new research partnership to “better understand the impact of Facebook and Instagram on key political attitudes and behaviours during the US 2020 elections”.
“It will examine the impact of how people interact with our products, including content shared in News Feed and across Instagram, and the role of features like content ranking systems,” it continued in a blog post.
The company says that its aim is “to continue to amplify all that is good for democracy on social media, and mitigate against that which is not”.
It also said it needs to better understand whether social media makes society more polarised.
Facebook had reportedly already shut down internal research showing that its algorithm did make its app more politically polarising, because the changes needed to mitigate that effect would be “anti-growth”.
The social media giant is making numerous other changes ahead of the US election.
They include both technical changes to its apps as well as new policies that it says will help encourage voting, give people authoritative information and reduce the risk of “violence and unrest” in the wake of the results.
Facebook will stop people forwarding mass messages on WhatsApp and Messenger, ban political adverts, and add an “informational label” to posts that try to delegitimise the election.
The integrity of the 2020 election has come into question for a number of reasons, including conspiracy theory groups using the platform to spread misinformation and disinformation, as well as President Trump encouraging voters to commit voter fraud.