Facebook manipulated users' moods in secret experiment

Facebook can use its algorithm to make users happy or sad, and even the scientist who edited the study said she was 'creeped out' by it

Andrew Griffin
Monday 30 June 2014 11:14 BST


Facebook manipulated the emotions of hundreds of thousands of its users and found that they would pass happy or sad emotions on to others, it has said. The experiment, for which researchers did not obtain specific consent, has provoked criticism from users with privacy and ethical concerns.

For one week in 2012, Facebook skewed nearly 700,000 users’ news feeds to be either happier or sadder than normal. Afterwards, those users tended to post positive or negative updates in line with the skew that had been applied to their feed.

The research has provoked distress because of the manipulation involved.

Studies of real-world networks have shown that what the researchers call ‘emotional contagion’ can be transferred between people. But the researchers say the study is the first evidence that the effect can happen without direct interaction or non-verbal cues.

Anyone who used the English version of Facebook automatically qualified for the experiment, the results of which were published earlier this month. Researchers analysed the words used in posts to automatically decide whether they were likely to be positive or negative, and suppressed or promoted those posts in users’ feeds according to which group the users fell into.
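The paper does not publish its code, but the approach as described — tallying words from positive and negative word lists and omitting some matching posts from one group’s feed — can be illustrated roughly as below. This is a minimal sketch: the word lists, group names and omission rate are assumptions made for the example, not the actual system (the published study is reported to have relied on the LIWC word-counting software).

```python
# Illustrative sketch only: classify a post as positive, negative or neutral by
# counting words from small example word lists, then decide whether to drop it
# from a user's feed depending on which experimental group they are in.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}   # assumed, for illustration
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "awful"}        # assumed, for illustration

def classify(post_text: str) -> str:
    """Very crude word-list sentiment label for a single post."""
    words = set(post_text.lower().split())
    if words & POSITIVE_WORDS:
        return "positive"
    if words & NEGATIVE_WORDS:
        return "negative"
    return "neutral"

def filter_feed(posts, group, omission_rate=0.3):
    """Return the posts a user would actually see.

    group: 'reduce_positive' or 'reduce_negative' -- posts carrying the targeted
    emotion are omitted with probability `omission_rate` (an assumed value).
    """
    target = "positive" if group == "reduce_positive" else "negative"
    return [
        p for p in posts
        if not (classify(p) == target and random.random() < omission_rate)
    ]

feed = ["Feeling happy today!", "This is terrible news", "Off to the shops"]
print(filter_feed(feed, group="reduce_negative"))
```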

It found that emotions spread across the network, and that friends tended to respond more to negative posts. Users who were exposed to fewer emotional posts of either kind tended to post less themselves.

There are on average 1,500 possible stories that could show up on users’ news feeds (Getty Images)

The research drew criticism from campaigners over the weekend, who said that it could be used by Facebook to encourage users to post more, and by other agencies, such as governments, to manipulate the feelings of people in certain countries.

Even the scientist who edited the study had ethical concerns about its methods, she said. "I think it's an open question," Susan Fiske, professor of psychology at Princeton University, told the Atlantic. "It's ethically okay from the regulations perspective, but ethics are kind of social decisions. There's not an absolute answer. And so the level of outrage that appears to be happening suggests that maybe it shouldn't have been done... I'm still thinking about it and I'm a little creeped out too."

Facebook’s ‘Data Use Policy’ — part of the Terms of Service that every user agrees to when creating an account — reserves the right for Facebook to use information “for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

The researchers said that this constituted the informed consent required to conduct the research and made it legal. The study does not say that users were told of their participation in the experiment, which the researchers said was conducted entirely by computers so that no researchers saw the posts themselves.

Facebook has said that there are on average 1,500 possible stories that could show up on users’ news feeds at any one time. It uses an algorithm that it says analyses users’ behaviour on the site to determine which of those stories to show.
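Facebook has not published the details of that ranking algorithm. Purely as an illustration of behaviour-based ranking, a toy version might score each candidate story by how often the user interacts with its poster and by how recent the story is, then keep only the top few; the scoring formula, field names and numbers in the sketch below are all assumptions made for the example.

```python
# Toy illustration of behaviour-based feed ranking (not Facebook's algorithm):
# score each candidate story by poster affinity and recency, show the top N.
from dataclasses import dataclass

@dataclass
class Story:
    poster: str       # who posted the story
    hours_old: float  # how long ago it was posted

def rank_feed(stories, interaction_counts, top_n=10):
    """interaction_counts: assumed mapping of poster -> past interactions by this user."""
    def score(story):
        affinity = interaction_counts.get(story.poster, 0)
        recency = 1.0 / (1.0 + story.hours_old)  # newer stories score higher
        return affinity * recency
    return sorted(stories, key=score, reverse=True)[:top_n]

candidates = [Story("alice", 2.0), Story("bob", 0.5), Story("carol", 12.0)]
print(rank_feed(candidates, {"alice": 40, "bob": 5, "carol": 60}, top_n=2))
```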
