Facebook defends secretly manipulating users: Experiments 'improved service'

Social network researched the power of 'emotional contagion' by altering users' News Feeds to display either mostly positive or mostly negative emotional content

James Vincent
Tuesday 01 July 2014 07:45 BST

Facebook has responded to criticism it has received for manipulating users’ moods for a psychological experiment by saying the research was undertaken to “improve our service and to make the content people see on Facebook as relevant and engaging as possible.”

The social network has been strongly denounced by users after it published a paper detailing how it was able to affect individuals’ moods by altering the amount of positive or negative content that appeared in their News Feed.

Lawyers and technology experts have also expressed worries that the same methods could be used to manipulate populations for political gain, while academics have criticized Facebook for failing to acquire the “informed consent” necessary for subjects of human experiments.

In a statement given to The Independent, a spokesperson for the social network said: “This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

Writing in a blog post on the site, one of the study’s co-authors Adam Kramer said: “Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my co-authors and I are very sorry for the way the paper described the research and any anxiety it caused.”

Some 689,000 users were drafted into the experiment, which manipulated a "small percentage" of content. The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."

On Sunday evening Jim Sheridan MP, a member of the Commons media select committee, described the experiment as ‘thought-control’ and called for an investigation into emotional manipulation by social networks.

“This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," Sheridan was reported as saying by The Guardian. “They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas.”

Clay Johnson, an American technologist best known for helping to manage Barack Obama’s 2008 presidential campaign, commented that the experiment was particularly disturbing in the light of the Snowden revelations and the recently uncovered US plot to create a ‘Cuban Twitter’ to foment unrest in the communist nation.

“We need to rethink how society relates to these networks,” wrote Johnson on Twitter. “For instance, could Mark Zuckerberg swing an election by promoting Upworthy posts two weeks beforehand? Should that be legal? Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal?”

Other commentators have suggested that the study is not completely unprecedented, with technology journalist Christopher Mims tweeting: "I've discovered Facebook is involved in an even more pernicious, multi billion dollar conspiracy to manipulate users: Advertising."
