Facebook whistleblower says company spreads hate speech for profit

Facebook’s internal research found it’s ‘easier to inspire people to anger than it is to other emotions’

Vishwam Sankaran
Monday 04 October 2021 15:08 BST
Frances Haugen during her CBS News interview with Scott Pelley (CBS News)

The whistleblower behind a major leak of internal documents at Facebook says the social media giant has always prioritised its own profits over the public good.

Frances Haugen, a data scientist and a former product manager on Facebook’s civic misinformation team, revealed herself in a 60 Minutes interview on Sunday as the woman who anonymously leaked documents of the company’s research to The Wall Street Journal and the US Congress.

Ms Haugen said the leaks showed how Facebook magnifies hate and misinformation by prioritising profits.

“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” Ms Haugen told host Scott Pelley.

“And Facebook, over and over again, chose to optimise for its own interests, like making more money,” she added.

Ms Haugen said while she had worked at a number of companies, including Google and Pinterest, “it was substantially worse at Facebook” because of the social media giant’s desire to put its profits over the welfare of its users.

The root of the problem, according to the data scientist and whistleblower, is a change Facebook made in 2018 to the algorithms that decide what users see on the platform’s news feed.

“You might see only 100 pieces of content if you sit and scroll on for, you know, five minutes,” Ms Haugen said, adding that Facebook’s algorithms choose these pieces from thousands of options it could show users.

She said the algorithms are optimised for content that gets engagement – or reactions – from users.

“But its own research is showing that content that is hateful, that is divisive, that is polarising, it’s easier to inspire people to anger than it is to other emotions,” said Ms Haugen.

If the company changed its algorithm to be safer, however, people would spend less time on the platform, click on fewer ads and bring in less money, she said.

“Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume,” said the data scientist.

“Facebook, over and over again, has shown it chooses profit over safety. It is subsidising, it is paying for its profits with our safety,” said Ms Haugen, adding that the company understood these dangers during the 2020 US elections.

The company turned on safety systems temporarily during the election period to reduce misinformation but “as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritise growth over safety,” she alleged.

The social media giant said some of the safety systems remained in place, but after the election some people used Facebook to help organise the 6 January riot at the US Capitol.

Facebook’s vice president of global affairs, Nick Clegg, sent an internal memo on Friday responding to criticism of the company, and also appeared on CNN on Sunday to make many of the same arguments.

“We understand the piece is likely to assert that we contribute to polarisation in the United States, and suggest that the extraordinary steps we took for the 2020 elections were relaxed too soon and contributed to the horrific events of 6 January in the Capitol,” the memo reportedly said.

“But what evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarisation,” it added.

“I think the assertion that January 6th can be explained because of social media, I just think that’s ludicrous,” Mr Clegg told CNN.

In an email response to The Independent, Facebook said that in the two years leading up to the US elections it made large investments, with more than 40 teams across the company and over 35,000 people working on safety and security.

The company said it took into account specific on-platform signals and information from regular engagement with law enforcement to phase in and adjust additional emergency measures “before, during, and after the election”.

“It is wrong to claim that these steps were the reason for January 6th – the measures we did need remained in place through February, and some like not recommending new, civic, or political groups remain in place to this day,” said Lena Pietsch, director of policy communications at Facebook.

“Protecting our community is more important than maximizing our profits. To say we turn a blind eye to feedback ignores these investments, including the 40,000 people working on safety and security at Facebook and our investment of $13 billion since 2016,” added Ms Pietsch.
