Nick Clegg wrote 5,000 words about your Facebook news feed - but here’s everything that was missing

Adam Smith
Thursday 01 April 2021 13:21 BST


“The internet needs new rules”, writes Nick Clegg, the former deputy prime minister and now Facebook’s vice-president of global affairs. “That starts — but by no means ends — with putting people, not machines, more firmly in charge.”

Mr Clegg made this declaration in a blog post in which he seeks to demystify the relationship between Facebook and its users, as the social media company announces new tools that let users control how their News Feed is ordered.

According to Mr Clegg, the social media site’s News Feed is “personalised to you”. Its community standards, meanwhile, are set by “Facebook’s decision makers who ultimately decide what content is acceptable on the platform”.

Polarising and extreme content - or, as Mr Clegg describes it, “emotive language and arresting imagery” - merely “grab people’s attention” because it is “human nature”. Content that could cause harm, he said, is deprioritised by Facebook.

“But Facebook’s systems are not designed to reward provocative content. In fact, key parts of those systems are designed to do just the opposite. Facebook reduces the distribution of many types of content … because they are sensational, misleading, gratuitously solicit engagement, or are found to be false by our independent fact checking partners”, Mr Clegg writes.

Facebook has not always worked this way. The company’s own research, as explained by Mark Zuckerberg, found that no matter where the company draws the line on acceptable content, posts that come closer to that line attract more engagement and are more likely to be shared.

It was only in 2018, when the CEO announced a major change to the algorithm, that Facebook began artificially suppressing the distribution of content that comes close to violating its policies.
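
Facebook has not published the mechanics of that suppression, so any illustration has to be speculative. The Python sketch below is purely hypothetical: the function name, its inputs (base_score, policy_proximity) and the linear penalty are all invented, standing in for whatever signals the real News Feed ranking uses.

```python
def rank_score(base_score: float, policy_proximity: float) -> float:
    """Hypothetical demotion of borderline content.

    policy_proximity: 0.0 = clearly within the rules,
                      1.0 = right at the policy line.
    The closer a post sits to the line, the more its distribution
    is suppressed - inverting the pattern in which borderline
    content would otherwise attract the most engagement.
    """
    demotion = 1.0 - policy_proximity  # linear penalty, purely illustrative
    return base_score * demotion


# A highly engaging post near the policy line ends up ranked below
# a milder post with identical engagement.
print(rank_score(base_score=100.0, policy_proximity=0.75))  # 25.0
print(rank_score(base_score=100.0, policy_proximity=0.25))  # 75.0
```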

That suppression has, it seems, not always been successful. Following the 2020 US presidential election, the social media company made a temporary change to its algorithm to promote authoritative news sources over partisan ones. The result, the New York Times reports, was that publishers like CNN, NPR, and the New York Times itself saw traffic rise, while Breitbart and Occupy Democrats saw it fall.

Despite staff reportedly asking for the change to be made permanent, Facebook eventually reverted to its previous algorithm. One would think that keeping an algorithm that promoted more authoritative news would be a good thing, but it appears Facebook disagreed.

Citing a Harvard study, Mr Clegg argues that because disinformation was spread via the mass media - more specifically, because outlets failed to fact-check Mr Trump and allowed his disinformation to pass through their channels - the company did not need to keep the “nicer news feed”. The study says that social media played a “secondary role”.

Unfortunately for Mr Clegg, the relationship between Facebook and the media is more complicated than that. The very study Mr Clegg cites says that 60 per cent of Republicans whose major source of news was Fox News thought voter fraud was a major issue, despite this being misinformation pushed by Mr Trump himself. Fox News, incidentally, had the highest interaction rate on Facebook, a title it has held for years. Fox News also promoted the myth of election fraud, so much so that it had to run a segment debunking its own coverage.

For Democrats, whose main sources of news were CNN, NPR, or the New York Times, that number drops significantly: just four per cent believed the misinformation. As such, it seems obvious that one would want to prioritise the media that, despite their failings, did not actively promote myths. Facebook disagrees.

The continued spread of misinformation about the integrity of the election had serious consequences - both on and off Facebook. Earlier this year, Facebook had to clamp down on Groups, which Mr Zuckerberg had called the new “heart of the app” as recently as January 2021, with the Wall Street Journal citing internal documents that suggested misinformation and calls to violence were spreading rapidly on the platform in the build-up to the insurrection attempt on 6 January.

“We need to do something to stop these conversations from happening and growing as quickly as they do,” the researchers wrote, suggesting measures to slow the growth of Groups. “Our existing integrity systems aren’t addressing these issues.”

One “Stop the Steal” group, Reuters reports, grew to 365,000 members in under a day, although Facebook’s COO Sheryl Sandberg claimed that the attack on the Capitol was “largely organized on [other] platforms”. Still, just two weeks later Facebook had to announce that it would no longer algorithmically recommend political groups.
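
Facebook has not explained how that exclusion works under the hood. A minimal sketch, assuming each candidate group carries a category label from some upstream classifier, might look like the Python below - the Group class, the category names and the recommend_groups function are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Group:
    name: str
    category: str   # hypothetical label from an upstream classifier
    relevance: float

# Assumption: the categories Facebook stopped recommending. The real
# taxonomy has not been published.
EXCLUDED_CATEGORIES = {"political", "civic"}

def recommend_groups(candidates: list[Group], limit: int = 5) -> list[Group]:
    """Filter out excluded categories, then rank the rest by relevance."""
    eligible = [g for g in candidates if g.category not in EXCLUDED_CATEGORIES]
    return sorted(eligible, key=lambda g: g.relevance, reverse=True)[:limit]

candidates = [
    Group("Local Gardening", "hobby", 0.9),
    Group("Stop the Steal", "political", 0.99),  # high engagement, never surfaced
    Group("City Council Watch", "civic", 0.8),
]
print([g.name for g in recommend_groups(candidates)])  # ['Local Gardening']
```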

Mr Clegg still contends that human nature is the problem. “Consider, for example, the presence of bad and polarizing content on private messaging apps — iMessage, Signal, Telegram, WhatsApp — used by billions of people around the world”, he writes.

“None of those apps deploy content or ranking algorithms. It’s just humans talking to humans without any machine getting in the way … We need to look at ourselves in the mirror, and not wrap ourselves in the false comfort that we have simply been manipulated by machines all along.”

It is true that messaging platforms – including those owned by Facebook – have been used to spread misinformation. But they remain unable to broadcast it as quickly or as widely as the News Feed and the ranking algorithm that powers it, and the biggest dangers have emerged when messaging apps start to behave more like social networks such as Facebook.

WhatsApp, a subsidiary of Facebook, had to drastically limit its forwarding function in 2018 because of its role in fuelling violence in India and a genocide in Myanmar - dropping the limit on how many chats could receive a forwarded message from 250 to just 20. The ability for WhatsApp users to share content on a massive scale - a design choice made by Facebook - fed into existing cultural tensions and caused serious harm.
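
The mechanics of such a cap are simple to illustrate. The Python sketch below assumes a client-side check before fan-out; the function name, error handling and message format are invented, since WhatsApp’s actual implementation is not public.

```python
MAX_FORWARD_CHATS = 20  # reportedly reduced from 250 in 2018

def forward_message(message: str, chats: list[str]) -> list[str]:
    """Refuse to fan a forwarded message out beyond the cap.

    The design change enforces a hard ceiling client-side, rather
    than letting a single user seed one message into hundreds of
    chats at once.
    """
    if len(chats) > MAX_FORWARD_CHATS:
        raise ValueError(
            f"forwarding is limited to {MAX_FORWARD_CHATS} chats, "
            f"got {len(chats)}"
        )
    return [f"forwarded to {chat}: {message}" for chat in chats]

# Forwarding to 250 chats, as was once possible, now fails fast.
try:
    forward_message("breaking news!", [f"chat-{i}" for i in range(250)])
except ValueError as err:
    print(err)
```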

Mr Clegg also suggests in the blog post that the company and its algorithm would not push problematic content for revenue reasons, a claim that follows an advertiser boycott of the platform over the company’s inaction on tackling racism spread via its services. “The vast majority of Facebook’s revenue comes from advertising. Advertisers don’t want their brands and products displayed next to extreme or hateful content”, he writes. “The protest showed that Facebook’s financial self-interest is to reduce it, and certainly not to encourage it or optimize for it.”

But when that boycott flared up, chief executive Mark Zuckerberg specifically said that advertisers’ concerns would not affect policy decisions on what the algorithm shows.

“We’re not gonna change our policies or approach on anything because of a threat to a small percent of our revenue, or to any percent of our revenue,” Mr Zuckerberg said in a leaked transcript. “My guess is that all these advertisers will be back on the platform soon enough.” The company later confirmed the transcript: “We make policy changes based on principles, not revenue pressures,” a spokesperson said. It remains unclear exactly what those principles are.
