Facebook official says users feel ‘safe and secure’ despite torrent of bad press
Comments come as whistleblower says company knows its product exacerbates conflicts
Facebook’s head of safety practices defended the company’s actions and stated that most of its users feel “safe and secure” on its platforms despite new claims that the company hid research pointing to its role in making conflicts and divisions in society worse.
Antigone Davis told MSNBC’s Stephanie Ruhle on Tuesday that despite claims made by former Facebook data scientist Frances Haugen during a congressional hearing, most users do not feel unsafe as a result of negative or malicious content on Facebook or Instagram.
“Most people really do feel quite safe and secure on our platform,” Ms Davis said.
She contended: “They’re coming back and they’re using our platform because they feel safe and secure.”
She went on to urge Congress to pass regulation making Section 230 protections conditional on companies following established “best practices” for content moderation. Section 230 currently shields Facebook and other platforms from legal liability for content posted by users, on the grounds that the platforms are not publishers.
Facebook proposed such legislation earlier this year, but many lawmakers on both sides of the aisle want to go further in holding the company and others responsible for hateful content and posts that encourage or incite violence, as was evident from the bipartisan tone of Tuesday’s hearing of the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security.
“I think you are going to see a lot of bipartisan concern about this in future hearings,” Sen Roger Wicker, a Republican, predicted on Tuesday.
The company has defended its work to remove hateful content and misinformation in numerous interviews given by high-ranking officials in recent days as it faces a barrage of negative press concerning Ms Haugen’s comments to 60 Minutes and the Senate subcommittee on Tuesday.
For months, Facebook has faced criticism in the media over its efforts to combat misinformation on issues ranging from the 2020 election to the Covid-19 vaccine, both before and after the 6 January attack on the US Capitol. In August, NPR reported that the site’s most-viewed news article was a piece with a headline that misleadingly connected the death of a doctor to the Covid-19 vaccine.
At the time, a company spokesperson told NPR that the article, which was published by a major Florida newspaper, illustrated “just how difficult it is to define misinformation”.
The site, along with Instagram and WhatsApp, suffered a major outage on Monday, prompting Twitter users to joke that a major source of misinformation about the Covid-19 outbreak and vaccines was at least temporarily offline.