Mark Zuckerberg: Facebook hiring 3,000 to stop 'heartbreaking' violent videos
'If we're going to build a safe community, we need to respond quickly'
Mark Zuckerberg has announced a range of new measures designed to help Facebook remove unacceptable content more quickly.
The social network has come under heavy fire over recent weeks, after videos of two murders were uploaded to the site and watched by hundreds of thousands of users before eventually being taken down.
One of those videos, which showed an 11-month-old child being killed, was on the site for around 24 hours before moderators acted.
The other, which showed the murder of Robert Godwin Snr., was not reported by a user until more than an hour and a half after it had been uploaded.
Mr Zuckerberg has revealed that Facebook will attempt to tackle the issue by adding 3,000 people to its community operations team over the next 12 months.
The team is currently made up of 4,500 members of staff, who have the mammoth task of moderating the billions of posts that go up every day.
“Over the last few weeks, we've seen people hurting themselves and others on Facebook, either live or in video posted later,” Mr Zuckerberg wrote in a Facebook update. “It's heartbreaking, and I've been reflecting on how we can do better for our community.
“If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner, whether that's responding quickly when someone needs help or taking a post down.”
Facebook’s moderators are assisted by algorithms designed to filter out innocent updates, leaving staff with a far more manageable sample of posts to review.
Unfortunately, it isn’t always effective.
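Facebook has not published how this filtering works, but the broad shape it describes, an automated scorer that clears obviously harmless posts and routes the rest to a human review queue, can be sketched in a few lines. The snippet below is purely illustrative: the Post fields, the scoring rule and the threshold are invented for the example and do not reflect Facebook's actual, proprietary system.

```python
from dataclasses import dataclass

# Hypothetical triage step: score each post, auto-clear low-risk ones,
# and queue only the remainder for human moderators. All names, weights
# and thresholds here are assumptions made for illustration.

@dataclass
class Post:
    post_id: int
    text: str
    user_reports: int = 0  # how many users have flagged the post

def risk_score(post: Post) -> float:
    """Toy stand-in for a trained classifier's risk estimate."""
    flagged_terms = ("attack", "weapon", "suicide")
    term_hits = sum(term in post.text.lower() for term in flagged_terms)
    # Flagged terms and user reports both raise the score, capped at 1.0.
    return min(1.0, 0.3 * term_hits + 0.2 * post.user_reports)

def triage(posts, threshold=0.3):
    """Drop low-risk posts so moderators see a manageable review queue."""
    return [p for p in posts if risk_score(p) >= threshold]

feed = [
    Post(1, "Look at this sunset"),
    Post(2, "Live video of an attack", user_reports=3),
]
for post in triage(feed):
    print(f"post {post.post_id} queued for human review")
```

Even in a toy version the trade-off is visible: set the threshold too high and violating posts are cleared automatically, which is the failure mode behind videos that stay up for hours before anyone acts.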
“In addition to investing in more people, we're also building better tools to keep our community safe,” added Mr Zuckerberg. “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.
“This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.
“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need.”