Instagram to ban graphic self-harm images after suicide of school girl Molly Russell
Some posts might still be visible on the site
Instagram has committed to remove all images of graphic self-harm, the platform's boss has said.
The move comes amid intense criticism of the Facebook-owned service, which has been accused of not doing enough to remove posts that glorify self-harm and suicide in the wake of the death of schoolgirl Molly Russell.
Instagram boss Adam Mosseri said the company had previously allowed people to share graphic images of self-harm. But it will now change its policy to ban such posts, and will work to find and remove them.
“Historically, we have allowed content related to self-harm that’s ‘admission’ because people sometimes need to tell their story – but we haven’t allowed anything that promoted self-harm,” he told the BBC.
“But, moving forward, we’re going to change our policy to not allow any graphic images of self-harm.”
At the moment, the site relies on users to report problem images, which Instagram may then choose to take down. It will now work on new technologies to spot such images, he said.
But some images that relate to self-harm might still be visible on the site, he said.
“I might have an image of a scar and say, ‘I’m 30 days clean,’ and that’s an important way to tell my story,” Mr Mosseri said.
“That kind of content can still live on the site but the next change is that it won’t show up in any recommendation services so it will be harder to find.
“It won’t be in search, it won’t be in hashtags, it won’t be in recommendations.”
Mr Mosseri refused to say he would resign if the company was still showing self-harm images in six months. But he said he would have “a long thought” about how well he was doing his job.