Instagram still has much more to do to stop suicide and self-harm posts on app, boss admits

'We need to do everything we can to keep the most vulnerable people who use our platform safe'

Andrew Griffin
Monday 04 February 2019 14:23 GMT


Instagram's boss has admitted the company has much more to do in how it deals with self-harm and suicide content.

Adam Mosseri, who has headed the Facebook-owned platform since its founders left, said Instagram is conducting a comprehensive review of its policies on how it treats such content, which regularly appears in users' feeds.

The company will also add "sensitivity screens" that alert people to what they are about to see and ask them to confirm they actually want to view such posts – at the moment, users can simply stumble across them in their feeds. That is part of a wider plan to make the posts harder to find.

Writing in the Daily Telegraph, Mr Mosseri said the recent case of 14-year-old Molly Russell, whose father said she took her own life after looking at self-harm posts, had left him "deeply moved".

"We need to do everything we can to keep the most vulnerable people who use our platform safe. To be very clear, we do not allow posts that promote or encourage suicide or self-harm," he said.

"We rely heavily on our community to report this content, and remove it as soon as it's found. The bottom line is we do not yet find enough of these images before they're seen by other people."

His comments come as social media and technology firms face increasing scrutiny over their practices.

Health Secretary Matt Hancock said last week legislation may be needed to police disturbing content on social media, and separate reports by the House of Commons Science and Technology Committee and the Children's Commissioner for England called on social media to take more responsibility for the content on their platforms.

Mr Mosseri said Instagram was investing in technology to better identify sensitive images and would also begin using sensitivity screens which hide images from view until users actively choose to look at them.

"Starting this week we will be applying sensitivity screens to all content we review that contains cutting, as we still allow people to share that they are struggling even if that content no longer shows up in search, hashtags or account recommendations. These images will not be immediately visible, which will make it more difficult for people to see them," he said.

"We want to better support people who post images indicating they might be struggling with self-harm or suicide. We already offer resources to people who search for hashtags, but we are working on more ways to help, such as connecting them with organisations we work with like Papyrus and Samaritans.

"We have worked with external experts for years to develop and refine our policies. One important piece of advice is that creating safe spaces for young people to talk about their mental health online is essential. Young people have also told us that this is important, and that when the space is safe, the therapeutic benefits are positive."

He said the site did not want to "stigmatise mental health" by deleting images which reflect the issues people were struggling with, but would stop recommending them in searches, via hashtags or the app's Explore tab.

"Suicide and self-harm are deeply complex and challenging issues that raise difficult questions for experts, governments and platforms like ours," Mr Mosseri wrote.

"How do we balance supporting people seeking help and protecting the wider community? Do we allow people to post this content they say helps them or remove it in case others find it? This week we are meeting experts and academics, including Samaritans, Papyrus and Save.org, to talk through how we answer these questions. We are committed to publicly sharing what we learn. We deeply want to get this right and we will do everything we can to make that happen."

Additional reporting by agencies
