
Social media sites ‘not doing enough’ to prevent spread of self-harm content

A new report from Samaritans says platforms are failing to protect their users, particularly younger people.

Martyn Landi
Tuesday 08 November 2022 00:01 GMT
The survey showed that 76% of those who had seen self-harm or suicide content said they went on to harm themselves more severely because of it (PA Wire)


Social media platforms are not doing enough to prevent users, in particular young people, from seeing and being affected by self-harm and suicide content, a new study says.

A survey of social media users by Samaritans and Swansea University found that 83% of those asked had been recommended self-harm content without searching for it.

The researchers noted that the study used a social media campaign to encourage people to take an online survey, which may have skewed the results, as people with experience of self-harm and suicide were more likely to take part. Nevertheless, they said the findings still highlighted how damaging such content can be, particularly to vulnerable young people.

The survey showed that 76% of those who had seen self-harm or suicide content said they went on to harm themselves more severely because of it.

In addition, it found that three-quarters of those who took part had seen self-harm content online for the first time aged 14 or younger, with the charity urging the platforms to do more now to protect their users rather than waiting for regulation to be forced upon them.


The vast majority of those asked (88%) said they wanted more control over filtering the content they see on social media, while 83% said they believe that more specific trigger warnings, such as using terms like self-harm or suicide within content warnings, would be helpful to them.

“We would never stand for people pushing this kind of material uninvited through our letterbox, so why should we accept it happening online?” Samaritans chief executive Julie Bentley said.

“Social media sites are simply not doing enough to protect people from seeing clearly harmful content and they need to take it more seriously.

“People are not in control of what they want to see because sites aren’t making changes to stop this content being pushed to them, and that is dangerous.

“Sites need to put in more controls, as well as better signposting and improved age restrictions.


“The Online Safety Bill must become law as soon as possible to reduce access to all harmful content across all sites regardless of their size and, critically, make sure that this is tackled for both children and adults.

“We’re waiting anxiously for the Bill to return to the House of Commons after numerous delays, but there is nothing stopping platforms from making changes now.

“The internet moves much quicker than any legislation so platforms shouldn’t wait for this to become law before making vital changes that could save lives.”

Professor Ann John, from Swansea University and co-lead on the study, said more research on the subject was needed to get a clearer picture of the national impact of such content but said it was clearly damaging to many people.

“While our study cannot claim to represent the whole population’s experience of this content since only those interested would have responded to our requests, many of the themes point clearly to ways social media platforms can improve,” she said.

“People want more control over the content they view, ways to ensure children meet age requirements and co-produced safety features and policies. That all seems very doable.”
