The Molly Russell case is yet more evidence of urgent need for social media regulation

Companies do not enforce their own rules on harmful content, and yet suffer no consequences: self-regulation isn’t working

Sunday 27 January 2019 18:36 GMT
Molly Russell took her own life in 2017 (PA)

Another day, another threat by ministers to legislate unless the social media giants put their house in order. The latest warning, about online material on self-harm which may have contributed to teenage suicides, follows the death of 14-year-old Molly Russell. Her father Ian is convinced she was driven to suicide in November 2017 by material on Pinterest, the online scrapbooking site, and Instagram, which is owned by Facebook. Pinterest sent Molly an email a month after she died, including images of self-harm and the words: “I can’t tell you how many times I wish I was dead.” Papyrus, a charity working to prevent suicide, is talking to 30 families who believe social media played a part in the suicides of their children.

Pinterest said it was committed to “preventing the spread of potentially harmful content”. Instagram insisted that “for many young people, discussing their mental health journey or connecting with others who have battled similar issues is an important part of their recovery. This is why we don’t remove certain content and instead offer people looking at, or posting it, support when they might need it most”.

The government’s latest warning comes from Matt Hancock, the health secretary, who has told the tech firms in a letter that urgent action is needed “to stop teenagers falling into a suicide trap”. He said: “Let me be clear that we will introduce legislation where needed.” Speaking on the BBC’s Andrew Marr Show on Sunday, he pointed out that ultimately, the government had the power to impose financial penalties or even bans if the firms failed to act. But he emphasised: “It would be far better to do it in concert with the social media companies.”

Mr Hancock knows what he is talking about; he showed a welcome grip on the new media landscape in his previous job at the Department for Digital, Culture, Media and Sport. Unfortunately, the same cannot be said for many other ministers, including his successor Jeremy Wright, a former attorney general with little previous interest in his current brief. One reason why the multi-national tech companies have largely escaped regulation – and paying appropriate levels of tax – is that politicians around the world have been painfully slow to understand the power they wield.

Mr Hancock’s warning is part of a very familiar pattern. Tech firms deny the existence of harmful content. Then, when it is proven beyond doubt, they promise to remove it, while insisting they are merely platforms rather than publishers. They have been pushed into taking some action over material relating to terrorism, hate speech and child abuse. But all too often, those behind the offending content change their methods to keep one step ahead. There is evidence that the companies do not enforce their own rules on harmful content, and yet suffer no consequences. Self-regulation isn’t working.

The companies argue that some judgements about acceptable content are difficult. But their reluctance to take responsibility, change their algorithms and recruit more staff probably owes more to a desire to protect profit margins than to any concern for civil liberties.

In theory, effective self-regulation is preferable to state interference. But the time for pussyfooting around is surely over. After the Molly Russell case, how many more examples of the risks of the online world do ministers need? They should implement a sensible proposal from her father, modelled on a German law, which would allow distressing material to be reported to an independent regulator with the power to remove it from social media and websites within 24 hours. If tech companies continued to carry the content, they would be regarded as publishers rather than platforms.

A long-awaited white paper on harmful online content is due shortly. One option under consideration by ministers would set up an arm’s-length body similar to the Advertising Standards Authority, which would answer to the broadcasting regulator Ofcom. Given the seriousness and urgency of the issue, it would be better for any new body to answer directly to the government. Ministers should also consider imposing a legally enforceable duty of care on the internet giants. The case for them to be covered by proper, statutory regulation has never been stronger.

For confidential support call Samaritans on 116 123.

If you have been affected by this story, you can contact the following organisations for support:

Mind: https://www.mind.org.uk/
Beat (eating disorders): https://www.beateatingdisorders.org.uk/
NHS mental health services: http://www.nhs.uk/livewell/mentalhealth
Mental Health Foundation: https://www.mentalhealth.org.uk/
