Parents, internet bans won’t protect your children from social media’s endless loop of harmful content
My daughter Molly died after becoming trapped by an algorithm that served up distressing images – what are social media companies doing about that?
This week, the Online Safety Bill returns to parliament – a vital bill that, for various reasons, has been subject to numerous delays and setbacks.
While all legislation must undergo robust scrutiny, the narrative around free speech will no doubt dominate much of today’s discussion about online safety. If we more closely monitor online content, are we in effect stifling free speech?
As one of many bereaved parents who have lost a child to the harmful effects of online content, I would argue that no, we are not stifling free speech. We are seeking to protect our children and young people from aggressive algorithms that relentlessly serve up a vicious cycle of negative content – content that we know causes significant distress, harm and, tragically in my daughter Molly’s case, death.
Free speech is so often trotted out as justification for the existence of harmful content. But, as a proponent of free speech myself, I believe that it should not be conflated with free-for-all content.
The notion that in our offline world we are at liberty to say whatever we want is false to start with. The laws of libel and slander are well established, and for good reason – they exist to protect people within society from unsubstantiated claims, presumably because it proved problematic not to have them.
But I find it interesting that when it comes to libel or slander, we are readily accepting of the rules, yet when it comes to personal harm and risk to life, there are people trying to block this legislation, arguing that it is an attempt to censor free speech.
Elon Musk’s talk of free speech, and his reinstatement of previously banned Twitter accounts, shows the dangers of this naive and idealistic approach. Free speech isn’t black and white; it’s much more nuanced than that.
A recent report from Samaritans states that more than three quarters of people surveyed had seen self-harm content online by the time they were 14 – some when they were 10 or younger. So it’s clear that something needs to be done to protect our children.
So what is this harmful content? Well, it’s not necessarily content that has been published with the aim of causing harm. Sometimes, content showing images of self-harm or encouraging suicide is posted by the user to find help and support. Of course, this doesn’t apply to all such content, and there is a wealth of imagery out there that is posted specifically to cause harm.
Regardless of its primary purpose, however, distressing content shouldn’t be accessible to all. Some platforms argue that if somebody has posted self-harm content as a cry for help, they shouldn’t take it down. But if, in order to help one person towards safety, you make 100,000 people far less safe, that can’t be a responsible approach.
The technology exists to flag such content, so why not remove it and improve signposting to helplines, for example? That would create an immediate pathway to support for the person struggling, and reduce the negative impact the content would otherwise have when amplified through social media shares and algorithms.
So this isn’t really about freedom of speech – it’s about freedom to live. A child psychiatrist (who spoke at the inquest into the death of my daughter Molly) stated that he was unable to sleep well for weeks after seeing the social media content viewed by Molly before she killed herself. My daughter was only 14.
Molly was trapped by an algorithm that served up distressing images. The coroner concluded that her death was caused by self-harm, depression and “the negative effects of online content”. But this negative content isn’t necessarily sought out.
How these algorithms work is something of a mystery. They are numerous and complex, and we know from the Samaritans report that 83 per cent of people who saw harmful content didn’t seek it out – it was suggested to them through features such as Instagram’s “explore” and TikTok’s “for you” pages. The report also found that 76 per cent of those who viewed self-harm content online went on to harm themselves more severely because of it.
So what we need from social media companies is more accountability and more transparency. All this talk about “town squares” sounds lovely, but there’s a darker side to social media tech that the platforms don’t want to discuss.
In fact, not long after Molly’s death, a whistleblower leaked research conducted in-house by Facebook, showing that, among British teens who reported suicidal thoughts, 13 per cent traced the desire to kill themselves back to Instagram.
So while there’s a broader issue with social media content in terms of harms (filters, unrealistic beauty standards, and so on), there is a very specific and particularly harmful issue that could be dealt with quickly if companies had the will – or indeed the legal obligation – to do so: images of self-harm and content that encourages suicide.
There are around 200 school-age suicides in the UK every year. Just one is one too many. So while we debate digital free speech as a concept, young people will be viewing content, becoming distressed and physically hurting themselves.
In the meantime, as parents, carers or teachers, we can only do what is within our power – and this is something I’m going to be discussing as part of the free Now and Beyond Festival on 8 February. We shouldn’t panic and put draconian measures in place. There may well be times when removing a young person’s internet access is the right thing to do but, in the main, we’ll only isolate our children further if that’s our go-to approach.
So we need to remember that there is no blame on our children’s part. They probably didn’t even seek out the content in the first place – and if they did, there’s likely to be a vulnerability that needs addressing. Our children need to know that we won’t judge them and that they can come to us whenever something online upsets them. And we all need to know how to report such content to encourage its urgent removal.
We can’t be tech experts, but we can endeavour to keep the lines of communication open between us and our children or pupils. And until effective legislation comes into force, I hope that social media companies realise that they already have blood on their hands. The important concept of free speech should not be hijacked and distorted in such a way as to allow online harm to seek out new victims.
Ian Russell is the bereaved father of Molly Russell, who died in 2017. He is also a campaigner and founder of The Molly Rose Foundation. Ian will be hosting a free online session on digital dependency for teachers and parents/carers with Carrie Langton (founder of Mumsnet), Manjit Sareen (co-founder, Natterhub), Ukaoma Uche (Papyrus suicide prevention charity) and 16-year-old Kai Leighton (Beyond youth board member) as part of the Now and Beyond Festival.
If you are experiencing feelings of distress, or are struggling to cope, you can speak to the Samaritans, in confidence, on 116 123 (UK and ROI), email jo@samaritans.org, or visit the Samaritans website to find details of your nearest branch.
If you are based in the USA, and you or someone you know needs mental health assistance right now, call the National Suicide Prevention Lifeline on 1-800-273-TALK (8255). This is a free, confidential crisis hotline that is available to everyone 24 hours a day, seven days a week.
If you are in another country, you can go to www.befrienders.org to find a helpline near you.