Molly Russell death: Coroner suggests separate platforms for adults and children

The 14-year-old died in November 2017 after viewing suicide and self-harm content online.

Josh Payne
Friday 14 October 2022 10:52 BST
Undated family handout file photo of Molly Russell. A senior coroner at North London Coroner’s Court has concluded the schoolgirl died from “negative effects of online content”. Coroner Andrew Walker said online material viewed by the 14-year-old “was not safe” and “shouldn’t have been available for a child to see”. Concluding it would not be “safe” to rule Molly’s cause of death was suicide, Mr Walker said the teenager “died from an act of self-harm while suffering depression and the negative effects of online content”. Issue date: Friday September 30, 2022. (PA Media)


The father of schoolgirl Molly Russell has urged social media companies not to “drag their feet waiting for legislation”, as a coroner issued recommendations including separate platforms for adults and children.

Coroner Andrew Walker sent a Prevention of Future Deaths report (PFD) to businesses such as Meta, Pinterest, Twitter and Snapchat as well as the UK Government on Thursday, in which he urged a review of the algorithms used by the sites to provide content.

The 14-year-old, from Harrow in north-west London, ended her life in November 2017 after viewing suicide and self-harm content online, prompting her family to campaign for better internet safety.

Although regulation would be a matter for Government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above

Coroner Andrew Walker

In his report, Mr Walker identified six areas of concern that arose during the inquest into Molly’s death, including the separation of content for adults and children.

The coroner also voiced concerns over age verification when signing up to the platforms, content not being controlled so as to be age-specific, and algorithms being used to provide content together with adverts.

Other issues included in the report were the lack of access or control for parents and guardians and the absence of capability to link a child’s account to a parent or guardian’s account.

At the inquest held at North London Coroner’s Court last month, the coroner concluded Molly died while suffering from the “negative effects of online content”.

The inquest was told the teenager accessed material from the “ghetto of the online world” before her death, with her family arguing sites such as Pinterest and Instagram recommended accounts or posts that “promoted” suicide and self-harm.

In her evidence, Meta executive Elizabeth Lagone said she believed posts seen by Molly, which her family say “encouraged” suicide, were safe.

Pinterest’s Judson Hoffman told the inquest the site was “not safe” when the schoolgirl used it.

In light of the concerns raised, Mr Walker recommended in the PFD that the Government consider reviewing the provision of internet platforms to children.

Other areas highlighted for review included separate platforms for adults and children, age verification before joining a platform, provision of age specific content, and the use of algorithms to provide content.

The coroner also recommended the Government review the use of advertising and parental, guardian or carer control including access to material viewed by a child, and retention of material viewed by a child.

Mr Walker’s report said: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content with particular regard to the above.

“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.

“Although regulation would be a matter for Government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above.”

Mr Walker said he believed action should be taken in order to prevent future deaths, adding: “I believe you and/or your organisation have the power to take such action.”

Reacting to the recommendations, Molly’s father Ian Russell said: “We welcome this report by the coroner, which echoes our concerns about the online dangers Molly was exposed to, and pushed towards by the platforms’ algorithms.

“We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users.

“They should think long and hard about whether their platforms are suitable for young people at all.

“The Government must also act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content, and that platforms and their senior managers face strong sanctions if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly.

“I hope this will be implemented swiftly through the Online Safety Bill which must be passed as soon as possible.”

In its response to the PFD report, Instagram’s parent company Meta said it agreed “regulation is needed”.

The social media giant said it was “reviewing” the coroner’s report, adding: “We don’t allow content that promotes suicide or self-harm, and we find 98% of the content we take action on before it’s reported to us.

“We’ll continue working hard, in collaboration with experts, teens and parents, so we can keep improving.”

Pinterest also issued a statement in reaction to the report, which said: “Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care.”

Meta, Pinterest, Twitter and Snapchat all have 56 days to respond with a timetable of action they propose to take or explain why no action is proposed.
