One in seven teenagers has been exposed to nude-sharing online as Ofcom calls on social media firms to crack down

Social media firms need to stop harmful content being pushed to children by ‘aggressive’ algorithms, internet watchdog says

Holly Bancroft, Social Affairs Correspondent, and Tara Cobham
Wednesday 08 May 2024 00:01 BST

Social media firms must stop pushing harmful content to children online with “aggressive” algorithms, the online regulator has demanded.

Ofcom has published a new set of guidelines for tech companies to ensure that children are better protected online.

The suggestions include more robust age-checks for young users, and changes to social media algorithms to stop the promotion of harmful material.

It comes as online safety campaign group Internet Matters reveals that one in seven teenagers aged 16 and under has experienced image-based sexual abuse online.

The mother of murdered schoolgirl Brianna Ghey, Esther Ghey, who has campaigned to protect children from online harm following her daughter’s death, welcomed the new guidelines as “extremely positive” but said they could go a step further.

Mrs Ghey told The Independent: “Sitting down with Ofcom was really positive. I feel they really want to make a change and want this to be as successful as possible. They have got young people’s best interests at heart.”

She said the guidance could be improved by requiring social media companies to let parents view content accessed by their children and report problematic material on a child’s behalf.

She added: “Brianna was accessing self-harm sites and eating disorder pages on Twitter. If she wasn’t able to access this, she probably wouldn’t have been encouraged to harm herself in such a way.”

The new guidance from Ofcom has been written to help companies comply with their duties in the Online Safety Act, which makes platforms legally responsible for keeping people safe online.

Social media giants should have effective age checking tools, Ofcom has said (PA)

Under the rules, online media companies will need to assess whether children are likely to access their service and then complete a risk assessment to identify the risks that their products pose to children.

Ofcom has also said that firms must prevent children from seeing the most harmful content relating to suicide, self-harm, eating disorders and pornography. They should also minimise a child’s exposure to serious harms such as violent, hateful or abusive material and bullying content.

Ofcom has set out a number of steps it suggests firms take to meet their legal obligations under the Online Safety Act, such as tracking unusual increases in harmful content on their platforms and using “highly effective age assurance” to make sure children are old enough to use their apps.

Social media firms do not have to follow the recommendations in full, but if they choose not to, they will have to show how they have met their legal duties in another way.

Other proposals from the regulator include making sure that children are not recommended increasingly harmful or violent content on their social media feeds. Ofcom said that social media companies use algorithms to determine how content is shown to users based on their characteristics, inferred interests and behaviour. This is the key way that children come across content about suicide, self-harm or eating disorders, the report said.

Ofcom wants social media companies to ensure that this type of content is not shown to children. It also wants the rules to make it easier for children to report content online. Children should be able to accept or decline an invitation to a group chat, disable comments on their own posts, and block or mute other people, Ofcom suggested.

Tech firms also need to get better at moderating content on their platforms and removing it faster when it is flagged as age-inappropriate, Ofcom said.

Esther Ghey, mother of murdered teenager Brianna Ghey, has called for more regulation of social media firms (PA)

Research from Internet Matters showed that 14 per cent of teenagers aged 16 and under said that they had experienced image-based sexual abuse. The findings come from a survey of 1,000 children aged 9-16.

The National Crime Agency recently issued a rare warning to schools about the rising danger of criminals targeting children on social media and coercing them into sharing nude images. The fraudsters then threaten to share the photos unless money is paid.

Dame Melanie Dawes, Ofcom chief executive, said that for “too long” children’s experiences online “have been blighted by seriously harmful content which they can’t avoid or control”.

Referring to the new code of conduct, Dame Melanie said tech firms “will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age”.

Technology secretary Michelle Donelan called on tech platforms to change their algorithms to keep children safe (PA)

She said, once the guidelines are in force, “we won’t hesitate to use our full range of enforcement powers to hold platforms to account”.

Technology secretary Michelle Donelan said that the Ofcom rules were “clear”, adding: “Platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms which too readily mean they come across harmful material online.”

Sir Peter Wanless, chief executive of the NSPCC, said the draft codes set “high standards” for tech companies to keep children safe.

Children’s commissioner Dame Rachel de Souza said she hoped that the Ofcom guidelines, along with other protections in the Online Safety Act, “will mark a significant step forward in the ongoing effort to safeguard children online”.

This is the first draft of an online code to protect children that Ofcom has produced. It will now be consulted on and a final proposal will be published in spring 2025.
