
What are the new social media algorithm rules as Ofcom cracks down on harmful content?

The new rules include more robust age verification processes to stop children accessing harmful material

Athena Stavrou
Wednesday 08 May 2024 16:52 BST


Social media platforms have been told to take action to stop their algorithms pushing harmful content to children, Ofcom has said.

The online regulator published its draft children’s safety codes of practice on Wednesday to set out how it expects platforms to meet their new legal responsibilities under the Online Safety Act.

The proposals, which run to more than 40 measures, include more robust age verification processes to stop children accessing harmful material, as well as ensuring that recommendation algorithms – such as “For You” pages – do not serve dangerous or potentially harmful content to children.

The new rules are set to come into force towards the end of this year, with large fines and other penalties for those found to be in breach.

The new rules come as a result of pressure from parents and campaigners who have raised awareness of the danger social media poses to children - with devastating consequences for some.

Here’s everything we know:

Social media giants should have effective age checking tools, internet watchdog Ofcom has said (PA)

What are the new rules?

The latest codes include more than 40 practical measures which Ofcom says will demand a step-change from tech firms by compelling safer design and operating practices from the biggest sites.

The suggestions include more robust age-checks for young users, and changes to social media algorithms to stop the promotion of harmful material.

Under the proposals, platforms that can be accessed by children and have a higher risk of harmful content appearing must configure their algorithms to filter out the most harmful content from children’s feeds, and reduce the visibility and prominence of other lower risk, but still potentially harmful, material.

The draft codes also require firms to have content moderation systems and processes in place, and ensure that swift action is taken against harmful content, with search engines expected to have a “safe search” option for use by children.

Social media firms do not have to follow the recommendations in full, but if they choose not to, they will have to show how they have met their legal duties in another way.

Ofcom chief executive, Dame Melanie Dawes, said: “We want children to enjoy life online. But, for too long, their experiences have been blighted by seriously harmful content which they can’t avoid or control. Many parents share feelings of frustration and worry about how to keep their children safe. That must change.”

Ofcom chief Dame Melanie Dawes said that for “too long” children’s experiences online “have been blighted by seriously harmful content which they can’t avoid or control” (Ofcom/PA)

Why were they introduced?

Parents and campaign groups have been rallying for reform to protect children on social media platforms.

The proposals come as research from Internet Matters showed that 14 per cent of teenagers aged 16 and under said they had experienced image-based sexual abuse. The findings come from a survey of 1,000 children aged 9-16.

The National Crime Agency recently issued a rare warning to schools about the rising danger of criminals targeting children on social media and coercing them into sharing nude images. The fraudsters then threaten to share the photos unless money is paid.

Ms Dawes said that for “too long” children’s experiences online “have been blighted by seriously harmful content which they can’t avoid or control”.

Esther Ghey, mother of murdered teenager Brianna Ghey, has called for more regulation of social media firms (The Independent)

What have campaigners said?

Campaigners and bereaved parents have warned that the Online Safety Act does not yet go far enough to protect children on social media.

The mother of murdered schoolgirl Brianna Ghey, Esther Ghey, who has campaigned to protect children from online harm following her daughter’s death, welcomed the new guidelines as “extremely positive” but said they could go a step further.

Mrs Ghey told The Independent: “Brianna was accessing self-harm sites and eating disorder pages on Twitter. If she wasn’t able to access this, she probably wouldn’t have been encouraged to harm herself in such a way.”

She said the guidance could be improved by requiring social media companies to allow parents to view content accessed by children, as well as the option of reporting problematic material on a child’s behalf.

Likewise, child online safety campaigner Ian Russell, the father of 14-year-old Molly Russell, who took her own life in November 2017 after viewing harmful material on social media, said more still needed to be done to protect young people from online harms.

He said tech firms were “buying as much time as they can” by claiming they were waiting for Ofcom to publish all its codes before making changes to their platforms.
