Ofcom’s new online harms rules for social media firms disappoint campaigners

Platforms now have until March to comply with the new Online Safety Act rules or face large fines.

Martyn Landi
Monday 16 December 2024 08:39 GMT
Online safety charity Internet Matters’ survey found parents have concerns about children’s digital habits (Alamy/PA)

The first set of new online safety rules, legally requiring social media platforms and other sites to take action against illegal content, has been published by Ofcom.

The regulator said platforms now have three months to assess the risk of their users encountering illegal content and to implement safety measures mitigating those risks, or face enforcement action once the new duties come into force.

The first set of rules focuses on illegal harms – such as terror, hate, fraud, child sexual abuse and encouraging suicide – but one safety charity has criticised the publication, saying it will allow “preventable illegal harm to continue to flourish”.

Ofcom has the power to fine firms up to £18 million or 10% of their qualifying global turnover under the Online Safety Act – whichever is greater – and in very serious cases can apply for sites to be blocked in the UK.

However, The Molly Rose Foundation, which was set up by the family of Molly Russell, who ended her life in 2017 at the age of 14 after viewing suicide content on social media, said it was “astonished” and “disappointed” at Ofcom’s first set of codes.

“Ofcom’s task was to move fast and fix things but instead of setting an ambitious precedent these initial measures will mean preventable illegal harm can continue to flourish,” the charity’s chief executive Andy Burrows said.

“While we will analyse the codes in full, we are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold.

“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life.

“Today makes clear that there are deep structural issues with the Online Safety Act. The Government must commit to fixing and strengthening the regime without delay.”

Ofcom chief executive Dame Melanie Dawes said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.

“The safety spotlight is now firmly on tech firms and it’s time for them to act.

“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.

“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”

Technology Secretary Peter Kyle said the publication of the first set of codes under the Online Safety Act was a “significant step” in making online spaces safer.

“This Government is determined to build a safer online world where people can access its immense benefits and opportunities without being exposed to a lawless environment of harmful content.

“Today we have taken a significant step on this journey.

“Ofcom’s illegal content codes are a material step-change in online safety meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world.

“If platforms fail to step up, the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites.

“These laws mark a fundamental reset in society’s expectations of technology companies.

“I expect them to deliver and will be watching closely to make sure they do.”
