Ofcom to be given new powers to regulate ‘harmful’ social media content

The watchdog will be able to issue fines and ‘suspend’ websites if they fail to comply

Sarah Jones
Monday 12 August 2019 20:36 BST


Ofcom is to be given the power to fine social media companies in a bid to protect children from “harmful” online content.

The government proposal, which is currently under consultation, will allow the watchdog – which already polices TV, radio and broadband in the UK – to issue fines against platforms and websites if it believes they have failed to protect users from seeing harmful videos such as those depicting violence or child abuse.

“The directive proposed a number of appropriate measures to protect minors and the general public from harmful content,” a spokesperson for the Department for Digital, Culture, Media and Sport (DCMS) said.

“The government has proposed that Ofcom is given interim powers to regulate video-sharing platform services and ensure they comply with minimum standards set out in the Audiovisual Media Services Directive (AVMSD) by the transposition deadline, 19 September 2020. We are currently consulting on this approach.”

The move follows the publication of the Online Harms White Paper in April, which called for new legislation to make social media companies responsible for protecting their users.

The DCMS said the move would allow the UK to meet its obligations to the EU regarding online safety, which call for sites to establish strict age verification checks and parental controls to ensure young children are not exposed to harmful content.

If any platforms fail to meet the requirements, Ofcom will be able to issue fines of £250,000 or up to five per cent of a company’s revenues.

It will also have the power to “suspend” or “restrict” the tech giants’ services in the UK if they fail to comply with enforcement measures.

The DCMS has said that the new powers will be given to Ofcom on an “interim basis” but acknowledged the arrangement could become permanent.

“These new rules are an important first step in regulating video-sharing online, and we’ll work closely with the government to implement them,” a spokesperson for Ofcom commented.

“We also support plans to go further and legislate for a wider set of protections, including a duty of care for online companies towards their users.”

Child safety charity the NSPCC said the measures were an important step in holding social media and internet firms to account.

Andy Burrows, the charity’s head of child safety online policy, said: “This directive is an important opportunity to regulate social networks with user-generated video or livestream functions as early as next year.

“The immediacy of livestreaming can make children more vulnerable to being coerced by abusers, who may capture the footage, share it and use it as blackmail.

“The directive gives the UK a chance to introduce tough measures on tech firms that have their European headquarters here.

“Crucially, this is a real chance to bring in legislative protections ahead of the forthcoming Online Harms Bill and to finally hold sites to account if they put children at risk.”

In March, the father of Molly Russell urged the government to introduce regulation on social media platforms in response to his 14-year-old daughter taking her own life. She was found to have viewed content related to depression and suicide on Instagram before her death.


Former prime minister Theresa May said the proposals were a result of social media companies’ failure to self-regulate.

“The internet can be brilliant at connecting people across the world – but for too long these companies have not done enough to protect users, especially children and young people, from harmful content,” May said at the time.

“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.”
