Ofcom to gain powers to police UK social media

A 'duty of care' will be enforced on social media firms to ensure users are protected from harmful content

Wednesday 12 February 2020 10:48 GMT


Communications watchdog Ofcom will be given new powers to hold social media companies accountable for hosting potentially harmful content on their platforms.

The legislation, which is reportedly still being drafted, will place a legal “duty of care” on social media companies such as Facebook, Twitter, YouTube, Snapchat, Instagram, and TikTok to ensure users are protected from illegal or harmful content.

The platforms are currently largely self-regulating, setting their own rules for filtering and removing unacceptable content.

However, concerns about how social media impacts vulnerable people have been raised amid suicides, including that of schoolgirl Molly Russell, who died in 2017.

The 14-year-old was found to have viewed harmful content on Instagram, prompting the platform to ban all images of self-harm or suicide.

According to the Financial Times, which first reported the story, culture secretary Nicky Morgan will announce Ofcom’s widened remit as an Internet watchdog on Wednesday.

In April last year, the government released a joint white paper on online harms that included proposals to introduce an independent regulator and hold bosses of social media companies personally liable for harmful content on their platforms.

It would be up to Ofcom to decide when and how social media firms have breached the “duty of care” and how to punish them, with options including issuing “substantial fines”, blocking access to sites and potentially imposing liability on individual members of senior management.

The legislation aims to extend regulation beyond “illegal activity and content” to “behaviours which are harmful but not necessarily illegal”.

Ofcom and the Department for Culture, Media and Sport declined to comment on the report.

Critics of the proposal said the legislation could cause damage to freedom of speech, comparing it to “North Korean-style censorship”.

But campaigners for stricter regulation of social media content, particularly for children, have welcomed the “duty of care” model.

Andy Burrows, head of child safety online policy at the National Society for the Prevention of Cruelty to Children (NSPCC), said: “Any regulator will only succeed if it has the power to hit rogue companies hard in the pocket and hold named directors criminally accountable for putting children at risk on their sites.

“Boris Johnson can protect families and support law enforcement by standing firm against some of the world’s most powerful companies.

“To do that it’s imperative that we have a duty of care model that puts the onus on big tech to prevent online harms or answer to an independent regulator.”

In March last year, Facebook founder Mark Zuckerberg also called for governments to step in with stricter regulation of harmful content and to take a more active role in establishing rules for the internet.

He said in an op-ed in the Washington Post: “Lawmakers often tell me we have too much power over speech, and frankly I agree. I’ve come to believe that we shouldn’t make so many important decisions about speech on our own.”

He also said internet firms “should be accountable for enforcing standards on harmful content”.

“It’s impossible to remove all harmful content from the Internet, but when people use dozens of different sharing services – all with their own policies and processes – we need a more standardised approach.”
