Social media companies need ‘legal duty of care’ to protect young users, MPs say

'Self-regulation will no longer suffice,' Science and Technology Committee says in call for strict new legislation

Alex Matthews-King
Health Correspondent
Thursday 31 January 2019 01:40 GMT
Rise of social media and smartphones means constant exposure to cyberbullying and body image pressures (Alamy)


Facebook, YouTube and other social media giants should have a “legal duty of care” to ensure they act to protect the mental health and wellbeing of younger users, MPs have concluded.

The government has also been told to examine legislation that would ensure firms share data that can help identify and protect those at risk from the negative impact of such sites.

A report by the Commons Science and Technology Committee said the current loose “patchwork” of regulation has resulted in a “standards lottery” that could not ensure the safety of young internet users.

The sites are disrupting young users’ sleep patterns, distorting their body image and leaving them exposed to bullying, grooming and sexting, the report said.

The committee also recommended that the Government set itself the “ambitious” target of halving online reports of child sexual exploitation and abuse within two years and eliminating it in four years.

“Worryingly, social media companies – who have a clear responsibility towards particularly young users – seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world,” said the committee's chair Norman Lamb.

The report called on the Government to use its upcoming Online Harms White Paper to put legislation and regulation in place.

“We concluded that self-regulation will no longer suffice,” it said. “We must see an independent, statutory regulator established as soon as possible, one which has the full support of the Government to take strong and effective actions against companies who do not comply.”

A spokesman for the Department for Digital, Culture, Media and Sport, which has worked on the white paper with the Home Office, said: “We have heard calls for an Internet Regulator and to place a statutory ‘duty of care’ on platforms, and are seriously considering all options.

“Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”

Earlier this week, Facebook’s new head of global affairs, Sir Nick Clegg, acknowledged that government had a role to play in regulating social networks.

Andy Burrows, associate head of child safety online at the National Society for the Prevention of Cruelty to Children (NSPCC) charity, said social media sites had been allowed to operate in a “Wild West” environment for too long.


“It’s hugely significant that the committee is endorsing the NSPCC’s proposal for a legal duty of care to be imposed on these tech companies,” he said. “This must include an independent statutory regulator with enforcement powers that can impose strong sanctions on platforms that fail to keep children safe.”

In response, a Twitter spokesman said: “Improving the health of the conversation online remains our number one priority. In 2018 alone, we introduced more than 70 changes to product, policy and processes to achieve a healthier, safer Twitter. We are committed to building on this progress.”

Additional reporting by Press Association
