UK's new online safety law adds to crackdown on Big Tech companies
British lawmakers have approved an ambitious but controversial new internet safety law with wide-ranging powers to crack down on digital and social media companies like TikTok, Google and Facebook and Instagram parent Meta
British lawmakers have approved an ambitious but controversial new internet safety law with wide-ranging powers to crack down on digital and social media companies like TikTok, Google, and Facebook and Instagram parent Meta.
The government says the online safety bill passed this week will make Britain the safest place in the world to be online. But digital rights groups say it threatens online privacy and freedom of speech.
The new law is the U.K.’s contribution to efforts in Europe and elsewhere to clamp down on the freewheeling tech industry dominated by U.S. companies. The European Union has its Digital Services Act, which took effect last month with similar provisions aimed at cleaning up social media for users in the 27-nation bloc.
Here's a closer look at Britain's law:
WHAT IS THE ONLINE SAFETY LAW?
The sprawling piece of legislation has been in the works since 2021.
The new law requires social media platforms to take down illegal content, including child sexual abuse material, hate speech, terrorism-related material, revenge porn and posts promoting self-harm. Platforms also will have to stop such content from appearing in the first place and give users more controls, including the ability to block anonymous trolls.
The government says the law takes a “zero tolerance” approach to protecting kids by making platforms legally responsible for their online safety. Platforms will be required to stop children from accessing content that, while not illegal, could be harmful or not age-appropriate, including pornography, bullying, material glorifying eating disorders or instructions for suicide.
Social media platforms will be legally required to verify that users are old enough, typically 13, and porn websites will have to make sure users are 18.
The law also criminalizes some online activity, such as cyberflashing, the sending of unwanted explicit images.
WHAT IF BIG TECH DOESN'T COMPLY?
The law applies to any internet company, no matter where it’s based, as long as a U.K. user can access its services. Companies that don't fall in line face fines of up to 18 million pounds ($22 million) or 10% of annual global sales, whichever is greater.
Senior managers at tech companies also face criminal prosecution and prison time if they fail to answer information requests from U.K. regulators. They'll also be held criminally liable if their company fails to comply with regulators' notices about child sex abuse and exploitation.
Ofcom, the U.K. communications regulator, will enforce the law. It will focus first on illegal content as the government takes a “phased approach” to bring it into force.
Beyond that, it’s unclear how the law will be enforced because details haven’t been provided.
WHAT DO CRITICS SAY?
Digital rights groups say the law's provisions threaten to undermine online freedoms.
The U.K.-based Open Rights Group and the Electronic Frontier Foundation in the U.S. said that requiring tech companies to ensure content is not harmful for children could force them to choose between sanitizing their platforms and verifying users' ages, either by having users upload official ID or by using privacy-intrusive face scans to estimate how old they are.
The law also sets up a clash between the British government and tech companies over encryption technology. It gives regulators the power to require encrypted messaging services to install “accredited technology” to scan encrypted messages for terrorist or child sex abuse content.
Experts say that would amount to a backdoor into private communications, ultimately making everyone less safe.
Meta said last month that it plans to start adding end-to-end encryption to all Messenger chats by default by the end of the year. But the U.K. government has called on the company not to do so without measures to protect children from sex abuse and exploitation.