iMessage, WhatsApp and other chat apps to scan people’s messages for child abuse and ‘grooming’ under new EU law

The law ‘describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR’, one cryptography professor tweeted

Adam Smith
Wednesday 11 May 2022 16:20 BST


The European Commission has proposed a new regulation that would require messaging apps such as WhatsApp, iMessage, and Facebook Messenger to scan users’ messages for child sexual abuse material and “grooming” content.

“This document is the most terrifying thing I’ve ever seen”, Matthew Green, a cryptography professor at Johns Hopkins, tweeted after a leak of the legislation was shared. “It describes the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR. Not an exaggeration.”

The regulation would require scanning not only for known child abuse material but also for new material and grooming. It would give authorities the power to order scanning of conversations on some of the most popular platforms in the world via a “detection order”, under which artificial intelligence would be used to analyse pictures and text messages.

“Detection orders are limited in time, targeting a specific type of content on a specific service”, the European Commission says, adding that they will be issued by courts or independent national authorities.

“Detection technologies must only be used for the purpose of detecting child sexual abuse. Providers will have to deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry, and that limit the error rate of false positives to the maximum extent possible.”

It also states that app stores must ensure children cannot download apps that may expose them to a high risk of solicitation.

Experts warned that once such surveillance powers were built for European governments, they would become available to governments everywhere. “By legally mandating the construction of these surveillance systems in Europe, the European government will ultimately make these capabilities available to every government”, Professor Green wrote.

Other privacy experts have echoed this criticism. The proposal is “incompatible with end-to-end encryption and with basic privacy rights,” Joe Mullin, senior policy analyst at the digital rights group Electronic Frontier Foundation, told CNBC.

“There’s no way to do what the EU proposal seeks to do, other than for governments to read and scan user messages on a massive scale,” Mullin continued. “If it becomes law, the proposal would be a disaster for user privacy not just in the EU but throughout the world.”

Many governments – including the UK, the US, and others across Europe – have attempted to erode user privacy by asking technology giants to put backdoors in end-to-end encrypted chats, a move that would make those chats more vulnerable to criminals as well.

While the plan does not outright call for an end to end-to-end encryption, it is unclear how it would be carried out without undermining it.

“Criminals are already using distribution channels that would not be affected by these scans and will easily escape scans in the future,” Linus Neumann of the German hacker collective Chaos Computer Club said.

WhatsApp head, Will Cathcart, also criticised the bill in a Twitter thread. “Incredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption”, he wrote. “If the EU mandates a scanning system like this be built for one purpose in the EU, it will be used to undermine human rights in many different ways globally.”

He continued: “Legislators need to work with experts who understand internet security so they don't harm everyone, and focus on ways we can protect children while encouraging privacy on the internet.”

Apple had previously attempted to introduce anti-child abuse features that used similar technology. The tools would have attempted to detect when children were being sent inappropriate photos, and when users had child sexual abuse material stored on their devices.

However, critics said that the tools could be repurposed to scan for other kinds of material, and that they undermined Apple’s public commitment to privacy as a human right. In September 2021, Apple said that it would indefinitely delay those features until they were improved.
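Detection systems of this kind generally work by comparing a fingerprint (“hash”) of each image against a list of fingerprints of known abuse imagery. The following is a minimal, illustrative sketch of that hash-list matching idea only; it uses an ordinary cryptographic hash for simplicity, whereas real deployments (such as Apple’s proposed NeuralHash) use perceptual hashes that tolerate resizing and re-encoding. All names here are hypothetical.

```python
import hashlib

# Hypothetical list of fingerprints of known prohibited images.
# Deployed systems use perceptual hashes robust to image edits;
# SHA-256 is used here purely for illustration.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an image (illustrative only)."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_material(image_bytes: bytes) -> bool:
    """Check an image's fingerprint against the known-hash list."""
    return fingerprint(image_bytes) in KNOWN_HASHES
```

Note that an exact cryptographic hash misses any altered copy of an image, which is why real systems use perceptual matching instead; critics’ core objection is that the same client-side pipeline could match against whatever hash list a government supplies, not only child abuse material.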

“Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features”, the iPhone giant said in a statement.
