Facial recognition technology 'violates human rights and must end', landmark court case hears
First legal challenge against UK police trials of facial recognition to set precedent
Police are breaking human rights law with the use of controversial facial recognition software, a landmark court case has heard.
Ed Bridges is bringing the first legal challenge on the technology, which has been trialled by forces in different parts of the UK.
He believes he was scanned by cameras used by South Wales Police at a peaceful anti-arms trade protest in 2018 and while doing his Christmas shopping in Cardiff months before.
Mr Bridges said police started using facial recognition “without warning or consultation”.
He added: “It’s hard to see how the police could possibly justify such a disproportionate use of such an intrusive surveillance tool like this. We hope that the court will agree with us that unlawful use of facial recognition must end, and our rights must be respected.”
Lawyers accused the force of violating Mr Bridges’ privacy and data protection rights by processing an image taken of him in public.
Dan Squires QC told the Administrative Court in Cardiff that automatic facial recognition (AFR) allowed police to “monitor people's activity in public in a way they have never been able to do before” without having to gain consent.
He said: “The reason AFR represents such a step change is you are able to capture almost instantaneously the biometric data of thousands of people. It has profound consequences for privacy and data protection rights, and the legal framework which currently applies to the use of AFR by the police does not ensure those rights are sufficiently protected."
Mr Bridges had a reasonable expectation that his face would not be scanned in a public space and processed without his consent while he was not suspected of wrongdoing, Mr Squires said.
The lawyer argued that police had violated Article 8 of the Human Rights Act - the right to respect for private life - as well as the Data Protection Act.
Mr Squires said there was no statutory power permitting South Wales Police to carry out large-scale processing of people's data without their consent, and called for a code of conduct to be drawn up.
Facial recognition scans faces from live camera footage and compares the results with a "watch list" of images drawn from a police database, with parameters that can be set to include suspects, missing people and persons of interest.
The court heard that South Wales Police had deployed facial recognition at least 40 times since it began trialling the technology in May 2017, with no end date set.
The force argues that its use of AFR does not infringe privacy or data protection rights because it is used in the same way as photographing a person's activities in public.
It said it does not permanently retain the data of people who are not confirmed as a match to its watchlist, but does keep CCTV images from the scanning process for up to 31 days.
Mr Bridges raised almost £6,000 for the judicial review, writing on a fundraising website that the public "had not been consulted" on facial recognition.
“The police are supposed to protect us and make us feel safe – but I think the technology is intimidating and intrusive,” he added. “There’s not even any guidance on how to deploy it, and no independent oversight to make sure its use is appropriate and our rights are protected.”
He is being supported by the human rights group Liberty, which said facial recognition "makes a mockery of our right to privacy".
Liberty lawyer Megan Goulding said: "It is discriminatory and takes us another step towards being routinely monitored wherever we go, fundamentally altering our relationship with state powers and changing public spaces. It belongs to a police state and has no place on our streets."
Studies have shown that some facial recognition software disproportionately misidentifies women and ethnic minorities, while one man was fined after covering his face during a trial deployment in London.
South Wales Police did not seek to block the challenge and its chief constable indicated that he would welcome guidance on complex legal and ethical issues. Jeremy Johnson QC will set out the force's response to the case.
London's Metropolitan Police is facing a separate legal challenge over its own facial recognition trial, which has misidentified members of the public as criminals in 96 per cent of its matches.
The trials have so far cost more than £222,000 in the capital alone and are subject to a separate probe by the Information Commissioner.
Additional reporting by Press Association