
Police free to set own limits on use of facial recognition because law has not caught up, court told

Court told police forces are relying on 'self-restraint' to limit use of controversial technology

Lizzie Dearden
Home Affairs Correspondent
Thursday 23 May 2019 21:22 BST
Facial recognition trial in London's West End


Police are using “self-restraint” to govern their use of controversial facial recognition software because there is no proper legal framework, a court has heard.

The UK’s first legal challenge over the technology was told that full guidance must be drawn up to avoid human rights being infringed.

The landmark case was brought against South Wales Police by a man who believes he was automatically scanned at a peaceful anti-arms trade protest and while doing his Christmas shopping in Cardiff.

Ed Bridges said the “unlawful use of facial recognition must end” and his lawyers accused police of violating human rights and data protection laws by processing an image taken of him in public.

A lawyer representing the Information Commissioner - who is conducting a separate probe - told the Administrative Court for Wales, in Cardiff, that a legal framework must be drawn up for automated facial recognition (AFR).

Gerry Facenna QC said there was a lack of clarity on how South Wales Police compiled the “watch list” of individuals AFR is programmed to track down through CCTV cameras, and in what circumstances it should be deployed.

“There is serious doubt whether the legal framework is sufficient,” he added. “It's all a bit ad hoc. There's nothing sufficiently precise and specific.”

Mr Facenna questioned whether people should be able to refuse to be scanned in public, after a man was fined for disorderly conduct after covering his face during a Metropolitan Police trial in London.

Richard O'Brien, a lawyer representing the home secretary, said Sajid Javid “welcomed” the landmark case as an opportunity to gain clarity on how AFR is used, and agreed that “guidance” could be provided to ensure that the public's rights were not infringed.

Dan Squires QC, representing Mr Bridges, told the final day of the hearing that AFR had given the police “extraordinary power”.

He raised questions over whether there was sufficient legal protection against the “arbitrary and disproportionate use” of the scanning technology by police, adding: “The police say that information is not retained for those not on their watch list, but it is not a legal requirement.

"The risk is if AFR can be used routinely across cities through CCTV cameras.

“The way South Wales Police have operated to date has been responsible and limited. But none of that comes from the law. That comes from self-restraint. Our submission is there should be a code of conduct.”

Police are trialling controversial facial recognition technology in Stratford

Lord Justice Haddon-Cave said he would give his judgment at a later date and thanked parties for their submissions on “novel and potentially far-ranging issues”.

Following the hearing, South Wales Police said the force would await the court's ruling on the “lawfulness and proportionality of our decision making” during its trial of AFR.

Deputy Chief Constable Richard Lewis said: “The force has always been very cognisant of concerns surrounding privacy and understands that we, as the police, must be accountable and subject to the highest levels of scrutiny to ensure that we work within the law.

"We await the court's judgment and guidance and will take full cognisance of its findings in any future use of the technology."

On Wednesday, a lawyer representing the force said its use of facial recognition was justified because it deterred crime, was comparable to police use of CCTV, and did not store images of a person's face unless that individual was already on its watch list.

Jeremy Johnson QC said the technology “potentially has great utility for the prevention of crime, the apprehension of offenders and the protection of the public”.

"It offers significant value to the public and to public interest,” he told the court, saying the deployments where Mr Bridges was scanned were “justified” and resulted in police successfully identifying three people.

Ed Bridges brought a legal challenge against South Wales Police over its use of facial recognition technology (Liberty)

Mr Johnson said: “No personal information relating to the complainant was shared with any police officer. He was not spoken to by any police officer. The practical impact on him was limited.”

Facial recognition scans faces from live camera footage and compares the results with a "watch list" of images from a police database, which can be set to include suspects, missing people and persons of interest.

The court heard that South Wales Police had deployed facial recognition at least 40 times since it began trialling the technology in May 2017, with no end date set.

Mr Bridges raised almost £6,000 for the judicial review, writing on a crowdfunding site that the public “had not been consulted” on facial recognition.

He is being supported by the Liberty human rights charity, which said facial recognition “belongs to a police state and has no place on our streets”.

Studies have shown some facial recognition software disproportionately misidentifies women and ethnic minorities, while a man was fined after covering his face in London.

The Metropolitan Police is facing a separate legal challenge over its own facial recognition trial, which has seen members of the public misidentified as criminals in 96 per cent of scans.

Additional reporting by PA
