Facial recognition could be 'spectacular own goal', police warned amid accuracy concerns
Only eight arrests were made as a result of facial recognition matches in three years of Metropolitan Police trials
Facial recognition could be a “spectacular own goal” for police if it fails to be accurate and effective, the government has been warned.
MPs raised concerns about the technology after the Metropolitan Police announced the start of live deployments in London.
Only eight arrests were made as a result of facial matches in almost three years of trials in the capital, which saw a high rate of “false positive” alerts wrongly flagging innocent people as wanted criminals.
Two deployments outside the Westfield shopping centre in Stratford in 2018 saw a 100 per cent failure rate, and independent monitors said a 14-year-old black schoolboy was fingerprinted after being misidentified. In Romford, a man was stopped for covering his face.
During a House of Commons debate on Monday, shadow home secretary Diane Abbott said: “To bring in technology which may be inaccurate and may mean the guilty may go unapprehended and the innocent wrongly identified would be a spectacular own goal.”
Liberal Democrat MP Sarah Olney, who asked an urgent question on facial recognition, said an independent review of Metropolitan Police trials had been “damning” and found potential conflicts with human rights law.
“According to analysis of the Met’s test data, 93 per cent of supposed matches in their trials have been wrong,” she added.
“As well as being inaccurate, facial recognition technology has also been shown to be much less accurate in identifying women and ethnic minorities than it has been for identifying white men.”
Following warnings by the Information Commissioner and court challenges, Ms Olney said the legal basis for Scotland Yard’s deployments was “questionable at best”.
Labour's Chi Onwurah told MPs that facial recognition “automates the prejudices of those who design it and the limitations of the data on which it is trained”.
John Whittingdale, a Conservative MP, pointed out that the Surveillance Camera Commissioner had issued a warning over an insufficient legal framework for facial recognition and a lack of transparency in its use.
Labour MP Daniel Zeichner was among the politicians calling for a pause in using the technology until rules were established, adding: “That approach of trying it out and seeing how it goes is exactly the wrong way to maintain public trust.”
Kit Malthouse, the policing minister, argued that the technology would become “more effective and reliable” as it was rolled out.
Responding to concerns that the technology is less accurate for women and ethnic minorities, he said: “None of the evidence in the trials thus far is pointing to that disproportionality.”
“Frankly, if the police were seeking to apprehend the killer of my child, I would want them to consider using this technology.”
He said he could not comment on legal proceedings, after the Big Brother Watch campaign group said it would review its ongoing action against the Metropolitan Police and home secretary.
In a separate legal challenge, against South Wales Police, a court found two facial recognition deployments in Cardiff were lawful; that ruling is currently under appeal.
An increasing number of private companies have also been using the technology, sparking questions about public consent and privacy.
The King’s Cross estate in London came under fire for scanning unwitting shoppers after being given Metropolitan Police wanted lists.
On Friday, Assistant Commissioner Nick Ephgrave called the incident an “inappropriate sharing of images with good intent to look for wanted people” and said information exchanges with private companies had stopped.
Sheffield’s Meadowhall shopping centre carried out two facial recognition trials in 2018, looking for just three offenders and one missing person.
According to information obtained by the BBC’s File on 4 programme, more than two million shoppers may have had their faces scanned. Only the missing person was found.
Tony Porter, the Surveillance Camera Commissioner, called for an inspection regime to monitor such deployments.
Former minister David Davis called for “explicit regulation” and parliamentary debates.
“We need to see the data, we need to think through the risks,” he said. “The UK as a state, as a government has a habit of collecting data willy-nilly on the premise that it might be useful somehow, some day.”
He also criticised private companies for building up watchlists of suspected wrongdoers for customers, including shops and security firms.
Anna Bacciarelli, a technology researcher with Amnesty International, said: “It is a wild west at the minute.
“We have an utter lack of regulation and a lot to lose. We would ask for a pause on the use of live facial recognition until all of the human rights risks can be addressed.”
The Metropolitan Police said every London deployment would be “bespoke” and target lists of wanted offenders or vulnerable missing people.
Mr Ephgrave said the technology would be primarily used for serious and violent offenders who are at large, as well as missing children and vulnerable people.
He said live facial recognition “makes no decisions” alone, and works by flagging potential facial matches from live footage to watchlists of police images drawn up by officers.
Officers then judge whether the person could be the same and decide whether to question them in order to establish their identity.
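As an illustration only, and not a description of the Met’s actual system, the sketch below shows how such a flag-and-review pipeline might work in principle: a face embedding extracted from live footage is compared against a watchlist of stored embeddings, and anything above a similarity threshold is surfaced as an alert for an officer to judge. The embedding model, the threshold value and all names here are hypothetical assumptions.

```python
# Illustrative sketch only: a generic "flag for human review" pipeline,
# not the Metropolitan Police's system. The 128-dimensional embeddings,
# the 0.6 threshold and the watchlist structure are hypothetical.
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # hypothetical; tuning it trades off false positives against misses


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face-embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def flag_candidates(live_embedding: np.ndarray,
                    watchlist: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    """Return watchlist entries whose similarity exceeds the threshold.

    The software makes no decision itself: anything returned here is only
    an alert for an officer to review and act on in person.
    """
    alerts = []
    for person_id, stored_embedding in watchlist.items():
        score = cosine_similarity(live_embedding, stored_embedding)
        if score >= SIMILARITY_THRESHOLD:
            alerts.append((person_id, score))
    return sorted(alerts, key=lambda pair: pair[1], reverse=True)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Random vectors stand in for the output of a real face-embedding model.
    watchlist = {f"wanted-{i}": rng.normal(size=128) for i in range(5)}
    frame_embedding = rng.normal(size=128)
    for person_id, score in flag_candidates(frame_embedding, watchlist):
        print(f"ALERT: possible match with {person_id} (score {score:.2f}) - refer to officer")
```

In a deployment of this kind, the threshold choice is the critical design decision: set low, it produces the high rate of false positive alerts seen in the trials; set high, wanted individuals pass unflagged.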
Police said any potential “alerts” would be kept for one month, while watchlists would be wiped immediately after each operation.