Metropolitan Police's facial recognition technology 98% inaccurate, figures show
'Intrinsically Orwellian' systems must be scrapped, campaigners say as biometrics commissioner brands them 'not yet fit for use'
Facial recognition software used by the UK’s biggest police force has returned false positives in more than 98 per cent of alerts generated, The Independent can reveal, with the country’s biometrics regulator calling it “not yet fit for use”.
The Metropolitan Police’s system has produced 104 alerts, of which only two were later confirmed to be positive matches – a false positive rate of just over 98 per cent – a freedom of information request showed. In its response, the force said it did not consider the inaccurate matches “false positives” because alerts were checked a second time after they occurred.
Facial recognition technology scans people in a video feed and compares their images to pictures stored in a reference library or watch list. It has been used at large events such as the Notting Hill Carnival and a Six Nations rugby match.
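As a rough illustration of how watch-list matching works in general – a minimal sketch using the open-source face_recognition Python library, not the systems trialled by the Met or South Wales Police; the image files and tolerance value are hypothetical:

```python
# A minimal sketch of watch-list face matching, for illustration only.
# This uses the open-source face_recognition library, NOT the actual
# systems trialled by the Met or South Wales Police; all file names
# and the tolerance value here are hypothetical.
import face_recognition

# Encode reference photos of the people on the watch list.
watchlist = [
    face_recognition.face_encodings(face_recognition.load_image_file(path))[0]
    for path in ["suspect_a.jpg", "suspect_b.jpg"]  # hypothetical photos
]

# Take one frame from a video feed (a still image here, for simplicity).
frame = face_recognition.load_image_file("crowd_frame.jpg")  # hypothetical

# Detect each face in the frame and compare it against the watch list.
for encoding in face_recognition.face_encodings(frame):
    # The tolerance sets the trade-off: loosening it produces more alerts,
    # and with them more false positives of the kind reported above.
    if any(face_recognition.compare_faces(watchlist, encoding, tolerance=0.6)):
        print("Alert: possible watch-list match (to be verified by an officer)")
```

An alert from a system like this is only a candidate match, which is why both forces say officers review alerts before acting on them.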
The system used by another force, South Wales Police, has returned more than 2,400 false positives in 15 deployments since June 2017. The vast majority of those came during that month’s Uefa Champions League final in Cardiff, and overall only 234 alerts – fewer than 10 per cent – were correct matches.
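The headline rates follow directly from those counts; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the false positive rates implied by the
# figures reported above (SWP's 2,400 is a lower bound: "more than 2,400").
met_alerts, met_correct = 104, 2
swp_false, swp_correct = 2400, 234

met_false_rate = (met_alerts - met_correct) / met_alerts
swp_false_rate = swp_false / (swp_false + swp_correct)

print(f"Met: {met_false_rate:.1%} of alerts false")  # ~98.1%
print(f"SWP: {swp_false_rate:.1%} of alerts false")  # ~91.1%, i.e. under 10% correct
```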
Both forces are trialling the software.
The UK’s biometrics commissioner, Professor Paul Wiles, told The Independent that legislation to govern the technology was “urgently needed”.
He said: “I have told both police forces that I consider such trials are only acceptable to fill gaps in knowledge and if the results of the trials are published and externally peer-reviewed. We ought to wait for the final report, but I am not surprised to hear that accuracy rates so far have been low as clearly the technology is not yet fit for use.
“In terms of governance, technical development and deployment is running ahead of legislation and these new biometrics urgently need a legislative framework, as already exists for DNA and fingerprints.
“The Home Office has promised to publish a biometric strategy in June and I trust that this will propose a legislative framework. It is important in terms of public trust that the public are clear when their biometrics might be taken and what they might be used for, and that parliament has decided those rules.”
But a Home Office spokesman admitted this week that the department could not say when the long-delayed biometrics strategy would be published.
Campaigners said the “intrinsically Orwellian” facial recognition software should be scrapped, while a senior academic warned that governments faced “grave challenges” in preventing potential abuse of the technology.
Silkie Carlo, director of the Big Brother Watch pressure group, which is to launch a campaign on the issue on Tuesday, said: “Police must immediately stop using real-time facial recognition if they are to stop misidentifying thousands of innocent citizens as criminals.
“It is an intrinsically Orwellian police tool that has resulted in ordinary people being stopped and asked for their ID to prove their innocence.
“It is alarming and utterly reckless that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to basic democratic freedoms. It must be dropped.”
Tao Zhang, a senior lecturer at Nottingham Trent University, told The Independent that a lack of open debate about facial recognition technology “could clearly be exploited by an authoritarian state for purpose of political control, as the case of China illustrates”.
While checks and balances existed in democracies like Britain, she added, “with such a rapidly developing technology, there is danger that public policy may not keep pace”.
Dr Zhang added: “From medical research, healthcare to crime control and many other fields, facial recognition potentially has huge benefits, but it also imposes grave challenges for the government to prevent commercial and political exploitation of it for illegal acts.”
The Met told The Independent no end date for its experiment had been set. The force said it had made no arrests through the system, and deleted images involved in false positive matches within 30 days of the error. Images that do not generate alerts are “immediately” deleted, a spokesman said.
At last year’s Notting Hill Carnival, however, one person was reportedly detained erroneously following the use of facial recognition. The Met insisted the person was not technically arrested, and was instead released when officers realised they had already been dealt with for a public order offence, Sky News reported at the time.
The Met’s spokesman said: “The equipment was used at the past two Notting Hill Carnivals and at the 2017 Remembrance Sunday service to assess if it could assist police in identifying known offenders in large events, in order to protect the wider public.
“While we are trialling this technology we have engaged with the Mayor’s Office for Policing and Crime ethics panel, Home Office biometrics and forensics ethics panel, surveillance camera commissioner, the information commissioner, the biometrics commissioner, and Big Brother Watch. Liberty was invited to observe its use at the carnival last year.”
South Wales Police said it operated two systems – one called Locate, the real-time software to which the false positives figure relates, and the other Identify. Identify is used more than 100 times a month to help track down criminals, it said, while Locate is deployed “when deemed proportionate”, such as at major sporting events.
Locate’s image thumbnails are deleted within 31 days, the force said in a statement. That system has contributed to the arrest of 24 people since last June, while Identify was involved in 450 detentions.
A SWP spokeswoman added: “The use of automated facial recognition (AFR) in South Wales Police has to be for a policing purpose and all enquiries have to [be] deemed as being proportionate for the intended purpose.
“[It] is still in project phase and as such there is a robust governance structure around its use with bi-monthly project boards chaired by chief officers.
“Running in parallel to this is the AFR strategic partnership board, which includes representatives from strategic partners.
“The deployment and use of AFR is governed by the Protection of Freedoms Act 2012 with oversight provided by the surveillance camera commissioner.”
In an interview with The Independent, Tony Porter, the surveillance camera commissioner, said he was concerned by the number of false positives AFR technology produced.
He said: “The cause of concern goes right across the whole use of a surveillance camera. I’ve got concerns about the quality of the technology.
“That they could be discriminatory against race, sexual orientation and even age causes me concern.
“I’ve got concerns about the quantity of false positives. I’ve got concerns that the commercial sector may be using it without a proper policy of interdiction.”
And, due to delays at the Home Office, police forces were taking the lead in developing a governance structure for facial recognition, he said.
For that reason they must ensure they were transparent and accountable in their use of the technology, and that the public could understand it, he added. “There is no space, in this area, for covert use of this type of equipment that’s being used in a public space.”
But eventually, “I think there needs to be much more of a framework put around this equipment because it’s very invasive”, Mr Porter said, calling for a “proper, robust and effective regulation regime”.
Nevertheless, he said both SWP and the Met had been working with him on the issue and that their trials were legitimate. “I firmly believe that the police must use new technology. They’re not Luddites. There’s not a regulatory framework and to ask them to desist ... They’ve already been waiting five years. I don’t blame the police for piloting it.”
Mr Porter said a council including him, police chiefs and civil liberties groups was due to meet later this month and that facial recognition would be the main item on the agenda.