Facial recognition may be in its infancy but soon enough it will raise difficult questions for us all

We need a more robust framework for an emerging technology with much potential for harm and error

Monday 17 December 2018 18:18 GMT
An unmarked police van carrying facial recognition cameras and software on deployment in London's West End on 17 December 2018 (Lizzie Dearden)

The main criticism of the widespread use of facial recognition technology by the police, now being trialled by the Metropolitan Police in London, is that it doesn’t actually work. Paradoxically, this disarms the many critics of the system, because a surveillance technique cannot be deemed a threat to liberty if it achieves a pitifully low rate of success.

It is no good for catching criminals, or at least of very limited value; but, by the same token, it poses a fairly limp threat to personal privacy. Wearing a comedy Groucho Marx disguise would enable even the most vicious terrorist to evade detection, if not ridicule. The fact that the success rate is around 2 per cent is, if anything, remarkable given the circumstances.

All that, though, is to miss the point. Before long facial recognition software could grow far more reliable and sophisticated. So it requires some thought and effort now both to control its use and win public support.

Mass screenings, of the type currently underway in Oxford Street, are wrong in principle if they are covert rather than overt and there is inadequate public consultation or explanation of what is going on. This is certainly the case with the Met’s experiments. Not the least of the possible problems is the preference among some Muslim people for their images not to be captured in photography.

Although the trial was announced by way of a press release, the many shoppers traipsing round Oxford Street whose facial signatures were being registered seemed unaware of what was going on. They may or may not object to it, on a variety of grounds, innocent or nefarious. However, it would be wrong, again in principle, for such a procedure to be undertaken without people knowing and without good cause, such as the threat of an imminent terror attack. It cannot be right that the Met is conducting these experiments in a regulatory vacuum, especially when there is so much justified concern about the privacy of personal details held in digital form.

There is a case to be made for the targeted use of such screening, provided it has the consent of those taking part. Taylor Swift, for example, is reported to use the software in order to screen potentially dangerous stalkers from her concerts. That, in principle, is no different to a doorkeeper with an especially good memory preventing troublemakers from entering a nightclub, or a football club using mugshots to stop hooligans entering its grounds.

The country, obviously, has many more urgent issues to settle before facial recognition becomes anything like a routine tool in the armoury of law enforcement and counter-terrorism. However, it is not too early for the authorities themselves to develop a more robust set of guidelines than exists at the moment, for both private and public sector users of the software.

The opportunities for snooping, misuse and blackmail are obvious. Plain-clothes officers, unmarked vans and covert operations generally are not likely to reassure the public. Here, as in so many areas, the law is failing to keep up with rapidly advancing technology. Meantime, criminals will continue to evade capture, using light cosmetic surgery and simple disguises to defy the best efforts of the authorities. Like most issues in policing, facial recognition is a cat and mouse game between cops and criminals.
