
Face-checking software fails at airports

Charles Arthur, Technology Editor
Wednesday 29 May 2002 00:00 BST


Tests of face-recognition software, touted as an ideal way to identify potential hijackers at airports, have been a failure.

A trial in Florida of a system made by Visionics, which is also used by Newham council in east London, found that on average it wrongly identified 1 per cent of people as being on a "wanted" list. The test database held just 250 pictures; a realistic database of potential criminals would contain thousands.

Such a "false positive" rate would mean 50 people being wrongly stopped at a flight gate or desk that handled 5,000 people a day. Heathrow, with its four terminals, handles 175,000 passengers a day, which would translate into 1,750 people being stopped after wrong identification, causing delays and chaos.
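The arithmetic behind those figures can be sketched in a few lines. The 1 per cent rate and the passenger volumes are the article's own numbers; the helper function is purely illustrative:

```python
# Illustrative sketch of the article's false-positive arithmetic:
# a 1 per cent false-positive rate applied to daily passenger volumes.
def wrongly_stopped(passengers_per_day: int, false_positive_rate: float = 0.01) -> int:
    """Expected number of innocent passengers flagged per day."""
    return round(passengers_per_day * false_positive_rate)

print(wrongly_stopped(5_000))    # 50 at a single busy gate or desk
print(wrongly_stopped(175_000))  # 1,750 across Heathrow's four terminals
```

Note that this assumes every passenger is scanned and that false positives occur independently at a flat rate; real-world rates would also vary with lighting and pose, as the trial documents describe below.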

Documents obtained under freedom of information laws by the American Civil Liberties Union also revealed that halfway through the tests the system was catching only half of the people it should have: 50 per cent of "wanted" faces got through unrecognised.

The system works by taking 80 measurements of facial structures, such as the distance between the eyes and the cheekbones. From these it builds a digital "faceprint", which the company says can distinguish one person from another like a fingerprint.
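One simple way to picture how such measurement-based matching might work is to treat each "faceprint" as a vector of numbers and compare vectors by distance against a threshold. This is an assumption for illustration only: the article says the system takes 80 measurements, but Visionics' actual matching algorithm is proprietary and not described here.

```python
import math

# Illustrative only: treat a "faceprint" as a vector of 80 facial
# measurements and compare two faceprints by Euclidean distance.
# This is NOT Visionics' actual algorithm, which is not described
# in the article.
def distance(faceprint_a: list, faceprint_b: list) -> float:
    """Euclidean distance between two measurement vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(faceprint_a, faceprint_b)))

def is_match(faceprint_a: list, faceprint_b: list, threshold: float = 1.0) -> bool:
    """Declare a match when the faceprints are closer than the threshold."""
    return distance(faceprint_a, faceprint_b) < threshold

print(is_match([0.5] * 80, [0.5] * 80))  # identical measurements match
print(is_match([0.5] * 80, [0.9] * 80))  # dissimilar measurements do not
```

The threshold choice is what drives the trade-off the company describes below: lower it and fewer innocent people are flagged, but more "wanted" faces slip through; raise it and the reverse happens.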

Visionics insisted yesterday that the "false positive" rate was a natural corollary of the system's detection method. Frances Zelazny, a spokeswoman, said: "There will always be a compromise between falsely accepting someone as OK and falsely rejecting them. But if this system had been used on 11 September and caught nine of the [19] hijackers, that might have stopped the whole operation. So would it have been a failure?"

The documents showed that airport staff reported many problems with natural variations in the lighting, orientation and dress of the people it scanned. Head movement also had a significant effect, with "substantial loss in matching" if people looked between 15 and 30 degrees away from the camera's focus.

Visionics had been riding high on its promise. On 24 September, its founder, Joseph Atick, said it would create a "nationwide shield" that would apprehend terrorists attempting to board aircraft. The company's stock price soared.

But the system would have needed pictures of the would-be terrorists in its database; without them, even perfect Visionics software would have let the hijackers pass without delay.
