Amazon facial recognition falsely matches more than 100 politicians to arrested criminals
Amazon's controversial facial recognition technology has incorrectly matched more than 100 photos of politicians in the UK and US to police mugshots, new tests have revealed.
Amazon Rekognition uses artificial intelligence software to identify individuals from their facial structure. Customers include law enforcement and US government agencies like Immigration and Customs Enforcement (ICE).
It is not the first time the software's accuracy has been called into question. In July 2018, the American Civil Liberties Union (ACLU) found 28 false matches between US Congress members and pictures of people arrested for a crime.
Technology research and comparison firm Comparitech built upon the ACLU's study by adding UK politicians, uncovering 105 false positives from the 1,959 images tested.
Of the US lawmakers tested, Rekognition incorrectly matched an average of 32 politicians to mugshots when set to a confidence level of 80 per cent - the default setting and the same level used in the ACLU study.
"By those standards, Amazon's face recognition hasn't improved and even performed worse that what the ACLU posited two years ago," Comparitech wrote in a blog post detailing the findings.
"Whether you agree with police use of face recognition or not, one thing is certain: It isn't ready to be used for identification without human oversight."
Amazon did not immediately respond to a request for comment from The Independent.
Following the 2018 study, Amazon said that for law enforcement purposes the confidence level of the software should be set to 99 per cent, rather than the 80 per cent level used by the ACLU.
"The 80 per cent confidence threshold used by the ACLU is far too low to ensure the accurate identification of individuals; we would expect to see false positives at this level of confidence," Matt Wood, Amazon Web Service's VP of artificial intelligence, wrote in a blog post at the time.
"We recommend 99 per cent for use cases where highly accurate face similarity matches are important... WE continue to recommend that customers do not use less than 99 per cent confidence levels for law enforcement matches, and then to only use the matches as one input across others that make sense for each agency."