
Amazon facial recognition falsely matches more than 100 politicians to arrested criminals

Anthony Cuthbertson
Thursday 28 May 2020 12:01 BST
Amazon's facial recognition software incorrectly matched more than 100 photos of US and UK lawmakers with police arrest photos (Comparitech)

Amazon's controversial facial recognition technology has incorrectly matched more than 100 photos of politicians in the UK and US to police mugshots, new tests have revealed.

Amazon Rekognition uses artificial intelligence software to identify individuals from their facial structure. Customers include law enforcement and US government agencies like Immigration and Customs Enforcement (ICE).

It is not the first time the software's accuracy has been called into question. In July 2018, the American Civil Liberties Union (ACLU) found 28 false matches between US Congress members and pictures of people arrested for a crime.

Technology research and comparison firm Comparitech built upon the ACLU's study by adding UK politicians, uncovering 105 false positives from the 1,959 images tested.

Of the US lawmakers tested, Rekognition incorrectly matched an average of 32 politicians to mugshots when set to a confidence level of 80 per cent - the default setting and the same used in the ACLU study.

"By those standards, Amazon's face recognition hasn't improved and even performed worse that what the ACLU posited two years ago," Comparitech wrote in a blog post detailing the findings.

"Whether you agree with police use of face recognition or not, one thing is certain: It isn't ready to be used for identification without human oversight."

Amazon did not immediately respond to a request for comment from The Independent.

Following the 2018 study, Amazon said that for law enforcement purposes the confidence level of the software should be set to 95 per cent, rather than the 80 per cent level used by the ACLU.

"The 80 per cent confidence threshold used by the ACLU is far too low to ensure the accurate identification of individuals; we would expect to see false positives at this level of confidence," Matt Wood, Amazon Web Service's VP of artificial intelligence, wrote in a blog post at the time.

"We recommend 99 per cent for use cases where highly accurate face similarity matches are important... WE continue to recommend that customers do not use less than 99 per cent confidence levels for law enforcement matches, and then to only use the matches as one input across others that make sense for each agency."​
