Emotion analysis technology could lead to discrimination, watchdog warns
The data protection regulator has warned such technology is ‘immature’ and could discriminate against some people.

Businesses should not rely on “immature” biometric technologies that claim to offer emotional analysis of staff, the Information Commissioner’s Office (ICO) has said, warning that such technology could discriminate against some people.
The data protection watchdog’s intervention refers to AI-powered technology that claims to analyse signals such as facial movements and expressions, gait, and even gaze as a way of monitoring the health and well-being of workers.
The ICO said that collecting personal data on subconscious behavioural or emotional responses in an attempt to infer emotions is far riskier than the more traditional biometric technologies used to verify a person’s identity.
It said that algorithms used in these systems, which have not been sufficiently developed to detect emotional cues, could show bias or even discriminate against some people.
The regulator has urged organisations to assess the public risks before using such technology, and warned that it will investigate any firms which do not act responsibly, pose a risk to vulnerable people or fail to meet ICO expectations.
“Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever,” ICO deputy commissioner Stephen Bonner said.
“While there are opportunities present, the risks are currently greater.
“At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgments about a person that are inaccurate and lead to discrimination.
“The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science.
“As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.
“The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.”
The ICO also confirmed it would publish new guidance on biometric technology in spring next year to help businesses better understand how and when to use the technology.