Black and Asian workers ‘more likely to be monitored at work’ than white people
Exclusive: Union warns use of artificial intelligence could further extend inequalities in the jobs market without stronger regulation
Increased workplace surveillance has fuelled warnings that Black and Asian people in Britain face a heightened risk of discrimination at work, The Independent can reveal.
One in three (33 per cent) minority ethnic workers said all their activities in the workplace were monitored, compared with less than one in five (19 per cent) white employees, polling carried out for the Trades Union Congress by Britain Thinks, a strategy consultancy, shows.
Around one in 13 (8 per cent) Black and Asian employees said their exact location and movements within the workplace were monitored using handheld or wearable devices, compared with just one in 25 (4 per cent) white workers, the study found.
Workplace surveillance includes monitoring of emails and calls, webcams on work computers, tracking of when and how much a worker is typing and the calls they make, and tracking of their movements using CCTV and trackable devices.
The study is thought to be the first of its kind to uncover hard data about racialised groups being monitored at work.
Frances O’Grady, general secretary of the Trades Union Congress (TUC), told The Independent: “Worker surveillance took off during the pandemic. And it’s clear that Black, Asian and minority ethnic workers are bearing the brunt.
“Employers have delegated some of their most serious decisions, including recruitment, promotions and sometimes even sacking, to the quirk of an algorithm. That’s not fair or appropriate – and it risks further disadvantaging minority ethnic workers.
“Without modern regulation, surveillance tech will spiral out of control. And it risks further entrenching the inequalities and racism that already exist within our jobs market.
“In failing to bring forward the long-promised employment bill, the government has squandered the opportunity to regulate worker surveillance tech and give workers a right to disconnect.”
The online survey assessed the experiences of 2,209 workers in England and Wales between 14 and 20 December 2021.
Surveillance also covers the use of facial recognition technology to assess mood, expression and performance, and the use of tech to collect data about workers to generate performance ratings.
Black and Asian workers were more likely to report increased surveillance during the pandemic: 40 per cent said they had experienced it, compared with 26 per cent of white workers.
One in 20 (5 per cent) Black and Asian workers said they were subject to facial recognition technology to monitor mood and expression, compared with one in 100 (1 per cent) white workers.
Not only did Black and Asian workers report higher levels of surveillance than their white colleagues, they could also be subject to discriminatory algorithms being used to make decisions about them at work.
This opens up the prospect of further discrimination against Black and Asian workers: facial recognition technology often fails to recognise the faces of darker-skinned people as reliably as it does those of white people, because it is typically tested on the latter.
Just last year, Facebook users who watched a newspaper video featuring Black men were asked if they wanted to “keep seeing videos about primates” by an artificial-intelligence recommendation system, prompting the social media giant to apologise.
This technological blind spot can have a range of damaging consequences, because facial recognition technology is used to make important decisions about people, such as assessing their performance or deciding whether they are offered shifts.
To address the pitfalls of this type of surveillance, the TUC has called for an employment bill that includes a statutory duty to consult unions before an employer introduces artificial intelligence and automated decision-making systems.
It also wants the UK’s data protection laws to make clear that those who implement artificial intelligence at work are liable for any resulting discrimination.
Last year, legal experts warned that “huge gaps” in British law over the use of artificial intelligence at work could lead to “widespread” unfair treatment at work, and called for more legal protection for employees.
It comes as a new study by the Fawcett Society and the Runnymede Trust found that 61 per cent of women from minority groups change their names, hair and clothes to fit in at work, compared with 44 per cent of white women.
The report, Broken Ladders, published on Wednesday, was based on a survey of 2,000 women of colour in UK workplaces, which the groups said is the largest representative survey of women of colour to date.