London King’s Cross estate admits using facial recognition technology
Watchdog expresses concern tools could ‘undermine people’s privacy’
Facial recognition technology is reportedly in use at London’s King’s Cross estate, where it tracks thousands of people across the 67-acre development.
The area, which houses office buildings and shopping areas, was redeveloped by Argent LLP.
“These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public,” the property developer said, according to The Financial Times.
The Information Commissioner’s Office said it had “concerns about the potential for inappropriate use of facial recognition technology”.
When asked on Twitter about the story, the ICO’s official account said it was exploring “ways [facial recognition technology] could undermine people’s privacy.
“Since new data protection laws came into effect on 25 May 2018 there are extra protections for people.
“These require organisations to assess and reduce the privacy risks of using new and intrusive surveillance technologies like automatic facial recognition.
“The ICO is currently looking at the use of facial recognition technology by law enforcement in public spaces and by private sector organisations, including where they are partnering with police forces.
“We’ll consider taking action where we find non-compliance with the law.”
Last month MPs said police forces had to stop using facial recognition technology until a legal framework for its use was set up.
A lack of legislation governing deployment of the technology called into question the legal basis of police trials, the Commons Science and Technology Committee said in a report.
The committee referred to tests carried out by London’s Metropolitan Police and South Wales Police, noting an evaluation of both trials by the Biometrics and Forensics Ethics Group had raised questions about accuracy and bias.
The UK’s biometrics commissioner told The Independent more than a year ago that new laws were “urgently needed” on facial recognition.
The adviser said it was “important in terms of public trust that the public are clear when their biometrics might be taken and what they might be used for, and that parliament has decided those rules”.
London’s Canary Wharf is also reportedly seeking to trial facial recognition technology.