Police testing technology to 'assess the risk of someone committing a crime' in UK
Critics say 'Minority Report-style policing' must be stopped
Police are testing technology that aims to “assess the risk of someone committing a crime or becoming a victim” in the UK.
The government has pledged £5m of further funding to develop the National Data Analytics Solution (NDAS), which it hopes will be rolled out across England and Wales.
The system is entering its second year of testing by West Midlands Police and has so far used police data on knife and gun offences to identify patterns and common traits among perpetrators.
“NDAS analyses large volumes of police-held data to assess the risk of someone committing a crime or becoming a victim,” a Home Office spokesperson said. “The programme is designed to support police officers and does not replace their decision making.”
Officials said it was also processing crime reports and intelligence on modern slavery in an attempt to identify risk factors and networks.
The Independent understands that West Midlands Police are not yet using the technology as a basis for making arrests or interventions.
The campaign group Big Brother Watch said the technology risked “criminalising innocent people” and undermining the presumption of innocence.
“It’s shocking that the Home Office is squandering millions from the public purse on this dystopian predictive policing system,” said legal and policy officer Griff Ferris. “There is no public appetite for Minority Report-style policing in this country. It should be scrapped immediately.”
Mr Ferris highlighted data showing that black people are already nine times more likely to be stopped and searched than white people and said West Midlands Police’s ethics committee had raised “serious legal and ethical concerns” about the new technology.
The Home Office said the force was working with experts and non-governmental organisations to ensure “robust ethical oversight” in the wake of legal challenges and investigations over police trials of automatic facial recognition.
“Once fully tested, it is hoped NDAS would be made available to forces in England and Wales who want to use it to improve their performance and to protect the public,” a spokesperson added.
The programme was awarded £4.5m government funding in 2018-19 and is now being given another £5m from the Police Transformation Fund.
Superintendent Nick Dale, who leads on NDAS for West Midlands Police, said officers were still at an early stage of identifying how machine learning technology can best be used.
“This technology has the potential to help us understand modern slavery networks – the hidden crime within our communities – so much better, as well as the problems that lead to serious violence that blights communities and affects the lives of victims and perpetrators,” he added. “It is really important that our work is scrutinised independently from an ethical point of view, and that technology will never replace professional judgement or affect the police’s accountability for our actions.”
A 2017 report by the Royal United Services Institute for Defence and Security Studies (RUSI) said British forces already have access to huge amounts of data but lack the capability to use it.
But the report warned that machine learning systems can “reproduce the inherent biases present in the data they are provided with” and assess ethnic and religious minorities as posing an increased risk.
“Acting on these predictions will then result in those individuals being disproportionately targeted by police action, creating a ‘feedback loop’ by which the predicted outcome simply becomes a self-fulfilling prophecy,” RUSI warned.
Some forces have been testing different software and artificial intelligence tools, but have been hampered by outdated technology, fragmented national systems and a lack of national leadership on the controversial issue.
Greater Manchester Police developed its own “predictive crime mapping” software in 2012 and Kent Police has been using a system called PredPol since 2013.
The programmes aim to identify where and when offences will take place, so that officers can be sent on targeted patrols aimed at preventing them.
Durham Constabulary has been developing an artificial intelligence-based system to evaluate the risk of convicts reoffending, called the Harm Assessment Risk Tool (HART).
It puts information on a person’s past offending history, age, postcode and other background characteristics through algorithms that then classify them as a low, medium or high risk.
Sajid Javid, the home secretary, said: “I fully support the police embracing innovative new technology in the fight against crime and to protect the most vulnerable victims. Anything we can do to stay one step ahead of the criminals should be welcomed – providing it is rigorously tested and ethically sound.”