
Police testing technology to 'assess the risk of someone committing a crime' in UK

Critics say 'Minority Report-style policing' must be stopped

Lizzie Dearden
Home Affairs Correspondent
Wednesday 17 July 2019 07:16 BST
The Home Office hopes the technology can be rolled out through England and Wales to help police officers prevent crime (Getty)

Police are testing technology that aims to “assess the risk of someone committing a crime or becoming a victim” in the UK.

The government has pledged £5m of further funding to develop the National Data Analytics Solution (NDAS), which it hopes will be rolled out across England and Wales.

The system is entering its second year of testing by West Midlands Police and has so far used police data on knife and gun offences to identify patterns and common traits among perpetrators.

“NDAS analyses large volumes of police-held data to assess the risk of someone committing a crime or becoming a victim,” a Home Office spokesperson said. “The programme is designed to support police officers and does not replace their decision making.”

Officials said it was also processing crime reports and intelligence on modern slavery in an attempt to identify risk factors and networks.

The Independent understands that West Midlands Police are not yet using the technology as a basis for making arrests or interventions.

The campaign group Big Brother Watch said the technology risked “criminalising innocent people” and undermining the presumption of innocence.

“It’s shocking that the Home Office is squandering millions from the public purse on this dystopian predictive policing system,” said legal and policy officer Griff Ferris. “There is no public appetite for Minority Report-style policing in this country. It should be scrapped immediately.”

Mr Ferris highlighted data showing that black people are already nine times more likely to be stopped and searched than white people and said West Midlands Police’s ethics committee had raised “serious legal and ethical concerns” about the new technology.

The Home Office said the force was working with experts and non-governmental organisations to ensure “robust ethical oversight” in the wake of legal challenges and investigations over police trials of automatic facial recognition.

“Once fully tested, it is hoped NDAS would be made available to forces in England and Wales who want to use it to improve their performance and to protect the public,” a spokesperson added.

The programme was awarded £4.5m government funding in 2018-19 and is now being given another £5m from the Police Transformation Fund.

Superintendent Nick Dale, who leads on NDAS for West Midlands Police, said officers were still at an early stage in identifying how machine learning technology can best be used.

Police are trialling controversial facial recognition technology in Stratford

“This technology has the potential to help us understand modern slavery networks – the hidden crime within our communities – so much better, as well as the problems that lead to serious violence that blights communities and affects the lives of victims and perpetrators,” he added. “It is really important that our work is scrutinised independently from an ethical point of view, and that technology will never replace professional judgement or affect the police’s accountability for our actions.”

A 2017 report by the Royal United Services Institute for Defence and Security Studies (RUSI) said British forces already had access to huge amounts of data but lacked the capability to use it.

But the report warned that machine learning can “reproduce the inherent biases present in the data they are provided with” and assess ethnic and religious minorities as an increased risk.

“Acting on these predictions will then result in those individuals being disproportionately targeted by police action, creating a ‘feedback loop’ by which the predicted outcome simply becomes a self-fulfilling prophecy,” RUSI warned.

Some forces have been testing different software and artificial intelligence tools, but have been hampered by outdated technology, fragmented national systems and a lack of national leadership on the controversial issue.

Greater Manchester Police developed its own “predictive crime mapping” software in 2012 and Kent Police has been using a system called PredPol since 2013.

The programmes aim to identify where and when offences will take place, so that officers can be sent on targeted patrols aimed at preventing them.


Durham Constabulary has been developing an artificial intelligence-based system to evaluate the risk of convicts reoffending, called the Harm Assessment Risk Tool (HART).

It puts information on a person’s past offending history, age, postcode and other background characteristics through algorithms that then classify them as a low, medium or high risk.

Sajid Javid, the home secretary, said: “I fully support the police embracing innovative new technology in the fight against crime and to protect the most vulnerable victims. Anything we can do to stay one step ahead of the criminals should be welcomed – providing it is rigorously tested and ethically sound.”
