Criminal justice software algorithm used across the US is biased against black inmates, study finds

A tool used to predict how likely a prisoner is to reoffend is skewed in favour of white defendants


Rachael Revesz
New York
Monday 27 June 2016 17:45 BST
The algorithm often incorrectly flagged white defendants as less likely to commit another crime (Reuters)

An algorithm that is used across the US to decide how likely prisoners are to reoffend is biased against black people, an investigation by ProPublica has found.

The COMPAS tool (Correctional Offender Management Profiling for Alternative Sanctions), created by Northpointe, plays a role across the US in determining when criminal defendants should be released.

Northpointe’s “risk assessment” score found that black people were almost twice as likely as white people to be assessed as “higher risk” when it came to the probability that they would commit another crime.

The investigative study compared the predictions for 10,000 inmates in Broward County, Florida, with the rate of reoffending that actually occurred over a two-year period.
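An audit of this kind boils down to comparing each defendant’s risk label with what actually happened afterwards, then checking whether error rates differ by group. The sketch below is purely illustrative (it is not ProPublica’s code, and the records are made-up values, not Broward County data): it computes, for two hypothetical groups, the false-positive rate, that is, the share of people who did not reoffend but were nonetheless flagged as higher risk.

```python
# Illustrative only: hypothetical records of (group, flagged_higher_risk, reoffended).
# A real audit would use actual risk scores and two-year recidivism outcomes.
records = [
    ("A", True, False), ("A", True, True), ("A", False, False),
    ("B", True, True), ("B", False, False), ("B", False, True),
    ("A", True, False), ("B", True, True),
]

def false_positive_rate(rows):
    """Among people who did NOT reoffend, what share were flagged 'higher risk'?"""
    non_reoffenders = [r for r in rows if not r[2]]
    if not non_reoffenders:
        return 0.0
    return sum(1 for r in non_reoffenders if r[1]) / len(non_reoffenders)

for group in ("A", "B"):
    rows = [r for r in records if r[0] == group]
    print(group, round(false_positive_rate(rows), 2))
```

A gap between the two groups’ false-positive rates is exactly the kind of disparity the investigation reported: one group being wrongly flagged as future criminals more often than the other.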

Questions such as “Was one of your parents ever sent to jail or prison?” and “How often did you get in fights while at school?” avoided asking about race outright, but the answers often correlated with it.

Other questions assessed education levels and employment.

Black defendants were found to be 77 per cent more likely to be rated at higher risk of committing a violent crime in the future, and 45 per cent more likely to be rated at higher risk of committing any kind of crime in the future.

ProPublica also found that only 20 per cent of Northpointe’s predictions of future violent crime proved accurate.

Northpointe stated that it “does not agree that the results of [their] analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model.”

Northpointe’s algorithm is just one of dozens of similar tools on the market, which are increasingly used by judges and by probation and parole officers, according to ProPublica.
