
Senators question DOJ funding for AI-powered policing tech

Some Capitol Hill Democrats and civil rights advocates are concerned about how police and prosecutors increasingly use algorithm-powered technologies that may amplify racial bias

Via AP news wire
Friday 20 August 2021 20:09 BST
AP Investigation Tracked (Copyright 2021 The Associated Press. All rights reserved)


A Democratic senator said the U.S. Justice Department needs to look into whether the algorithm-powered police technologies it funds contribute to racial bias in law enforcement and lead to wrongful arrests.

Sen. Ron Wyden, an Oregon Democrat, was responding to an investigation by The Associated Press published Thursday about the possibility of bias in courtroom evidence produced by an algorithm-powered technology called ShotSpotter. The system, which can be funded by Justice Department grants, is used by law enforcement in more than 110 U.S. communities to detect and respond to gunshots.

“While there continues to be a national debate on policing in America, it’s become increasingly clear that algorithms and technologies used during investigations, like ShotSpotter, can further racial biases and increase the potential for sending innocent people to prison,” Wyden said.

Chicago prosecutors used ShotSpotter evidence to jail Michael Williams, 65, for a year on a first-degree murder charge for allegedly shooting a man inside his car. ShotSpotter said its system can't be relied on to detect gunshots inside cars. Last month, a judge dismissed the case against Williams at the request of prosecutors, who said they had insufficient evidence.

“Fundamentally, these tools are outsourcing critical policing decisions, leaving the fate of people like Michael Williams to a computer,” Wyden said.

In Chicago, where Williams was jailed, community members rallied in front of a police station on Thursday, demanding the city end its contract with ShotSpotter, a system they said “creates a dangerous situation where police treat everyone in the alert area as an armed threat.”

The Chicago Police Department on Friday defended the technology in response to calls to end the city’s ShotSpotter contract. Chicago is ShotSpotter’s largest customer.

“ShotSpotter has detected hundreds of shootings that would have otherwise gone unreported,” it said in a statement emailed to the AP, adding that the technology is just one of many tools the department relies on “to keep the public safe and ultimately save lives.”

It said real-time ShotSpotter alerts about gunshots mean officers respond faster and more consistently than when depending on someone to call 911 to report gunfire.

“The system gives police the opportunity to reassure communities that law enforcement is there to serve and protect them and helps to build bridges with residents who wish to remain anonymous,” the department said.

ShotSpotter uses a secret algorithm to analyze noises detected by sensors mounted on light poles and buildings. Employees at the company’s Incident Review Centers in Washington, D.C., and Newark, California, look at the waveforms and listen to sounds that the computer deems possible gunshots to make a final determination before alerting police.
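The two-stage process described above, in which a classifier first flags candidate sounds and human reviewers then confirm them before police are alerted, can be illustrated with a minimal, purely hypothetical sketch. This is not ShotSpotter's proprietary code; every name, score and threshold below is invented for illustration.

```python
# Illustrative sketch of a generic human-in-the-loop acoustic alert pipeline.
# NOT ShotSpotter's actual system: all names, scores and thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class AcousticEvent:
    sensor_id: str         # e.g. a sensor mounted on a light pole or building
    confidence: float      # machine score that the sound is a gunshot (0..1)
    audio_note: str        # placeholder for the clip a reviewer would inspect


def machine_flags_possible_gunshot(event: AcousticEvent, threshold: float = 0.6) -> bool:
    """First stage: the classifier flags sounds it deems possible gunshots."""
    return event.confidence >= threshold


def human_confirms(event: AcousticEvent) -> bool:
    """Second stage: a reviewer looks at the waveform and listens before any alert goes out.
    Simulated here with a score check; in practice this is a human judgment call."""
    return event.confidence >= 0.8


def process(events: list[AcousticEvent]) -> list[AcousticEvent]:
    """Only events flagged by the machine AND confirmed by a reviewer become police alerts."""
    return [e for e in events if machine_flags_possible_gunshot(e) and human_confirms(e)]


if __name__ == "__main__":
    sample = [
        AcousticEvent("pole-14", 0.92, "sharp impulsive sound"),
        AcousticEvent("roof-03", 0.55, "possible firework"),
    ]
    for alert in process(sample):
        print(f"Alert dispatched for sensor {alert.sensor_id}")
```

As the sketch suggests, the final call rests with the reviewer, which is precisely the step civil rights advocates say can introduce bias.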

“The point is anything that ultimately gets produced as a gunshot has to have eyes and ears on it,” said CEO Ralph Clark in an interview. “Human eyes and ears, OK?”

Civil rights advocates say the human reviews can introduce bias.

Wyden said he and seven other Democratic lawmakers are still waiting for a Justice Department response to their April letter raising concerns about federal funds going to local law enforcement agencies to buy artificial intelligence technologies. In addition to Wyden, the letter was signed by Sens. Ed Markey and Elizabeth Warren of Massachusetts, Alex Padilla of California, Raphael Warnock of Georgia, and Jeff Merkley of Oregon, and U.S. Reps. Yvette Clarke of New York and Sheila Jackson Lee of Texas.

“These algorithms, which automate policing decisions, not only suffer from a lack of meaningful oversight regarding whether they actually improve public safety, but it is also likely they amplify biases against historically marginalized groups,” they wrote to Attorney General Merrick Garland.

The Justice Department did not respond to AP's request for comment.

___

Mendoza reported from Newark, California.
