Self-driving cars more likely to drive into black people, study claims

New study suggests autonomous vehicles might be racist

Anthony Cuthbertson
Wednesday 06 March 2019 13:58 GMT
The cameras and sensors on self-driving cars may be better at detecting pedestrians with lighter skin tones (AFP/Getty Images)


Technology used in self-driving cars has a racial bias that makes autonomous vehicles more likely to drive into black people, a new study claims.

Researchers at the Georgia Institute of Technology found that state-of-the-art detection systems, such as the sensors and cameras used in self-driving cars, are better at detecting people with lighter skin tones.

That makes the vehicles less likely to spot black people, and therefore less likely to stop before crashing into them, the authors note.

The researchers said they undertook the study after observing higher error rates for certain demographics by such systems.

Tests on eight image-recognition systems found this bias held true, with accuracy on average five per cent lower for people with darker skin.

To test the hypothesis, the scientists divided a large pool of pedestrian images into two groups of lighter and darker skin using the Fitzpatrick scale, a standard way of classifying skin tones.

Even when the researchers accounted for the time of day or for obstructions blocking the image-detection systems' view, the gap in average accuracy remained.
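As a rough illustration of the comparison the researchers describe, the evaluation can be thought of as tallying detection rates separately for the two Fitzpatrick groups and comparing them. The sketch below is hypothetical: the records, the I-III/IV-VI cut-off and the code are not the researchers' own data or tools.

```python
# Hypothetical sketch of the comparison described in the study: tally how often
# pedestrians in each Fitzpatrick-scale group are detected and compare the rates.
# The records below are invented for illustration only.
from collections import defaultdict

# Each record: (fitzpatrick_type, detected), where types I-III are grouped as
# "lighter" and IV-VI as "darker", mirroring the study's two-group split.
records = [
    (1, True), (2, True), (3, False), (2, True),
    (5, True), (6, False), (4, False), (5, True),
]

def skin_group(fitzpatrick_type: int) -> str:
    """Map a Fitzpatrick type (1-6) onto the two groups used in the comparison."""
    return "lighter (I-III)" if fitzpatrick_type <= 3 else "darker (IV-VI)"

hits = defaultdict(int)
totals = defaultdict(int)
for ftype, detected in records:
    group = skin_group(ftype)
    totals[group] += 1
    hits[group] += int(detected)

for group in sorted(totals):
    accuracy = hits[group] / totals[group]
    print(f"{group}: detection accuracy {accuracy:.0%} ({hits[group]}/{totals[group]})")
```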

"We hope this study provides compelling evidence of the real problem that may arise if this source of capture bias is not considered before deploying these sort of recognition models," the study concluded.

AI researcher Kate Crawford, who was not involved in the study, highlighted the dangers of such systems if these issues are not addressed by the companies developing self-driving cars.

“Pedestrian deaths by self-driving cars are already here – but they're not evenly distributed,” she tweeted.

Other AI experts responded to her tweet by pointing out that the paper did not test the datasets or models used by autonomous vehicle developers, so its findings may not reflect the accuracy of real-world systems.

“In an ideal world, academics would be testing the actual models and training sets used by autonomous car manufacturers,” she responded.

“But given those are never made available (a problem in itself), papers like these offer strong insights into very real risks.”

It is not the first time that machine learning and vision systems have been shown to have an in-built bias.

In January, researchers at the Massachusetts Institute of Technology (MIT) found that Amazon's facial recognition software Rekognition had a harder time identifying a person's gender if they were female or darker-skinned.
