
Self-driving Uber software behaved like a human before crashing, suggests eye-witness

'The other person just wanted to beat the light and kept going'

Aatif Sulleyman
Thursday 30 March 2017 13:34 BST
Uber’s self-driving cars have previous when it comes to questionable traffic light conduct (FRESCO NEWS/Mark Beach/Handout via REUTERS)

New details have emerged about the recent Arizona crash involving a self-driving Uber, suggesting that the vehicle’s software may have made a risky decision seconds before the collision.

According to a police report, the self-driving Volvo had been travelling along a wide boulevard with a 40mph speed limit.

It was in self-driving mode at the time and was carrying two ‘safety’ drivers, who say it was travelling at 38mph.

The traffic lights the Uber was approaching turned yellow as it entered an intersection, where a Honda travelling in the opposite direction turned left across its path.

The two vehicles collided, and the Uber was flipped onto its side.

The police report states that the driver of the Honda hadn’t seen the oncoming Uber, and Patrick Murphy, one of the safety drivers in the Uber, said a blind spot caused by traffic meant there was no time to react.

Police have said that the Uber was not at fault, but an eye-witness claims otherwise, blaming the self-driving car for trying to beat the light.

“It was the other driver's fault for trying to beat the light and hitting the gas so hard,” Brayan Torres told police in a statement, reports Bloomberg. “The other person just wanted to beat the light and kept going.”

Such accounts aren’t always reliable, but Uber’s self-driving cars have previous when it comes to questionable traffic light conduct.

One of them ran a red light in San Francisco last year, an incident that Uber blamed on human error, though two employees said it had been in self-driving mode at the time.

The Arizona incident raises questions about how Uber's software reacted to the traffic lights.

The thought of its sensors failing to register the changing signals is frightening enough, but the possibility that it chose to speed up to avoid waiting at a red light is far more worrying.
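That distinction, between a perception failure and a deliberate stop-or-go choice, can be made concrete. The sketch below is a purely hypothetical Python illustration of the kind of yellow-light check an autonomous planner might run; the function name, the 3m/s² “comfortable braking” figure and the 25m intersection length are assumptions made for illustration, not anything drawn from Uber’s actual software.

```python
# Hypothetical illustration only -- not Uber's code. A classic
# "dilemma zone" check a self-driving planner might run when a
# light turns yellow: can the car brake to a halt before the stop
# line, and can it clear the intersection before the light goes red?

def yellow_light_decision(speed_mps: float,
                          distance_to_line_m: float,
                          yellow_remaining_s: float,
                          comfort_decel_mps2: float = 3.0,     # assumed
                          intersection_length_m: float = 25.0  # assumed
                          ) -> str:
    """Return 'stop' or 'proceed', using constant-deceleration
    kinematics: stopping distance = v^2 / (2a)."""
    stopping_distance = speed_mps ** 2 / (2 * comfort_decel_mps2)
    distance_covered_before_red = speed_mps * yellow_remaining_s

    can_stop = stopping_distance <= distance_to_line_m
    can_clear = (distance_covered_before_red
                 >= distance_to_line_m + intersection_length_m)

    if not can_stop:
        # Too close to brake comfortably: the car is committed.
        return "proceed"
    if can_clear:
        return "proceed"
    # Can stop safely but cannot clear before red: stop.
    return "stop"


# The Uber was reportedly doing 38mph, roughly 17m/s; at that speed
# a comfortable 3m/s^2 stop needs about 48m of road.
print(yellow_light_decision(17.0, 30.0, 2.0))  # 'proceed' -- too close to stop
print(yellow_light_decision(17.0, 60.0, 2.0))  # 'stop' -- can brake in time
```

The worrying scenario the article describes would sit in the “proceed” branches: a planner that keeps going, or even speeds up, rather than braking for a yellow.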

According to a New York Times report from February, Uber’s driverless cars have “failed to recognise” six sets of traffic lights during San Francisco tests.

The company is currently trialling its system in Arizona, Pennsylvania and California.
