Tesla autopilot caused car to accelerate before fatal crash, investigators find
Preliminary report on California collision says car steered towards barrier seconds before impact
A Tesla car running on autopilot was accelerating as it hit a barrier on a California highway, killing the driver, a new report has found.
In a preliminary report published on Thursday, the National Transportation Safety Board (NTSB) said the vehicle failed to take evasive action as it sped towards the central reservation seconds before the crash in March.
The incident, which took place on Highway 101 in Mountain View in the heart of Silicon Valley, fatally injured the 38-year-old driver of the Model X, Walter Huang.
Investigators said that during Mr Huang’s 32-minute journey, the car’s autopilot system was engaged four times, including for the final 18 minutes and 55 seconds before the crash.
Tesla has previously sought to place blame on the driver, saying he received several warnings during the trip to put his hands on the wheel and had around five seconds to react before the crash.
The NTSB’s findings confirmed Mr Huang had been given two visual alerts and one audio alert during the trip, although the last of these came more than 15 minutes before the collision.
Crash investigators found the car stopped following the vehicle in front and began steering left towards a barrier at the centre of the road around seven seconds before the crash.
In the three seconds prior to impact, the Tesla’s speed increased from 62mph to almost 71mph, with no braking or evasive steering detected, the report said.
It revealed the high-voltage, lithium-ion battery in the electric car was breached, causing a fire to break out as the wreckage sat by the roadside.
Five days later, the battery reignited as it sat at an impound lot, requiring firefighters to attend and extinguish the blaze.
Investigators stopped short of holding either Tesla or Mr Huang at fault in their preliminary findings, but the report will add to the scrutiny the company faces over its autopilot system.
The first fatal accident involving a Tesla with autopilot engaged took place in May 2016 in Florida, when a driver was killed as his Model S hit an 18-wheel tractor-trailer.
The NTSB found the probable cause of the crash was the truck driver’s failure to yield right of way, combined with the car driver’s inattention and over-reliance on the autopilot technology.
Last month in South Jordan, Utah, a Tesla Model S with autopilot engaged hit a fire truck that had stopped at a red light, striking it at around 60mph without appearing to brake or steer away.
The driver of the car later admitted she had been looking at her phone before the crash.
Tesla’s chief executive, Elon Musk, has previously attacked the media over its coverage of crashes involving the firm’s vehicles, branding some reports on the Utah incident “messed up”.
“Tesla Autopilot does not prevent all accidents – such a standard would be impossible – but it makes them much less likely to occur,” the company said in a statement issued following the Mountain View crash.
“It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.
“No one knows about the accidents that didn’t happen, only the ones that did.
“The consequences of the public not using autopilot, because of an inaccurate belief that it is less safe, would be extremely severe.”