Wife of Tesla recruiter who was 3x drink-drive limit when autopilot car crashed in fireball sues company
Hans Von Ohain died after the car hit a tree and burst into flames, but a passenger was able to escape, the suit says
The widow of a man who died after his Tesla veered off the road and crashed into a tree while he was using its partially automated driving system is suing the carmaker, claiming its marketing of the technology is dangerously misleading.
The Autopilot system prevented Hans Von Ohain from keeping his Tesla Model 3 on a Colorado road in 2022, according to the lawsuit filed by Nora Bass in state court on May 3. Von Ohain died after the car hit a tree and burst into flames, but a passenger was able to escape, the suit says.
Von Ohain was intoxicated at the time of the crash, according to a Colorado State Patrol report. An analysis of his blood-alcohol level in the autopsy report found he was three times the legal limit, the Denver Post reported.
The Associated Press sent an email to Tesla's communications department seeking comment on Friday.
Tesla offers two partially automated systems, Autopilot and a more sophisticated “Full Self Driving,” but the company says neither can drive itself, despite their names.
The lawsuit, which was also filed on behalf of the only child of Von Ohain and Bass, alleges that Tesla, facing financial pressures, released its Autopilot system before it was ready to be used in the real world. It also claims the company has had a “reckless disregard for consumer safety and truth,” citing a 2016 promotional video.
“By showcasing a Tesla vehicle navigating traffic without any hands on the steering wheel, Tesla irresponsibly misled consumers into believing that their vehicles possessed capabilities far beyond reality,” it said of the video.
Last month, Tesla paid an undisclosed amount of money to settle a separate lawsuit that made similar claims, brought by the family of a Silicon Valley engineer who died in a 2018 crash while using Autopilot. Walter Huang's Model X veered out of its lane and began to accelerate before barreling into a concrete barrier located at an intersection on a busy highway in Mountain View, California.
Evidence indicated that Huang was playing a video game on his iPhone when he crashed into the barrier on March 23, 2018. But his family claimed Autopilot was promoted in a way that caused vehicle owners to believe they didn’t have to remain vigilant while they were behind the wheel.
U.S. auto safety regulators pressured Tesla into recalling more than 2 million vehicles in December to fix a defective system that’s supposed to make sure drivers pay attention when using Autopilot.
In a letter to Tesla posted on the agency’s website this week, U.S. National Highway Traffic Safety Administration investigators wrote that they could not find any difference in the warning software issued after the recall and the software that existed before it. The agency says Tesla has reported 20 more crashes involving Autopilot since the recall.