Tesla software recall may head off fight with US regulators
Tesla has issued a recall that automatically sent a software update fixing a safety problem in its electric vehicles
Tesla has issued a recall that automatically sent a software update fixing a safety problem in its electric vehicles, apparently heading off a looming confrontation with U.S. safety regulators.
But recall documents posted on the National Highway Traffic Safety Administration website Tuesday don't address another safety issue specified by the agency when it demanded that Tesla explain why it wasn't doing recalls for safety-related software updates done over the internet.
The recall covers nearly 12,000 Teslas with a glitch in the “Full Self-Driving” software that can make the cars stop for no good reason. The company's paperwork says the problems with automatic emergency braking can increase the risk of other vehicles hitting Teslas from behind.
The recall covers all four Tesla models — the S, X, 3 and Y. Tesla documents say a software update sent on Oct. 23 introduced the glitch.
Company documents say Tesla started getting reports from owners the next day about phantom braking. Within hours, the company says, it canceled further updates or reverted the software to a previous version. That disabled automatic emergency braking on some of the vehicles.
On Oct. 24, the company traced the cause to a communication disconnect between two computer chips. It developed another software update to fix the problem and sent it out on Oct. 25, according to the documents. The company said it voluntarily agreed to do a recall on Oct. 26.
The move appears to show that Tesla now will issue a recall when it pushes out software updates to fix safety issues. It also could set a precedent for other automakers to do the same.
On Oct. 12, regulators sent a letter to Tesla demanding to know why the company didn't recall its vehicles when it sent a software update to fix a problem with its Autopilot partially automated driving system. The update addressed detection of emergency vehicles parked on roads while crews responded to crashes.
NHTSA opened an investigation of Autopilot in August after getting reports of a dozen crashes into emergency vehicles. The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the U.S. since the start of the 2014 model year. In the dozen crashes that are part of the probe, 17 people were injured and one was killed.
Tesla had until Monday to explain why it didn't issue a recall for the Autopilot update. As of early Tuesday, NHTSA had not posted any documents detailing Tesla's response.
The agency said conversations with Tesla continue “to ensure that any safety defect is promptly acknowledged and addressed according to the National Traffic and Motor Vehicle Safety Act.” The statement didn’t say if Tesla responded to the agency’s questions on the Autopilot software update.
Messages were left early Tuesday seeking comment from Tesla.
Tesla did a software update in late September that was intended to improve detection of emergency vehicle lights in low-light conditions. The agency says Tesla is aware that federal law requires automakers to do a recall if they find out that vehicles have safety defects.
Tesla says that Autopilot and “Full Self-Driving” are driver-assistance systems and cannot drive themselves, despite their names. The automaker says drivers have to be ready to intervene at any time.
____
Follow AP Auto Writer Tom Krisher at http://twitter.com/tkrisher