U.S. REGULATORS TIE TESLA’S AUTOPILOT TO MORE THAN A DOZEN FATALITIES, HUNDREDS OF CRASHES

Federal auto-safety regulators have opened an investigation into the adequacy of Tesla’s December recall of 2 million vehicles equipped with Autopilot software, tying the technology to at least 14 fatalities, several dozen injuries and hundreds of crashes.

The National Highway Traffic Safety Administration said in a report published Friday that its examination of Tesla’s Autopilot, a driver-assist system that automates some driving tasks, uncovered a trend of “avoidable crashes involving hazards that would have been visible to an attentive driver.”

NHTSA, the auto industry’s top regulator, shed more light on the safety concerns that have long swirled around Autopilot, providing new information on fatalities and injuries linked to drivers’ use of the controversial feature, which is widely available on Tesla models.

The new probe also marks an escalation of NHTSA’s long-running scrutiny of the technology, which it had been investigating for nearly three years after numerous high-profile crashes, including ones that killed drivers.

On Friday, the regulator said it was closing its earlier probe and opening the new one into the adequacy of the recall remedy, which was deployed through a software update. The recall in December was among Tesla’s largest to date and involved nearly all the vehicles it had sold in the U.S.

In all, NHTSA said it had identified 467 crashes involving Autopilot.

The regulator said its Office of Defects Investigation had identified crashes that occurred after the recall and reviewed results from preliminary tests the agency performed on remedied vehicles.

Tesla issued the recall to address a previous NHTSA investigation into whether the Autopilot program contained a defect that created an unreasonable risk to vehicle safety.

The resulting recall affected more than 2 million Model Y, X, S, 3 and Cybertruck vehicles made since 2012 that were equipped with Autopilot. In updating the vehicle software, Tesla said it had installed new safeguards to prevent driver misuse.

Tesla didn’t respond to a request for comment.

NHTSA offered more details on Friday about the work that led to the recall, saying its effort “showed evidence that Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”

This mismatch resulted in a “critical safety gap” between drivers’ expectations of Autopilot’s safety and the system’s actual capabilities, it added.

NHTSA also compared Autopilot to similar systems deployed by auto-industry rivals, saying it found Tesla’s approach was an “industry outlier.” The agency said the Autopilot name “elicits the idea of drivers not being in control,” while other systems use terms like “assist” or “team” to imply that active supervision is required.

In its latest report, NHTSA took issue with Tesla’s statements that a portion of the recall remedy required opt-in from the owner and could be reversed at the driver’s discretion. It also said some Tesla updates appeared to address Autopilot issues that NHTSA raised without identifying them as remedies.

“This investigation will consider why these updates were not a part of the recall or otherwise determined to remedy a defect that poses an unreasonable safety risk,” the agency said.

Since the Autopilot update was rolled out, regulators have received an unusually high number of complaints about changes made to the controls. Some drivers say warnings have become excessive and are triggered when they perform routine tasks.

Tesla has been beset by safety setbacks recently. Earlier this year, Tesla issued a recall for nearly all of its electric vehicles sold in the U.S. because the font on some visual warning lights was too small.

Several other government agencies have opened their own investigations into Tesla’s Autopilot. The Justice Department and Securities and Exchange Commission, in particular, are examining whether Tesla misled customers and investors with marketing that overstated the technology’s capabilities.

Multiple lawsuits over crashes in which Autopilot was engaged also are pending. This month, the electric-car maker reached a settlement with the family of a driver who died in a 2018 crash.

In that case, Walter Huang, a 38-year-old Apple engineer, died on Highway 101 in California after his Model X sport-utility vehicle crashed into a highway barrier while he was using Autopilot. Terms of the settlement were not disclosed.

Write to Ryan Felton at [email protected] and Dean Seal at [email protected]
