Tesla Inc. filed a recall covering more than 2 million vehicles after the top US auto-safety regulator determined its driver-assistance system Autopilot doesn’t do enough to prevent misuse.
The move is the result of a years-long defect investigation by the National Highway Traffic Safety Administration that will remain open as the agency monitors the efficacy of Tesla’s fixes. A NHTSA spokesperson said the probe found that Tesla’s means for keeping drivers engaged were inadequate.
“Automated technology holds great promise for improving safety, but only when it is deployed responsibly,” NHTSA said Wednesday. “Today’s action is an example of improving automated systems by prioritizing safety.”
Tesla said in its recall report that it expected to start deploying an over-the-air software update to incorporate additional controls and alerts on or shortly after Dec. 12. The carmaker’s shares fell as much as 1.6% to $233.30 before the start of regular trading.
The recall is the second this year involving Tesla’s automated-driving systems, which have come under escalating scrutiny after hundreds of crashes — some of which resulted in fatalities. While Chief Executive Officer Elon Musk has for years predicted the carmaker is on the verge of offering complete autonomy, both Autopilot and the beta features Tesla markets as Full Self-Driving require a fully attentive driver who keeps their hands on the wheel.
Tesla’s Autopilot Remedies
- More prominent visual alerts
- Simplified engagement and disengagement of Autosteer, the Autopilot feature
- Additional checks on using Autosteer outside controlled-access highways
- Eventual suspension of Autosteer if drivers repeatedly misuse the feature
Autopilot comes standard on every new Tesla. It uses cameras to match vehicle speed with surrounding traffic and assists drivers with steering within clearly marked lanes.
Tesla has marketed higher-level functionality it calls FSD Beta since late 2016. That suite of features was recalled in February, after NHTSA raised concerns about Teslas using the system traveling in unlawful or unpredictable ways, including exceeding speed limits and not coming to complete stops.
Late last year, Musk suggested on X, the social media platform formerly known as Twitter, that Tesla would update FSD Beta to give some drivers the option to disable alerts to put their hands on the steering wheel. NHTSA asked the company for more information days later.
NHTSA first conducted a defect investigation of Autopilot after a 2016 fatal crash, only to clear the system early the following year. Its two ongoing defect probes — initiated in August 2021 and February 2022 — were precipitated by Teslas crashing into first-responder vehicles and suddenly braking on highways.
The agency has opened more than 50 special crash investigations involving Tesla cars that are suspected to be linked to Autopilot, with the pace of probes picking up under the Biden administration.
Scrutiny of Tesla’s driving systems extends beyond NHTSA. The company disclosed in January that it had received requests for documents from the Justice Department related to Autopilot and FSD Beta. Bloomberg also reported that month that the Securities and Exchange Commission was investigating Musk’s role in shaping Tesla’s self-driving claims. – Bloomberg