Tesla is recalling nearly all vehicles sold in the US, totalling more than 2 million, to update software and fix a defective system that is supposed to ensure that drivers are paying attention when using Autopilot.
Documents posted by US safety regulators say the update will increase warnings and alerts to drivers and even limit the areas where basic versions of Autopilot can operate.
The recall comes after a two-year investigation by the National Highway Traffic Safety Administration (NHTSA) into a series of collisions that happened while the Autopilot partially automated driving system was in use. Some were deadly.
The agency says its investigation found Autopilot’s method of making sure that drivers are paying attention can be inadequate and can lead to “foreseeable misuse of the system”.
But safety experts said while the recall is a good step, it still makes the driver responsible and does not fix the underlying problem that Tesla’s automated systems have trouble spotting and stopping for obstacles in their path.
The recall covers models Y, S, 3 and X produced between October 5, 2012, and December 7 of this year.
The update was to be sent to certain affected vehicles on Tuesday, with the rest getting it later.
Autopilot includes features called Autosteer and Traffic Aware Cruise Control, with Autosteer intended for use on limited access freeways when it is not operating with a more sophisticated feature called Autosteer on City Streets.
The software update will limit where Autosteer can be used.
“If the driver attempts to engage Autosteer when conditions are not met for engagement, the feature will alert the driver it is unavailable through visual and audible alerts, and Autosteer will not engage,” the recall documents said.
Depending on a Tesla’s hardware, the added controls include “increasing prominence” of visual alerts, simplifying how Autosteer is turned on and off, and additional checks on whether Autosteer is being used outside of controlled access roads and when approaching traffic control devices.
A driver could be suspended from using Autosteer if they repeatedly fail “to demonstrate continuous and sustained driving responsibility,” the documents say.
According to recall documents, agency investigators met with Tesla starting in October to explain “tentative conclusions” about fixing the monitoring system.
Tesla did not concur with NHTSA’s analysis but agreed to the recall on December 5 in an effort to resolve the investigation.
Auto safety advocates have been calling for stronger regulation of the driver monitoring system, which mainly detects whether a driver’s hands are on the steering wheel, for years.
They have also called for cameras to make sure a driver is paying attention, which are used by other automakers with similar systems.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University who studies autonomous vehicle safety, called the software update a compromise that does not address a lack of night vision cameras to watch drivers’ eyes, as well as Tesla’s failure to spot and stop for obstacles.
“The compromise is disappointing because it does not fix the problem that the older cars do not have adequate hardware for driver monitoring,” Prof Koopman said.
In its statement, NHTSA said the investigation remains open “as we monitor the efficacy of Tesla’s remedies and continue to work with the automaker to ensure the highest level of safety”.
Autopilot can steer, accelerate and brake automatically in its lane, but is a driver-assist system and cannot drive itself despite its name.
Independent tests have found that the monitoring system is easy to fool, so much so that drivers have been caught driving drunk or even sittinging in the back seat while the system was engaged.
In its defect report filed with the safety agency, Tesla said Autopilot’s controls “may not be sufficient to prevent driver misuse”.
In a statement posted on X, formerly Twitter, on Monday, Tesla said safety is stronger when Autopilot is engaged.
NHTSA has dispatched investigators to 35 Tesla crashes since 2016 in which the agency suspects the vehicles were running on an automated system. At least 17 people have been killed.
The investigations are part of a larger probe by NHTSA into multiple instances of Teslas using Autopilot crashing into parked emergency vehicles.
NHTSA has become more aggressive in pursuing safety problems with Teslas, including a recall of Full Self Driving software.
In May, Transportation Secretary Pete Buttigieg, whose department includes NHTSA, said Tesla should not be calling the system Autopilot because it cannot drive itself.
Tesla has yet to respond to requests for comment.