Tesla recalls 2 million vehicles to limit use of Autopilot feature
The recall was disclosed in a letter to Tesla posted by the National Highway Traffic Safety Administration, CNN reports. The agency said Tesla had agreed to an over-the-air software update, starting on December 12, that will limit the use of the Autosteer feature if a driver repeatedly fails to demonstrate that he or she is ready to resume control of the car while the feature is on.
Tesla has been pushing its driver-assist features, including Autopilot and what it calls “Full Self Driving,” which the company has insisted make its cars safer than those operated exclusively by humans. But NHTSA has been studying reports of accidents involving Autopilot and its Autosteer function for more than two years.
The recall comes two days after the Washington Post published a detailed investigation that found at least eight serious crashes, some of them fatal, in which Autopilot should not have been engaged in the first place.
Tesla’s owner’s manuals say: “Autosteer is intended for use only on highways and limited-access roads with a fully attentive driver.” But the company has pushed the idea that its driver-assist features allow the cars to safely make most driving decisions even away from those roads.
An NHTSA investigation, however, has found numerous accidents over the past several years suggesting that these features do not live up to their names, Autopilot and Full Self Driving.
In its letter to Tesla, the safety regulator said that “in certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse [of the feature].” It said that when drivers are not fully engaged and ready to take control of the car, “there may be an increased risk of a crash.”
In addition to the software updates, Tesla will mail letters to car owners notifying them of the change.
A history of Autopilot issues
This is not the first time that NHTSA has pushed Tesla to make changes to its Autopilot or Full Self Driving features after finding the features posed safety problems.
In February, Tesla recalled all 363,000 US vehicles then on the road with its FSD feature after finding that cars operating with the feature could violate traffic laws, including “traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution.”
And NHTSA and the National Transportation Safety Board have been investigating crashes involving Tesla vehicles using the various driver-assist features, including a series of crashes into emergency vehicles at the scene of other accidents.
Tesla is not the only automaker offering driver-assist features marketed as “self-driving,” and it is not the only one to run into safety problems. General Motors’ Cruise unit recently halted its driverless taxi service nationwide after California regulators suspended its permits to operate in the state following an accident.
But with the names Autopilot and Full Self Driving, Tesla has placed a greater emphasis on self-driving than its competitors. It charges buyers $6,000 for cars with what it calls “enhanced Autopilot” and $12,000 for the FSD feature.
Many owners who paid extra for those features have told CNN they do not think the features are worth the money. And while the features have found support among other owners, reports from police and safety regulators of serious accidents and deaths could hurt Tesla’s efforts to market the cars and their expensive features.