NY Times – Auto Safety Regulator Investigating Tesla Recall of Autopilot

The National Highway Traffic Safety Administration also released an analysis of crashes involving the system that showed at least 29 fatal accidents over five and a half years.

Editor's note: see also the TechCrunch article for more details

See full original article by J. Edward Moreno of NY Times


The federal government’s main auto safety agency said on Friday that it was investigating Tesla’s recall of its Autopilot driver-assistance system because regulators were concerned that the company had not done enough to ensure that drivers remained attentive while using the technology.

The National Highway Traffic Safety Administration said in documents posted on its website that it was looking into Tesla’s recall in December of two million vehicles, which covered nearly all of the cars the company had manufactured in the United States since 2012. The safety agency said it had concerns about crashes that took place after the recall and results from preliminary tests of recalled vehicles.

The agency also published an analysis that found that there had been at least 29 fatal accidents involving Autopilot and a more advanced system that Tesla calls Full Self-Driving from January 2018 to August 2023. In 13 of those fatal accidents, the fronts of Teslas hit objects or people in their path.

The investigation of Tesla’s recall and the new data about crashes add to a list of headaches for Tesla, the dominant electric-vehicle maker in the United States. The company’s sales in the first three months of the year fell more than 8 percent from a year earlier, the first such drop since the early days of the coronavirus pandemic.

Tesla announced in December that it would recall its Autopilot software after an investigation by the auto safety agency found that the carmaker had not put in place enough safeguards to make sure the system, which can accelerate, brake and control cars in other ways, was used safely by drivers, who are supposed to be ready at any moment to retake control of their cars while using Autopilot.

In its analysis of Tesla crash data, the safety agency found that when the company’s cameras, sensors and software did not spot obstacles in the car’s path and drivers did not compensate for that failure quickly enough, the consequences were often catastrophic.

In one case, a child who had just gotten off a school bus in March 2023 in North Carolina was hit by a Tesla Model Y traveling at highway speed. He suffered serious injuries. “Based on publicly available information, both the bus and the pedestrian would have been visible to an attentive driver and allowed the driver to avoid or minimize the severity of this crash,” the safety agency said.

It is not clear how often Tesla’s cars are involved in accidents while Autopilot and Full Self-Driving are in use, the agency said, because the company is not aware of every crash involving its cars. The safety agency added that Tesla was an outlier in the auto industry because it did too little to keep drivers engaged while they used a system that is not equipped to handle many driving situations.

Tesla is facing several lawsuits from individuals who claim that the system is defective, and that its design contributed to or is responsible for serious injuries and deaths.

The December recall, which entails a wireless software update, includes more prominent visual alerts and checks when drivers are using Autopilot to remind them to keep their hands on the wheel and pay attention to the road. The recall covers all five of Tesla’s passenger models — the 3, S, X, Y and Cybertruck.

Tesla did not respond to a request for comment.

The auto safety agency also said Friday that it took issue with Tesla’s decision to allow customers to opt in to the recall and to undo the changes. Tesla also appeared to have included other updates addressing issues related to the recall that the company and the safety agency had not agreed on in advance.

Tesla and its chief executive, Elon Musk, have long chafed at criticism of Autopilot and Full Self-Driving. They have argued that the systems, neither of which allows cars to drive themselves, make Tesla’s cars safer, and they have blamed drivers for any crashes or problems.

The carmaker has been under the scrutiny of safety regulators for other issues, too.

Last week, the auto safety agency said Tesla had agreed to recall nearly 4,000 Cybertruck pickups. The agency said the way soap had been used as a lubricant during the assembly of the truck could lead to the accelerator pedal’s becoming stuck. The carmaker said it was not aware of any injuries or accidents linked to the defect.

In February, Tesla recalled more than two million vehicles because the font size on a warning lights panel was too small.

The company is struggling to hold on to its dominance in the electric-vehicle market as newer and more established automakers introduce new models around the world. Tesla’s share of the U.S. electric-vehicle market fell to 51 percent in the first quarter, down from 62 percent a year earlier.

Mr. Musk told employees this month that Tesla would cut more than 10 percent of its work force. Two senior executives also announced that they were leaving the company.


