Washington Post – Tesla drivers run Autopilot where it’s not intended — with deadly consequences
Editor’s note: a repeated failure in how a system is used is a failure of the system, not of its users.
Original article by Trisha Thadani, Faiz Siddiqui, Rachel Lerman and Jeremy B. Merrill of The Washington Post
At least eight fatal or serious Tesla crashes occurred on roads where Autopilot should not have been enabled in the first place, a Post analysis finds, despite federal officials’ calls for restrictions
After a long day of fishing in Key Largo, Fla., Dillon Angulo and Naibel Benavides Leon pulled to the side of the road and hopped out of their Chevy Tahoe to look at the stars. Suddenly, Angulo said, the “whole world just fell down.”
A Tesla driving on Autopilot crashed through a T intersection at about 70 mph and flung the young couple into the air, killing Benavides Leon and gravely injuring Angulo. In police body-camera footage obtained by The Washington Post, the shaken driver says he was “driving on cruise” and took his eyes off the road when he dropped his phone.
But the 2019 crash reveals a problem deeper than driver inattention. It occurred on a rural road where Tesla’s Autopilot technology was not designed to be used. Dash-cam footage captured by the Tesla and obtained exclusively by The Post shows the car blowing through a stop sign, a blinking light and five yellow signs warning that the road ends and drivers must turn left or right.
The crash is one of at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver-assistance software could not reliably operate, according to a Post analysis of two federal databases, legal records and other public documents. The first crash occurred in 2016, when a Tesla plowed under a semi-truck on a U.S. route in Florida. The most recent was in March, when a Tesla in Autopilot failed to slow down, police said, and, traveling at 45 mph, hit a teenager stepping off a North Carolina school bus.
In user manuals, legal documents and communications with federal regulators, Tesla has acknowledged that Autosteer, Autopilot’s key feature, is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.” Tesla advises drivers that the technology can also falter on roads if there are hills or sharp curves, according to its user manual. Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software.
Nor have federal regulators taken action. After the 2016 crash, which killed Tesla driver Joshua Brown, the National Transportation Safety Board (NTSB) called for limits on where driver-assistance technology could be activated. But as a purely investigative agency, the NTSB has no regulatory power over Tesla. Its peer agency, the National Highway Traffic Safety Administration (NHTSA), which is part of the Department of Transportation, has the authority to establish enforceable auto safety standards — but its failure to act has given rise to an unusual and increasingly tense rift between the two agencies.
In an October interview, NTSB chair Jennifer Homendy said the 2016 crash should have spurred NHTSA to create enforceable rules around where Tesla’s technology could be activated. The inaction, she said, reflects “a real failure of the system.”
“If the manufacturer isn’t going to take safety seriously, it is up to the federal government to make sure that they are standing up for others to ensure safety,” Homendy said. But “safety does not seem to be the priority when it comes to Tesla.”
Of NHTSA, Homendy added, “How many more people have to die before you take action as an agency?”
In a statement to The Post, NHTSA said the agency “always welcomes the NTSB’s input and carefully reviews it — especially when considering potential regulatory actions. As a public health, regulatory and safety agency, safety is our top priority.”
NHTSA said it would be too complex and resource-intensive to verify that systems such as Tesla Autopilot are used within the conditions for which they are designed, and that doing so might not fix the problem.
Homendy was skeptical of that explanation, saying agencies and industries frequently respond to NTSB recommendations by citing the impossibility of their requests — until additional carnage forces their hand.
NHTSA said it is focused instead on ensuring drivers are fully engaged while using advanced driver-assistance systems.
Tesla did not respond to a request for comment. In court cases and public statements, the company has repeatedly argued that it is not liable for crashes involving Autopilot, because the driver is ultimately responsible for the trajectory of the car. After a fatal crash in 2018, Tesla told the NTSB that design limits for Autopilot would not be appropriate because “the driver determines the acceptable operating environment.”
The string of Autopilot crashes reveals the consequences of allowing a rapidly evolving technology to operate on the nation’s roadways without significant government oversight, experts say. While NHTSA has several ongoing investigations into the company and specific crashes, critics argue the agency’s approach is too reactive and has allowed a flawed technology to put Tesla drivers — and those around them — at risk.
The approach contrasts with federal regulation of planes and railroads, where crashes involving new technology or equipment — such as recurring issues with Boeing’s 737 Max — have resulted in sweeping action by agencies or Congress to ground planes or mandate new safety systems. Unlike planes, which are certified for airworthiness through a process called “type certification,” passenger car models are not prescreened; instead, they are subject to a set of regulations called Federal Motor Vehicle Safety Standards, which manufacturers are responsible for meeting.
NHTSA’s approach also contrasts with how some states, localities and even companies have responded to incidents involving autonomous vehicles. After a fatal crash in 2018, Uber halted its driverless program amid official scrutiny by the NTSB before offloading the program entirely. And in October, the California Department of Motor Vehicles suspended permits for driverless car company Cruise after one of its cars dragged a jaywalking pedestrian about 20 feet after she had been hit by a regular car. Cruise also grounded its entire U.S. fleet and issued a voluntary recall.
Steven Cliff, a former NHTSA chief who left the agency last year, acknowledged that the approach taken by regulators can appear too cautious at times, but said the agency was aggressive under his watch — mandating that companies such as Tesla report their data on crashes involving advanced driver-assistance systems.
But advancing from the data collection stage to a final rule, where new regulations are adopted if necessary, can take years, he said.
“Tesla’s philosophy is, let the operator determine for themselves what is safe but provide that operator a lot of flexibility to make that determination,” he said.
He said Tesla could easily limit where the technology can be deployed. “The Tesla knows where it is. It has navigation. It knows if it’s on an interstate or an area where the technology wasn’t designed to be used,” he said. “If it wasn’t designed to be used there, then why can you use it there?”
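To make Cliff’s point concrete, here is a minimal, purely illustrative sketch in Python (not Tesla’s code, and not any real automotive API) of how a vehicle’s navigation data could gate a driver-assistance feature to the conditions Tesla itself describes: a controlled-access highway with a center divider, clear lane markings and no cross traffic. The RoadSegment type and autosteer_allowed function are hypothetical names introduced here for illustration only.

```python
from dataclasses import dataclass

# Hypothetical road attributes, assumed to come from the car's navigation map.
@dataclass
class RoadSegment:
    road_class: str              # e.g. "controlled_access" or "surface_street"
    has_center_divider: bool
    has_clear_lane_markings: bool
    has_cross_traffic: bool

def autosteer_allowed(segment: RoadSegment) -> bool:
    """Allow engagement only inside the stated design domain:
    a controlled-access highway with a divider, clear lane markings
    and no cross traffic."""
    return (
        segment.road_class == "controlled_access"
        and segment.has_center_divider
        and segment.has_clear_lane_markings
        and not segment.has_cross_traffic
    )

# A rural road ending in a T intersection, like the Key Largo crash site,
# would fail this check and the feature would refuse to engage.
rural_road = RoadSegment(
    road_class="surface_street",
    has_center_divider=False,
    has_clear_lane_markings=False,
    has_cross_traffic=True,
)
assert not autosteer_allowed(rural_road)
```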
‘Operational Design Domain’
In all, The Post has identified about 40 fatal or serious crashes since 2016 involving Tesla’s driver assistance software; the bulk of them were identified through NHTSA’s data, and the rest surfaced through lawsuits. Many occurred on controlled-access highways. But at least eight occurred on roads where Autopilot was not designed to be used.
In another six crashes, it was unclear whether the Tesla driver had engaged Autopilot or the more sophisticated Full Self-Driving mode. Full Self-Driving is designed to recognize stop signs, traffic lights, cross traffic and other hazards on surface streets.
After the 2019 Key Largo crash, Angulo filed a lawsuit arguing that Tesla’s marketing lulls drivers into complacency, setting the stage for the crash that killed his girlfriend. The suit also accuses Tesla of negligence, including for allowing the feature to be activated outside its so-called “Operational Design Domain,” an industry term for the specific locations and circumstances for which Autopilot is designed. It is one of about 10 such cases expected to go to trial over the next year.
As a result of the crash, Angulo suffered a traumatic brain injury, a broken jaw and a broken pelvic bone, among other injuries. Even today, he said, the sight of a Tesla makes him wince.
“How could they allow something like this on the road?” Angulo said in an interview. “It’s like a deadly weapon just driving around everywhere. And people, pedestrians like me, how are we supposed to know stuff like this? It shouldn’t be allowed.”
In a sworn deposition last year first detailed by Reuters and obtained by The Post, Tesla’s head of Autopilot, Ashok Elluswamy, said he was unaware of any document describing limitations on where and under what conditions the feature could operate. He said he was aware of some activation conditions for Autopilot, including the presence of lane lines, and that it is safe for “anyone who is using the system appropriately.”
But he said he did not know what the term “Operational Design Domain” means. “I’ve heard those words before, but I do not recall much more than that,” he said.
Tesla’s commitment to driver independence and responsibility differs from the approach of some competitors, whose driver-assistance technologies rely on high-definition maps with rigorous levels of detail that can tip vehicles off to potential roadway hazards and obstructions. Some manufacturers, including Ford and General Motors, also allow the technology to work only on compatible roadways that have been meticulously mapped.
Tesla relies on more rudimentary maps, The Post has previously reported. And its software has been found to contain exceptions to Autopilot rules in some locations, according to a former Tesla employee, speaking on the condition of anonymity for fear of retribution. For example, the software could be activated on the Embarcadero in San Francisco, a busy and fast-moving street along the water where pedestrians and bicyclists flock, the person said.
In 2022, a company supervisor was made aware that Autopilot could be activated in locations that were not controlled-access highways, according to the former employee. The software was interpreting Kato Road in Fremont, Calif. — a public thoroughfare adjacent to Tesla’s factory — as a private road so employees could test Tesla’s “Summon” feature, which allows owners to hail their vehicles from a parking spot with no one in the driver’s seat.
Summon is supposed to be limited to private property, and the employee said he raised the matter with a supervisor. Tesla fixed the loophole soon after, the person said.
‘Sensible safeguards’
Over the years, the NTSB has repeatedly called on NHTSA to rein in Autopilot. It also has urged the company to act, but Homendy said Tesla has been uniquely difficult to deal with when it comes to safety recommendations. Tesla CEO Elon Musk once hung up on former NTSB chair Robert Sumwalt, said the former chief, who retired from the agency in 2021 when Homendy took over.
Sumwalt led the NTSB when the agency released its investigation of the first fatal crash involving Autopilot, which resulted in a 2017 report calling on automakers including Tesla to equip their driver-assistance systems with “safeguards” that limit use of the technology to “conditions for which they were designed.” It also called on NHTSA to create binding rules that limit the technology’s use.
In 2020, the NTSB issued a report on another fatal Tesla crash the prior year that cited both the semi-truck driver, who ran a stop sign, and the Tesla driver’s “over reliance” on Autopilot as probable causes of the crash. The NTSB also cited NHTSA for the first time, saying its failure to “develop a method” that would “limit the use of automated vehicle control systems to the conditions for which they were designed” contributed to the crash.
In 2021, the NTSB sent another letter to NHTSA about Autopilot, calling on the agency to “include sensible safeguards, protocols, and minimum performance standards to ensure the safety of motorists and other vulnerable road users.”
Last year, several senators joined the chorus, saying they were “deeply troubled” by the number of Autopilot crashes. They called on NHTSA to “use all its investigative and regulatory authorities to shed needed light on this out-of-control industry and impose guardrails to prevent more deadly crashes.”
NHTSA has implemented some NTSB recommendations, including increased data reporting for crashes involving autonomous and driver-assistance software. Meanwhile, NHTSA has an “active investigation” into Tesla’s Autopilot, and it issued a recall on certain Tesla models equipped with Full Self-Driving after determining that it “led to an unreasonable risk to motor vehicle safety based on insufficient adherence to traffic safety laws.”
But NHTSA has not adopted any rules that would limit the technology to where it is meant to be used, despite exploring how it could ensure such software adheres to its design limits.
In one of her latest attempts to spur action, Homendy sent a letter directly to Musk in August 2021, urging him to implement safeguards to “limit” the technology to conditions it was designed for, among other recommendations.
“If you are serious about putting safety front and center in Tesla vehicle design,” she wrote, “I invite you to complete action on the safety recommendations we issued to you four years ago.”
Musk, she said, never responded.