Washington Post – Tesla worker killed in fiery crash may be first ‘Full Self-Driving’ fatality

Editor's note: a repeated failure in how a system is used is a failure of the system, not of its users.

Original article by Trisha Thadani, Faiz Siddiqui, Rachel Lerman, Whitney Shefte, Julia Wall and Talia Trackim from The Washington Post


EVERGREEN, Colo.

Hans von Ohain and Erik Rossiter were on their way to play golf one afternoon in 2022 when von Ohain’s Tesla suddenly swerved off Upper Bear Creek Road. The car’s driver-assistance software, Full Self-Driving, was struggling to navigate the mountain curves, forcing von Ohain repeatedly to yank it back on course.

“The first time it happened, I was like, ‘Is that normal?’” recalled Rossiter, who described the five-mile drive on the outskirts of Denver as “uncomfortable.” “And he was like, ‘Yeah, that happens every now and then.’”

Hours later, on the way home, the Tesla Model 3 barreled into a tree and exploded in flames, killing von Ohain, a Tesla employee and devoted fan of CEO Elon Musk. Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road,” according to a 911 dispatch recording obtained by The Washington Post. In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology.

Tesla owners have long complained of occasionally erratic behavior by the cars’ software, including sudden braking, missed road markings and crashes with parked emergency vehicles. Since federal regulators began requiring automakers to report crashes involving driver-assistance systems in 2021, they have logged more than 900 in Teslas. A Post analysis found at least 40 crashes that resulted in serious or fatal injuries.

Most involved Autopilot, which is designed for use on controlled-access highways. No fatal crash has been definitively linked to the more sophisticated Full Self-Driving, which is programmed to guide the car almost anywhere, from quiet suburban roads to busy city streets.

Two years ago, a Tesla shareholder tweeted that there “has not been one accident or injury” involving Full Self-Driving, to which Musk responded: “Correct.” But if that was accurate at the time, it no longer appears to be so. A Tesla driver who caused an eight-car pileup with multiple injuries on the San Francisco-Oakland Bay Bridge in 2022 told police he was using Full Self-Driving. And The Post has linked the technology to at least two serious crashes, including the one that killed von Ohain.


Von Ohain and Rossiter had been drinking, and an autopsy found that von Ohain died with a blood alcohol level of 0.26 — more than three times the legal limit — a level of intoxication that would have hampered his ability to maintain control of the car, experts said. Still, an investigation by the Colorado State Patrol went beyond drunken driving, seeking to understand what role the Tesla software may have played in the crash.

The question is critical as automakers race toward the promise of a driverless future. For private vehicles, that day is far from here. But critics say features like Full Self-Driving already are giving drivers a false sense of confidence about taking their eyes off the road — or getting behind the wheel after drinking — evincing the dangers of letting consumers test an evolving, experimental technology on the open road.

Tesla did not respond to multiple requests for comment. The company, which has released Full Self-Driving to about 400,000 customers, acknowledges that the software is in “beta” mode — meaning still in development, constantly learning and being modified. But Tesla argues that its public release is an essential step toward reducing America’s 40,000 annual road deaths. “The more automation technology offered to support the driver, the safer the driver and other road users,” Tesla tweeted in December.

At the same time, Tesla user manuals cite a long list of conditions under which Full Self-Driving may not function properly, including narrow roads with oncoming cars and curvy roads. The company has long maintained that drivers must control their cars and that Tesla is not liable for distracted or drunken driving.


Multiple lawsuits have begun challenging the view that drivers are solely responsible when Tesla’s software allegedly causes crashes or fails to prevent them. So far, Tesla has prevailed. Last fall, a California jury found Tesla not liable for a 2019 Autopilot crash in which survivors said the car suddenly veered off the road. At least nine more cases are expected to go to trial this year.

Von Ohain’s widow, Nora Bass, said she has been unable to find a lawyer willing to take his case to court because he was legally intoxicated. Nonetheless, she said, Tesla should take at least some responsibility for her husband’s death.

“Regardless of how drunk Hans was, Musk has claimed that this car can drive itself and is essentially better than a human,” Bass said. “We were sold a false sense of security.”

Von Ohain used Full Self-Driving nearly every time he got behind the wheel, Bass said, placing him among legions of Tesla boosters heeding Musk’s call to generate data and build the technology’s mastery. While Bass refused to use the feature herself — she said its unpredictability stressed her out — her husband was so confident in all it promised that he even used it with their baby in the car.

“It was jerky, but we were like, that comes with the territory of” new technology, Bass said. “We knew the technology had to learn, and we were willing to be part of that.”


Von Ohain, a Marine veteran originally from Cincinnati, joined Tesla in late 2020 as a recruiter for engineers, attracted to the company’s mission of bringing electric and autonomous vehicles to the masses, Bass said. He also was inspired by the idea of working for Musk, she said — a “brilliant man” who built a company that promised to save lives and make the roadways safer.

Von Ohain “had this opportunity to be part of a company that is working on insanely advanced technology, and we had always thought Elon Musk was interesting,” Bass said. “Hans was so interested in brilliant minds.”

At the time, Tesla had just introduced Full Self-Driving, and would eventually release it to a wider group of owners who had been monitored by the carmaker and declared safe drivers. Like many Tesla employees, von Ohain received the feature — then a $10,000 option — free with his employee discount, according to Bass and a purchase order reviewed by The Post.

Though still in its beta phase, the technology is “the difference between Tesla being worth a lot of money and being worth basically zero,” Musk has said, noting his customers’ — and investors’ — enthusiasm for a fully autonomous car. Many major automakers were developing advanced driver-assistance technology, but Tesla was more aggressive in pushing sophisticated features out to an eager public.

For years, Musk had preached the benefits of pursuing autonomous driving. In 2019, he predicted that it would one day be so reliable that drivers “could go to sleep” — though, for now, Tesla’s user agreement requires the driver to stay engaged and ready to take over from Full Self-Driving at all times.

In 2022, Tesla recalled more than 50,000 vehicles amid concerns that Full Self-Driving caused the car to roll through stop signs without coming to a full halt. Even as Musk tweeted months later that Tesla had made Full Self-Driving Beta available to anyone in North America who bought it, complaints continued to pile up: Drivers reported that cars would stop short, blow through stop signs or suddenly veer off the road when lane markings were unclear.

“We test as much as possible in simulation and with [quality assurance] drivers, but reality is vastly more complex,” Musk tweeted last spring about a new version of the software. Tesla employees would get it first, he said, with wider release to come “as confidence grows.”


On the day of the crash, von Ohain and Rossiter played 21 holes of golf, downing multiple drinks along the way. Though an autopsy would later show that von Ohain was legally drunk, Rossiter said he seemed composed and “by no means intoxicated” as they got in the Tesla and headed home.

Rossiter, who was found to have a similar blood alcohol level, can recall only shreds of the crash: A bright orange glow. Careening off the road. Jumping out of the car and trying to pull his friend out. The driver’s-side door blocked by a fallen tree.

As Rossiter yelled for help on the deserted mountain road, he remembers, his friend was screaming inside the burning car.

Colorado State Patrol Sgt. Robert Madden, who oversaw the agency’s investigation, said it was one of “the most intense” vehicle fires he had ever seen. Fueled by thousands of lithium-ion battery cells in the car’s undercarriage, according to the investigation report, the fire is what killed von Ohain: His cause of death was listed as “smoke inhalation and thermal injuries.” Madden said he probably would have survived the impact alone.

At the scene of the crash, Madden said, he found “rolling tire marks,” meaning the motor continued to feed power to the wheels after impact. There were no skid marks, Madden said, meaning von Ohain appeared not to have hit the brakes.

“Given the crash dynamics and how the vehicle drove off the road with no evidence of a sudden maneuver, that fits with the [driver-assistance] feature” being engaged, Madden said.

Colorado police were unable to access data from the car because of the intensity of the fire, according to the investigation report, and Tesla said it could not confirm that a driver-assistance system had been in use because it “did not receive data over-the-air for this incident.” Madden said the remote location may have hindered communications.

However, Tesla did report the crash to the National Highway Traffic Safety Administration. According to NHTSA, Tesla received notification of the crash through an unspecified “complaint” and alerted federal authorities that a driver-assistance feature had been in use at least 30 seconds before impact. Because of the extensive fire damage, NHTSA could not confirm whether it was Full Self-Driving or Autopilot.

In December, Tesla acknowledged problems with driver inattention, issuing a recall for nearly all of its 2 million U.S. cars to add more-frequent alerts. Bass said that von Ohain knew he needed to pay attention but that his focus naturally flagged with Full Self-Driving.

“You’re told that this car should be smarter than you, so when it’s in Full Self-Driving, you relax,” she said. “Your reaction time is going to be less than if we were not in Full Self-Driving.”

Alcohol also dramatically slows reaction time, and von Ohain’s intoxication probably factored heavily in the crash, said Ed Walters, who teaches autonomous vehicle law at Georgetown University. If the technology was acting up on the way to the golf course, as Rossiter claims, von Ohain should have known that he needed to remain fully alert on the drive home, Walters said.

“This driver, when sober, was able to pull the car back on the road and was able to correct for any problems in the Tesla safely,” Walters said. “People need to understand that whatever kind of car they’re driving, whatever kind of software, they need to be paying attention. They need to be sober and they need to be careful.”

Still, Andrew Maynard, a professor of advanced technology transitions at Arizona State University, said reports of the car’s frequent swerving raise questions about Tesla’s decision to release Full Self-Driving.

“The FSD technology isn’t quite ready for prime time yet,” Maynard said, adding that the value of testing the technology on the open road should be weighed against the risks of pushing it out too quickly to drivers who overestimate its capabilities.


Nearly two years later, mangled car parts and charred battery cells are still strewn along Upper Bear Creek Road.

Madden closed the Colorado State Patrol investigation, unable to determine whether Full Self-Driving played a role.

In a recent interview, Madden said he worries about the proliferation of sophisticated driver-assistance systems. “Autonomous vehicles are something of the future, and they are going to be here,” he said. “So the more we know, the more we understand, the safer we can continue into the future with this” technology.

Meanwhile, Tesla has yet to publicly acknowledge the death of an employee driving one of its cars.

To its workforce, the company has said little about what happened to von Ohain, making few efforts to console those who knew him, according to a former employee who spoke on the condition of anonymity for fear of retribution. Von Ohain’s replacement was hired within a few weeks, the person said.

“Once Hans passed away and time went by, there wasn’t any more discussion about him,” said the former employee, a member of von Ohain’s team who soon resigned.

To von Ohain’s widow, Tesla’s silence seemed almost cruel.

Though the company eventually helped cover the cost of her move back home to Ohio, Bass said, Tesla’s first communication with the family after the crash was a termination notice she found in her husband’s email.


