Rolling Stone – Elon Musk’s Big Lie About Tesla Is Finally Exposed
See full article in Rolling Stone by Ed Niedermeyer
More than 2 million of the cars are being recalled — because Tesla’s “self-driving” systems have always been anything but
BACK IN 2016, Elon Musk claimed that Tesla cars could “drive autonomously with greater safety than a person. Right now.” It was a lie, one that sent Tesla’s stock price soaring and made Musk one of the wealthiest people on the planet. That lie is now falling apart in the face of a new recall of 2 million Teslas. It’s also revealing to the broader public what close observers of Tesla have always known (and what the company itself admits in the fine print of its legal agreements): Tesla’s so-called “self-driving” technology works fine, as long as there’s a human behind the wheel, alert at all times.
Out of all the scandals over the last decade or so of venture capital-fueled excess, Tesla’s dangerous and hype-happy approach to driving automation technology has been one of the most important, but also one of the most hidden in plain sight. Just as with the Mechanical Turk of 1770, everyone has been so focused on the technology itself that they’ve missed the human factors that power the entire spectacle. Just as worryingly, regulators have missed that forcing humans to babysit incomplete systems introduces entirely new risks to public roads.
If you read the official notice for Tesla’s recall of more than two million vehicles equipped with Autopilot, the thing that jumps out is that it’s not really about a defect in the Autopilot technology itself. At least not in the sense that the system’s cameras are breaking, or its software is seeing red lights as green lights, or its AI is making disturbing choices in “trolley problem” exercises, or anything like that. The problem, strangely enough, has everything to do with humans.
Humans, the regulatory technobabble reveals, do the strangest things sometimes. It turns out that when a human uses a “driving assistance” system that steers, brakes, and accelerates for them, sometimes they stop paying attention to the road. This wouldn’t be a problem if Teslas could actually drive themselves safely, and if the company took legal liability for the actions its software takes when it navigates 5,000-pound vehicles on public roads. But because neither of those things is true, users must be poised to rescue Autopilot from itself at any moment, or face having it drive them into an object at high speed, perhaps a semi truck turning across their lane, as has happened on several occasions.