Tesla’s Autopilot system is far more dangerous than previously reported. In June 2019, regulators reported only 217 crashes involving the driver-assistance system, with three fatalities.
But a recent Washington Post analysis shows that since 2019, there have been 736 crashes directly linked to Autopilot, including 17 fatalities. These statistics have raised questions about how safe the system really is.
Tesla CEO Elon Musk insists the system is safe and that he had a “moral obligation” to deploy it.
“At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it even though you’re going to get sued and blamed by a lot of people,” Musk said last year. “Because the people whose lives you saved don’t know that their lives were saved. And the people who do occasionally die or get injured, they definitely know — or their state does.”
But former NHTSA senior safety adviser Missy Cummings, a professor at George Mason University’s College of Engineering and Computing, isn’t so sure.
“Tesla is having more severe — and fatal — crashes than people in a normal data set,” she said in response to the figures analyzed by the Post. She said one likely cause is the expanded rollout of Full Self-Driving over the past year and a half, which brings driver assistance to city and residential streets. “The fact that … anybody and everybody can have it. … Is it reasonable to expect that might be leading to increased accident rates? Sure, absolutely.”
Autopilot, which Tesla introduced in 2014, is a suite of features that enable the car to maneuver itself from highway on-ramp to off-ramp, maintaining speed and distance behind other vehicles and following lane lines. Tesla offers Autopilot as a standard feature; more than 800,000 of its vehicles on U.S. roads are equipped with it, though more advanced iterations come at a cost.
Full Self-Driving, an experimental feature that customers must purchase, allows Teslas to maneuver from point A to B by following turn-by-turn directions along a route, halting for stop signs and traffic lights, making turns and lane changes, and responding to hazards along the way. With either system, Tesla says drivers must monitor the road and intervene when necessary.
The number of Full Self-Driving users has skyrocketed from about 12,000 to 400,000 in a little more than a year. That alone could be a major factor behind the increase in crashes.
Philip Koopman, a Carnegie Mellon University professor who has researched autonomous vehicle safety for 25 years, thinks it is crucial that so many of the crashes involved Teslas rather than Subarus or other vehicles sold in the U.S.
“A significantly higher number certainly is a cause for concern,” he said. “We need to understand if it’s due to actually worse crashes or if there’s some other factor such as a dramatically larger number of miles being driven with Autopilot on.”
In February, Tesla issued a recall of more than 360,000 vehicles equipped with Full Self-Driving over concerns that the software prompted its vehicles to disobey traffic lights, stop signs, and speed limits.
The flouting of traffic laws, documents posted by the safety agency said, “could increase the risk of a collision if the driver does not intervene.” Tesla said it remedied the issues remotely with an over-the-air software update.
Our indefatigable Transportation Secretary Pete Buttigieg thinks Musk needs to change the name of Autopilot.
“I don’t think that something should be called, for example, an Autopilot, when the fine print says you need to have your hands on the wheel and eyes on the road at all times,” Buttigieg said in an interview with the Associated Press.
People are getting killed, Teslas are driving all over the road like drunken teenagers, and our brave secretary wants to change the name of the driver-assistance system.
That actually sounds like a job tailor-made for Buttigieg.