Tesla on Autopilot Slammed Into (Another) Truck

Autopilot is “far less competent than a human driver”

Paris Marx
Radical Urbanist
3 min read · May 22, 2019



In March, a Tesla Model 3 crashed into a transport truck on a Florida highway, killing its driver. The vehicle was going 68 mph in a 55 mph zone, and a preliminary report from the National Transportation Safety Board (NTSB) confirmed that Autopilot was engaged at the time of the crash and the driver’s hands were not on the wheel.

But this isn’t the first time a Tesla on Autopilot has slammed into a transport truck; it’s not even the first time it’s happened in Florida. The first reported Autopilot death came from a crash eerily similar to the one in March. In May 2016, a Tesla Model S drove into a transport truck on a Florida highway, killing its driver, and the NTSB cited overreliance on Autopilot as a contributing factor.

The problem is that Tesla CEO Elon Musk promotes Autopilot as an autonomous driving system, when it’s really a driver-assist technology that sits at only Level 2 on the Society of Automotive Engineers’ scale of driving automation. That means drivers are supposed to keep their hands on the wheel, but Musk frequently demonstrates the system hands-free, communicating to customers that they’ll be safe doing the same. So far, that has gotten at least four people killed.

Previously, Musk blamed these crashes on drivers, saying “the issue is more one of complacency” among experienced users who rely too heavily on the system. But not only is that exactly how Musk promotes Autopilot being used, there’s also a growing recognition that it’s difficult for drivers to remain alert when technology is doing most of the driving.

Think back to the Uber crash in March 2018: the safety driver was looking at her phone because simply sitting and watching the surroundings gets really boring after a while. A second safety driver would have at least given her someone to talk to, helping each other stay alert, but Uber had cut the second driver from all its test vehicles.

What’s even more worrying is that Musk recently announced plans to roll out even more self-driving features to Tesla vehicles when he’s already overpromising on the safety of the existing ones. Consumer Reports found that Navigate on Autopilot “could create potential safety risks for drivers” and “lagged far behind a human driver’s skill set.” Law enforcement told Consumer Reports that the feature cut off other cars and passed in ways that violated state laws. Drivers frequently have to intervene to stop the system from making poor and sometimes unsafe decisions.

Even Electrek, an electric-vehicle news site with a history of defending Tesla and Musk against legitimate criticism, admitted that a video of Navigate on Autopilot suggesting a lane change into oncoming traffic with a semi truck heading toward the vehicle was “certainly worrying and serves as a great reminder to always stay vigilant.”

Autopilot’s growing body count needs to be a wake-up call for regulators. The claims of the tech and automotive industries on the safety of these assisted- and autonomous-driving systems can’t be blindly accepted. There needs to be independent verification of their safety before they’re allowed to be used on public roads.

Musk wants to put more than a million “robotaxis” on public roads before the end of 2020. Not only would that cripple cities, but it could cost many lives if his faith in the technology is put ahead of the growing evidence that Autopilot isn’t as safe as he claims. Autonomous vehicles are much farther off than Musk wants to believe, as the rest of the industry has started to admit. Now regulators need to acknowledge it too.
