Yesterday was a big day for self-driving cars, and not in a good way.
A Tesla Model S running the Autopilot feature suffered a horrendous crash with a tractor-trailer, which resulted in the death of the Tesla’s driver. Thankfully there were no other passengers in the Tesla, and the driver of the truck was not injured.
According to the AP, the Tesla driver was Joshua Brown. Although the accident occurred in Williston, a small town in rural north Florida, Brown lived in Canton, Ohio, where he owned a small telecommunications company called Nexu Innovations.
Interestingly, Brown was an 11-year veteran of the Navy SEALs, although he left the service in 2008, and his military background doesn't appear to have any bearing on the accident.
Tesla’s official statement on the accident calls Brown, “a friend to Tesla and the broader EV community.”
I have no connection to Joshua Brown or his family, but since I’m going to write extensively about this accident, I’d just like to take a moment to emphasize that this was a real person, who served his country, and the loss is a tragedy, beyond whatever the result is for Tesla and autonomous vehicles.
The accident itself sounds gruesome and like something out of Hollywood. Apparently the Tesla was traveling fast on a divided, but not controlled-access, highway. A tractor-trailer made a left turn from the opposite side of the highway, crossing the lanes in which the Tesla was traveling.
At the time of impact, the trailer was basically perpendicular to the highway, and the Tesla crashed into it broadside.
What makes the accident especially gruesome is that the trailer was riding high off the ground. The bottom of the Tesla actually passed under the trailer and continued on for several hundred yards. The top half of the Tesla, starting at the windshield, was sheared off. While the official crash report isn’t out yet, it sounds like Brown might have been decapitated.
Tesla acknowledged quickly that the autopilot had been engaged at the time of the accident. According to Tesla’s statement:
Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.
There have been some reports that a Harry Potter movie was playing in the vehicle even after the crash, although those reports are as yet disputed. If true, this would suggest Brown was not following Tesla’s instructions to pay full attention to the road, even when Autopilot is engaged.
Tesla was quick to point out that:
This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.
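Tesla's figures are quoted as miles per fatality, which makes them a little awkward to compare at a glance. A quick back-of-the-envelope conversion (a sketch using only the numbers from Tesla's statement) puts them on a common footing of fatalities per 100 million miles:

```python
# Fatality rates from Tesla's statement, expressed as miles
# driven per fatality.
miles_per_fatality = {
    "Autopilot (1 known fatality)": 130e6,
    "All vehicles, US": 94e6,
    "All vehicles, worldwide": 60e6,
}

# Normalize to fatalities per 100 million miles.
for label, miles in miles_per_fatality.items():
    rate = 100e6 / miles
    print(f"{label}: {rate:.2f} fatalities per 100M miles")

# Autopilot (1 known fatality): 0.77 fatalities per 100M miles
# All vehicles, US: 1.06 fatalities per 100M miles
# All vehicles, worldwide: 1.67 fatalities per 100M miles
```

The normalization makes Tesla's implied claim easier to see, though with only a single observed fatality, the Autopilot figure carries enormous statistical uncertainty and can't really be compared with population-wide averages.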
Tesla reported that the National Highway Traffic Safety Administration (NHTSA) has opened a preliminary investigation into the performance of the Autopilot feature.
NHTSA itself hasn’t posted any information on this accident, so it’s hard to know what exactly is going on with this, or what the consequences might be.
A Tesla Autopilot fatality has been the inevitable nightmare that self-driving car proponents have been worried about for months. See my earlier post on Tesla’s Risk Equation.
Given the inevitability of a first fatality, though, this is just about the least-bad scenario possible.
There was only one fatality in the accident, and importantly, that fatality was the Tesla driver.
Imagine instead the accident had caused the deaths of a hypothetical family of five, riding in a minivan on the opposite side of the highway. In that case, there would be a lot of questions about whether Tesla Autopilot was endangering everyone else on the road.
Omission versus Commission
This was an accident of omission, rather than of commission. That’s surely little comfort to the family of the accident victim, but I suspect it’s much less worrisome to the public.
If, instead, the accident had resulted from the Autopilot driving the car into a barrier, the public perception of Autopilot might have taken a much bigger hit.
The circumstances of the accident were unusual, though not unique. Nonetheless, an Autopilot accident in moderate traffic on a six-lane Interstate highway would resonate more with the public, and would be more worrisome as a predictor of future accidents.
It’s impossible to know from the outside exactly why the Autopilot did not recognize the truck. Two good guesses, however, would be either the software or the sensors.
Tesla wrote that, “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”
That explanation implicates the computer vision software, which might have had sufficient sensor data to recognize the truck, but failed to process that data correctly.
Another plausible explanation, however, is that the sensor data might have been insufficient. The scenario of a high-clearance truck turning unexpectedly across a highway is unusual enough that maybe the Tesla’s radar or cameras weren’t looking for it.
Level 3 vs. Level 4
My first reaction was that this might be an example of the distinction between Level 3 and Level 4 autonomous systems. Autopilot is arguably a Level 3 system, or at least close, which means that the driver can cede control to the computer, but must be ready to take control back at any time.
Some companies, such as Google and Ford, are eschewing Level 3 systems, arguing that it’s unrealistic to expect the driver to be able to take control quickly. These companies are jumping straight to Level 4 systems, where the driver never has to take control.
However, the more I read, the less this accident appears to shed light on that issue. This appears to be a case of a straight miss on Autopilot, as opposed to a case when Autopilot unsuccessfully threw control back to the driver at the last minute.
The accident has made headlines, but it doesn’t seem like there has yet been a lot of blowback on Tesla or on self-driving cars generally. Perhaps that’s due to the mitigating factors.
Tesla’s stock is actually up for the week, indicating that Wall Street doesn’t see this as a crippling blow.
It remains to be seen whether the accident results in a costly lawsuit or settlement for Tesla, or for Mobileye, which supplied the camera-based vision system behind Autopilot. Perhaps not, if the driver was a big Tesla fan, although that will now be up to his family, who may feel less generously inclined.
In the absence of a public outcry, the biggest issue might be the results of the NHTSA report. If NHTSA merely makes some recommendations about improving Autopilot in certain scenarios and ensuring that drivers pay attention to the road, that will be a win.
If the NHTSA investigation causes Tesla and other companies to significantly scale back their autonomous vehicle efforts, that would be a game-changer.