How Do We Deal With Accidents Involving Autonomous Vehicles?

Jodie Cai
Published in The Black Box
Nov 25, 2020

It is no secret that autonomous vehicles are the future of transportation. Companies like Tesla have already begun advancing toward a world of full automation. However, with this daunting new technology comes a heavily debated question: who do we blame when the robot causes an accident?

Photo by Tesla

To answer this question, we first need to understand the levels of automation. According to SAE J3016, there are “six levels of driving automation, from SAE Level Zero (no automation) to SAE Level 5 (full vehicle autonomy).” As of now, companies are selling vehicles at Levels 1 and 2, where a human driver is still present and in control of the vehicle. Vehicles at Levels 3 and 4 are expected on the market within a few years. Level 5, however, a completely automated system that both conducts the driving and monitors the driving environment, is still a ways away.
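
To make the taxonomy concrete, here is a minimal Python sketch of the six levels as summarized above. The level names follow SAE J3016, but the human_must_supervise flag and the on_the_market_today helper are illustrative simplifications of this article's summary, not part of the standard itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SAELevel:
    level: int
    name: str
    human_must_supervise: bool  # must a human driver still monitor the road?

# Simplified reading of the SAE J3016 taxonomy: at Levels 0-2 the human
# monitors the driving environment; at Levels 3-5 the system does
# (at Level 3 the human must still be ready to take over on request).
SAE_LEVELS = [
    SAELevel(0, "No Driving Automation", True),
    SAELevel(1, "Driver Assistance", True),
    SAELevel(2, "Partial Driving Automation", True),
    SAELevel(3, "Conditional Driving Automation", False),
    SAELevel(4, "High Driving Automation", False),
    SAELevel(5, "Full Driving Automation", False),
]

def on_the_market_today(level: SAELevel) -> bool:
    """Per the article, only Levels 1 and 2 are currently sold to consumers."""
    return level.level in (1, 2)

for lvl in SAE_LEVELS:
    status = "on sale now" if on_the_market_today(lvl) else "not yet on the market"
    print(f"Level {lvl.level} ({lvl.name}): "
          f"human supervises={lvl.human_must_supervise}, {status}")
```

Where a vehicle falls on this scale matters for the blame question below: the lower the level, the more the human driver is still expected to be watching the road.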

Photo by Nicole Schaub, Arizona Republic

In 2018, Uber made headlines when one of its autonomous test vehicles was involved in a fatal accident. Around 10:00 PM on March 18th, Elaine Herzberg was struck and killed by a self-driving Uber in Tempe, Arizona. The debate that followed changed the world of autonomous vehicles. While it is easy to jump to the conclusion that the robot should have seen Herzberg, we must take other factors into account. For one, it was 10:00 PM, meaning it was fairly dark out. In addition, Herzberg was walking a bicycle across the road, which likely confused the automation. Multiple reports also suggest that she did not cross at a proper crosswalk but rather seemed to “melt out of the shadows.” With all of this considered, who is to blame for the accident?

Research has shown that at the current stage of self-driving technology, the driver is still generally held responsible for any accident that occurs while they are behind the wheel. In Herzberg's case, the safety driver was ultimately listed as the primary cause of the accident because she was watching “The Voice” on her phone instead of the road. The Uber was a test vehicle meant to be monitored by its operator, so much of the blame landed on her.

The 2018 accident raised many tough questions about the rules and regulations for testing autonomous vehicles. When automation fails us, we are understandably caught by surprise. After all, these are machines we built in the hope of safer roads, and we are confounded when they go out and do the exact opposite.

Sources:

Jurdak, Raja, and Salil S. Kanhere. “Who’s to Blame When Driverless Cars Have an Accident?” The Conversation, 11 Aug. 2019, theconversation.com/whos-to-blame-when-driverless-cars-have-an-accident-93132.

Shuttleworth, Jennifer. “SAE J3016 Automated-Driving Graphic.” SAE International, 15 May 2020, www.sae.org/news/2019/01/sae-updates-j3016-automated-driving-graphic.

Knowledge@Wharton. “Autonomous Car Crashes: Who — or What — Is to Blame?” Knowledge@Wharton, 6 Apr. 2018, knowledge.wharton.upenn.edu/article/automated-car-accidents/.

Randazzo, Ryan. “Driver Mostly to Blame for Fatal Self-Driving Uber Crash, Federal Safety Board Finds.” The Arizona Republic, 20 Nov. 2019, www.azcentral.com/story/money/business/tech/2019/11/19/driver-fatal-arizona-uber-crash-mostly-blame-ntsb-report-finds/4232936002/.
