When Self-Driving Cars Cause Accidents — Who Is At Fault?

Vincent T.
Published in Self-Driving Cars
Mar 21, 2018 · 4 min read

There have been several recent accidents involving self-driving cars that are getting attention. A GM Chevy Bolt was involved in an accident with a motorist in San Francisco. Another involved a Tesla Model S in Los Angeles, which a driver claimed was driving autonomously when it crashed into a fire truck parked on the freeway. Uber’s self-driving car was involved in an accident in Pittsburgh when it bumped a left-turning car. For the most part, these accidents did not lead to fatalities until the most recent one involving Uber in the city of Tempe, Arizona on March 19, 2018. As a result, Uber has put a halt on its self-driving car tests. According to the video recorded from the car, the victim appears to have suddenly stepped in front of the moving car, which was traveling at a speed of 40 mph. It also appears that the XC90 made no attempt to slow down prior to the accident. This makes it much harder to place the blame on anyone, and the case will of course be closely watched by the rest of the self-driving car industry as it develops.

An Uber self-driving car using the Volvo XC90 SUV

This is why investigations are important, along with the data collected by Uber’s self-driving car. It now appears that the accident was unavoidable and was the pedestrian’s fault, though it could also be the self-driving car’s fault for not slowing down. The complication is that even though there was a no-crossing sign where the accident occurred, an accessible walkway, intended as a bike path along the road, gave the victim a way across. The pedestrian may not have been aware that the car, a Volvo XC90 SUV, was approaching. The car’s sensors probably had no idea either that the pedestrian was about to cross (sensors don’t read people’s minds). Software can try to reduce human unpredictability to binary logic, but the results are not always accurate (recall the security robot that rolled into a fountain).

The logic is pretty clear:
IF pedestrian crossing THEN STOP
ELSE GO
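
Rendered as code, that rule is trivial. Here is a minimal Python sketch of it, purely for illustration; it has no relation to Uber’s actual software:

def decide(pedestrian_crossing):
    # Naive binary rule: stop if a pedestrian is crossing, otherwise keep going.
    return "STOP" if pedestrian_crossing else "GO"

The real decision is nowhere near this clean, because the software also has to judge whether the vehicle can physically stop in the distance available.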

Did a subsystem of Uber’s sensors fail to recognize the pedestrian? Perhaps the vehicle’s software did detect the pedestrian, but by the laws of physics it simply did not have enough time to react to someone suddenly crossing its path. Stopping from 40 mph to 0 mph requires slamming the brakes, and as we know, an object in motion remains in motion until an opposing force, in this case braking friction, slows it down. The car would not have been able to stop fully in time, so it would still have hit the pedestrian. Critics will point out that the software in self-driving cars has not yet reached the same level of awareness that human drivers have, at least in Uber’s case (this accident involved only Uber, so it would not be fair to say the same of every self-driving car company).
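
To make that physics argument concrete, a textbook stopping-distance estimate (reaction distance plus braking distance) shows how little margin a car traveling at 40 mph actually has. The reaction time and friction values below are assumptions for illustration, not figures from the investigation:

MPH_TO_MPS = 0.44704
G = 9.81  # gravitational acceleration, m/s^2

def stopping_distance_m(speed_mph, reaction_time_s=1.0, friction=0.7):
    v = speed_mph * MPH_TO_MPS             # speed in m/s
    reaction = v * reaction_time_s         # distance traveled before braking starts
    braking = v ** 2 / (2 * friction * G)  # distance traveled while braking
    return reaction + braking

print(stopping_distance_m(40))  # roughly 41 meters under these assumptions

If the pedestrian entered the lane well inside that distance, even a perfect detection could not have prevented the impact; at best it would have reduced the speed of the collision.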

Uber, like many companies testing self-driving or driverless cars, should have detailed data of the accident as it occurred. During the testing phase, a camera should be recording the road test to capture the events leading up to any accident and help determine who was at fault. Video footage could confirm or dispel accusations. The DMV’s regulations for road testing driverless cars require them to have an “autonomous technology data recorder.” At a minimum, that means the tests must capture data on speed, steering, braking, and the objects detected by sensors or video cameras.
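
As a concrete illustration of that minimum data set, one could picture each moment of the drive being logged as a record like the one below. The field names are invented for illustration; they are not the DMV’s schema or Uber’s actual format:

from dataclasses import dataclass, field
from typing import List

@dataclass
class RecorderFrame:
    # One snapshot from a hypothetical "autonomous technology data recorder":
    # what the vehicle was doing and what its sensors reported at that instant.
    timestamp_s: float
    speed_mph: float
    steering_angle_deg: float
    brake_applied: bool
    detected_objects: List[str] = field(default_factory=list)

# Example frame: cruising at 40 mph, no braking, one object detected ahead.
frame = RecorderFrame(timestamp_s=0.0, speed_mph=40.0, steering_angle_deg=0.0,
                      brake_applied=False, detected_objects=["object ahead"])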

To navigate and detect objects in their surroundings, driverless car companies like Waymo, and even Tesla, use a wide array of sensors. Uber’s setup is a mix of LiDAR, radar, and cameras, with multiple fallbacks installed so that if one component of the system fails, another can take over (a simplified sketch of that fallback pattern appears below). If Uber is found liable for this fatality, it will shape new regulations on self-driving cars, leading to stricter testing and further refinement to guarantee that all safety compliance is followed. At this point, fully autonomous cars will not be allowed on the road without restriction; they are not quite ready for it yet. Perhaps Uber, or any other self-driving car company, should accept responsibility for the accidents their systems cause and aim to improve them.
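
To illustrate what such a fallback might look like at its simplest, here is a sketch that prefers one sensor’s reading and falls back to the others if it drops out. This shows only the general redundancy pattern, not Uber’s actual sensor-fusion design:

def nearest_obstacle_m(lidar_m=None, radar_m=None, camera_m=None):
    # Prefer LiDAR; if it reports nothing, fall back to radar, then to the camera.
    # Returns None only if every sensor fails to provide a distance.
    for reading in (lidar_m, radar_m, camera_m):
        if reading is not None:
            return reading
    return None

# Example: LiDAR has dropped out, but radar still sees an obstacle 35 meters ahead.
print(nearest_obstacle_m(lidar_m=None, radar_m=35.0, camera_m=38.0))  # 35.0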

This now raises the question of how to anticipate every aspect of public safety. It will be hard to make any system perfect, but let’s admit that AI and computers are less likely to make mistakes than humans are.

NOTE: This accident occurred on March 19, 2018. More recent developments have since provided further details.
