Report Suggests Uber Strongly to Blame for Pedestrian’s Death

Wisner Baum
Mar 14, 2019


With Waymo at least a year or two ahead of its competition with its self-driving taxi service, some competitors are scrambling to catch up with the autonomous car company. In the rush to get self-driving vehicles on the road — and to be one of the leaders in the technology, rather than a follower — are some companies cutting corners or making decisions that jeopardize people’s lives? It happens time and again in other industries, and a recent report suggests that such negligence may be happening in the autonomous vehicle industry as well.

An investigation by Business Insider suggests Uber officials may have made decisions that tragically resulted in the death of a pedestrian. While Uber leadership initially tried to blame everyone else — including the victim and the backup driver — some say Uber’s own policies created the situation that led to the tragedy. In their quest to catch up with competitors, or at least not to fall further behind, Uber personnel made decisions that likely had fatal consequences.

Employees recently spoke out about how the desire to create a smooth ride, keep Uber’s autonomous vehicle division open, and impress the company’s new CEO may have trumped safety considerations. They further say that dysfunction within the company and a lack of communication between teams in the division led to vital warning signs being missed.

Elaine Herzberg Becomes First Autonomous Vehicle Pedestrian Fatality

It’s the kind of fame a person doesn’t want. On March 18, 2018, Elaine Herzberg became the first pedestrian killed in an incident involving an autonomous vehicle in the U.S. As she walked her bicycle in Tempe, Arizona at around 10:00 pm, in near darkness on a moonless night, Herzberg stepped out onto Mill Avenue and moments later was hit by an Uber self-driving Volvo.

According to the National Transportation Safety Board’s (NTSB) preliminary report, the Uber Technologies Inc. Volvo XC90 SUV was in computer control mode when it hit and killed Herzberg. The vehicle’s lone occupant was 44-year-old safety driver Rafaela Vasquez. The Volvo had forward- and side-facing cameras, radar, lidar, navigation sensors, and an aftermarket camera system. The safety driver had put the SUV into computer control 19 minutes before the crash.

Uber’s technology wasn’t the only safety technology in the vehicle, however. The Volvo also had advanced driver assistance functions, including automatic emergency braking, driver alertness warnings, and road sign detection. But when the vehicle is in computer control mode, those Volvo assistance features are disabled; they are enabled only when the vehicle is in manual control mode.

If the Uber systems do not perform correctly during testing, the human driver is to step in and take over driving control. That driver is also responsible for monitoring any diagnostic messages the computer sends and ensuring necessary messages are flagged for later review. In other words, the backup driver must monitor the surroundings, monitor the center console for messages, and be prepared to take over driving at a split second’s notice.

Uber System Failed to Recognize Pedestrian Initially

The NTSB notes that the Uber self-driving system did not initially register the pedestrian as a human, instead classifying it as an unknown object, then classifying it as a vehicle. Finally, it classified the object as a bicycle with varying possibilities for future travel path, meaning the Uber system wasn’t certain what the bicycle was about to do. Only 1.3 seconds before impact did the system determine that emergency braking was needed.

The problem is that the Volvo’s emergency braking system is disabled when the SUV is in computer control mode, and Uber had also limited its own autonomous system’s ability to execute an emergency braking maneuver. As a result, the vehicle operator had to take control and stop the SUV. The vehicle, however, does not alert the operator that intervention is needed. The vehicle’s only occupant, who is tasked with monitoring the vehicle and its console for data, must be alert enough to the outside world to realize that intervention is needed and prevent a collision.
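The chain of failures the NTSB describes — unstable object classification, a disabled emergency braking maneuver, and no operator alert — can be sketched in a few lines of code. This is purely an illustrative reconstruction of the reported decision logic, not Uber’s actual software; all names, thresholds, and flags below are assumptions for illustration.

```python
# Illustrative sketch (NOT Uber's actual code): how an unstable perception
# classification, combined with software gates on emergency braking and
# operator alerts, can leave no automated response before impact.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str                # "unknown", "vehicle", "bicycle", ...
    seconds_to_impact: float  # time remaining before collision

# Per the reporting, the emergency maneuver was disabled and no
# operator alert existed; both flags here are assumptions for illustration.
EMERGENCY_BRAKING_ENABLED = False
OPERATOR_ALERT_ENABLED = False

def decide(detection: Detection) -> str:
    """Return the action taken for a detection at a given time-to-impact."""
    # NTSB: only at 1.3 s before impact did the system decide braking was needed.
    needs_emergency_brake = detection.seconds_to_impact <= 1.3
    if not needs_emergency_brake:
        return "continue"          # path predictions still varying; keep driving
    if EMERGENCY_BRAKING_ENABLED:
        return "emergency_brake"   # maneuver was disabled, so never reached
    if OPERATOR_ALERT_ENABLED:
        return "alert_operator"    # no alert was implemented, so never reached
    return "rely_on_operator"      # silent: the operator must notice unaided

# Rough timeline from the NTSB report: reclassified repeatedly, then too late.
timeline = [
    Detection("unknown", 6.0),
    Detection("vehicle", 4.0),
    Detection("bicycle", 1.3),
]
actions = [decide(d) for d in timeline]
print(actions)  # ['continue', 'continue', 'rely_on_operator']
```

The point of the sketch is the final branch: with both gates closed, the system’s only remaining “action” is to depend silently on a human who has not been told anything is wrong.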

In this case, with less than a second to impact, the driver took control by engaging the steering wheel, but the SUV still hit Herzberg at 39 miles per hour. One second after impact, the driver hit the brakes.

Following the crash, the vehicle’s operator said she was monitoring the self-driving system interface, though later reports indicate she was watching The Voice on her cellphone. Herzberg was in dark clothing, did not look both ways, and her bicycle had no side reflectors, so it’s not clear whether Vasquez would have seen her even if she had been paying attention.

Why Was the Volvo’s Emergency Braking Turned Off?

Industry insiders say it is standard to turn off a vehicle’s own emergency braking systems to prevent clashes with self-driving technology. This prevents situations in which the self-driving system picks up a shadow from a tree and the vehicle’s driver assistance system responds by slamming on the brakes. Turning off that feature, however, creates scenarios like the deadly collision with Herzberg, in which the self-driving software isn’t prepared to respond quickly and the humans involved rely too much on the computers, losing focus on their surroundings.

Uber’s system was programmed to recognize cyclists but may not have done so in Herzberg’s case because of how she moved, because she was in a poorly lit area, and because her bike carried many bags, effectively disguising it.

It makes sense that Uber would require safety drivers to step in where necessary, especially as the self-driving technology is so new and still being tested. Given, however, that Uber also required those drivers to monitor the console for important messages, and given that humans tend to stop paying attention when they think technology has things under control, it doesn’t make sense that Uber would have only one person in the car during testing.

Uber Blamed Others for Collision

Herzberg was jaywalking and was identified in reports as a homeless person. Her blood tested positive for drugs, which some people took as an opportunity to blame her for the collision. Vasquez, meanwhile, was looking at her cellphone immediately before the collision, which led some people to blame her for the death.

According to Business Insider, which spoke with Uber employees, plenty of blame lies with Uber itself. The people who configured the car not only disabled Volvo’s emergency braking system, they also prevented Uber’s own software from slamming on the brakes in response to an emergency and restricted the SUV’s ability to swerve. Had the vehicle been able to perform either of those maneuvers, Herzberg might still be alive.

So why disable those features? Employees who spoke with reporters say turmoil in Uber may have led to those decisions. Specifically, providing CEO Dara Khosrowshahi with a pleasant ride in an Uber vehicle was a major concern for Uber senior management, who decided that a smooth ride was more important than safety.

Among issues highlighted by Business Insider were warning signals that were not taken seriously, incentives and pressures for teams to progress too quickly, and engineers who were told that the ride should be as pleasant as possible. Beyond all that, sources told reporters that Uber’s Advanced Technologies Group suffers from dysfunction and infighting.

Employees told the news outlet that engineers felt the vehicle’s autonomous software was “immature” and could not reliably recognize pedestrians in some circumstances, including at near range. Shade from tree branches could cause the car to stop, call for remote assistance, or pull itself out of computer control because the car classified the shade as a physical object blocking the way.

Sources also said they felt Uber vehicles did not undergo enough testing in simulated settings before being sent out on the road. The vehicles underwent some track testing, but that testing was reportedly disorganized, with various teams running different tests and no one sharing the data. If one team tested a feature that failed, it might not alert other teams to that failure. Meanwhile, the people testing the cars on the road complained that their feedback wasn’t resulting in any changes to the vehicles.

The Advanced Technologies Group was also concerned that Khosrowshahi would shut down the division if he wasn’t deeply impressed with his Uber ride in April. Meeting this goal, however, meant tuning the vehicles not to respond to every perceived obstacle, even though the software still had difficulty classifying what it saw. The engineers responded by saying they would turn off the vehicle’s emergency maneuvers, such as slamming on the brakes or swerving.

The rationale was that by shutting those features off, the safety driver would have to remain alert and would take over in an emergency situation. This strategy might have worked if the vehicles still carried two drivers, but by this point Uber’s vehicles were being tested with only one driver. That one driver still had to monitor the car for messages and remain alert while the vehicle did the driving.

It’s a situation filled with irony that shouldn’t be ignored: self-driving vehicles are marketed as safer than human-driven ones because they eliminate the risk of human error — by far the most significant cause of crashes on U.S. roads — but to test the vehicles, Uber relied on a single person to make no errors while monitoring both how the vehicle drove and the vehicle’s surroundings.

Uber Plans to Resume Self-Driving Vehicle Testing

Following Herzberg’s death, Uber pulled its self-driving vehicles from the road. It has since requested permission to resume testing in Pennsylvania and said it would do so with two drivers in all vehicles. The company also said it would activate the automatic emergency braking system at all times, rather than disabling the system.

“Today, we operate our self-driving vehicles with two Mission Specialists in the vehicle,” Uber writes in its report. “The Pilot, or operator behind the steering wheel, is solely focused on ensuring safe operation of the vehicle, while the Co-Pilot, the second operator in the right front seat, is tasked with monitoring and annotating the behavior of the self-driving system via a laptop.” The decision was made, Uber notes, to reduce workload and the risk of distraction, misuse, or fatigue.

Mission specialists will spend only four hours per day in the driver’s seat, and vehicles will be equipped with third-party driver monitoring.

How the vehicles work is laid out in Uber’s safety report — submitted to the National Highway Traffic Safety Administration — including descriptions of LiDAR, cameras, and RADAR. The report also sets out features for the next generation of Uber vehicles, including ultrasonic sensors and vehicle interface modules.

Uber has also hired former NTSB chair Christopher Hart to act as a safety advisor for the company.

Sources:

https://www.forbes.com/sites/davidsilver/2018/11/03/uber-will-resume-self-driving-car-testing-in-pennsylvania/#7580bb053d7e

https://uber.app.box.com/v/UberATGSafetyReport

https://www.businessinsider.com/sources-describe-questionable-decisions-and-dysfunction-inside-ubers-self-driving-unit-before-one-of-its-cars-killed-a-pedestrian-2018-10

https://www.wired.com/story/uber-self-driving-crash-arizona-ntsb-report/

https://www.ntsb.gov/investigations/AccidentReports/Reports/HWY18MH010-prelim.pdf


Wisner Baum

Appreciative of new technology advancements but keeping a vigilant eye on corporate shortcuts that put profits over consumer safety.