Uber’s negligence killed someone. It can’t be allowed to happen again.
Until there are strict safety regulations, self-driving vehicles need to be removed from public roads.
The death of a pedestrian in Tempe, Arizona and the revelations that have followed that tragic event make one thing absolutely clear: Uber is not prioritizing safety in its self-driving vehicle tests. And that's not just the safety of pedestrians, but of everyone on the road: cyclists, drivers, and even its own test drivers.
Tesla was the first company to have one of its autonomous vehicles involved in a fatal collision in which the driver was killed, and now Uber is responsible for the first pedestrian death by self-driving vehicle. Uber is cutting corners on safety and putting its vehicles in situations its technology is not prepared to handle because it's desperately trying to catch up to its competitors, yet it's still failing miserably. This can't be allowed to continue.
An overview of what happened
Let's backtrack to Sunday for a moment and lay out the facts as we currently know them. It was dark. A woman was walking her bike from the median to the other side of the road in an area where the streets are very wide and the crosswalks are very far apart. It's the typical car-oriented street design that has made Arizona the state with the highest rate of pedestrian fatalities in the United States.
At the same time, Uber's self-driving test vehicle was moving at 38 mph (61 km/h) in a 35 mph (56 km/h) zone, and somehow the LiDAR, the radar, and even the camera (in the final seconds) failed to detect that there was a person with a bike in its path. There was also a "safety" driver in the driver's seat (the company used to employ two drivers per vehicle, but recently cut down to just one in most situations), but she seemed to trust the vehicle to drive itself. She was too busy looking at a device in her lap to watch the road, raising a big question: how much can we really trust these "safety" drivers, who have also been spotted asleep and air drumming on the job?
The police initially reported that the pedestrian may have been at fault, a premature determination they later took back, as the video footage makes such a conclusion very difficult to support. The “safety” driver did not apply the brakes until after the woman crossing the street had already been struck.
Uber can’t be trusted
Uber has been playing catch-up in the self-driving vehicle space for a long time; the strategy isn't working, and safety was set aside long ago. The Waymo-Uber lawsuit over Uber illegally obtaining documents about Waymo's driverless vehicle program (a lawsuit that was settled in Waymo's favor) gave us a look at communications between former Uber CEO Travis Kalanick and the former head of its autonomous vehicle program, Anthony Levandowski. They show that Levandowski and Kalanick agreed Uber needed a "strategy to take all the shortcuts we can" and that they were looking for "cheat codes" to try to win the self-driving vehicle race: confirmation that cutting corners was part of the company's plan, not an unexpected and overlooked development.
During the lawsuit, Uber admitted that Waymo's tech was better than its own, but the extent of the gap is truly shocking for a company that is depending on its self-driving vehicle program to save it from running out of money: Uber lost $4.5 billion in 2017 with no path out of its deep, deep hole. Navigant Research's annual ranking of companies based on their autonomous vehicle tech placed Uber near the bottom, and documents recently obtained by The New York Times show that "Uber was struggling to meet its target of 13 miles [20 kilometers] per 'intervention' in Arizona" as of March 2018, compared to Waymo's reported 5,600 miles (9,000 kilometers) per human intervention. Uber's autonomous vehicle tech seems like little more than glorified cruise control if it requires humans to take over so often, especially in a place like Arizona, which has the best possible conditions for autonomous driving: clear weather and wide streets that are designed to exclude non-vehicle traffic.
And Uber knows it's struggling. Waymo releases quite a bit of data and frequent reports on its autonomous vehicle operations because California has reporting standards that companies must meet. Uber, however, left California for Arizona precisely so it wouldn't have to release data on its program, and it's reasonable to speculate that's because its technology is so unreliable.
Self-driving tests need to be regulated
I've been unapologetic in my criticism of the dominant view among technologists that self-driving vehicles will take over streets in a matter of years, and I've challenged the idea that they represent the future of urban transportation at all. This week, those same people were out in force, pointing out the unacceptably high number of fatalities from human-driven vehicles and asserting that self-driving vehicles are safer without any proof to back up the claim.
Even though I’m skeptical of autonomous driving technology, I accept that it will likely reach a point where it will be safer than human drivers, but I don’t accept that we’re already there. Maybe Waymo and GM are nearing that point, though I don’t think we have enough information to make such a determination. However, to say that Uber has reached that point is not just to be dishonest, but to put tech worship ahead of people’s lives.
Nearly 40,000 people died on US roads last year, including almost 6,000 pedestrians, and the vehicle fatality rate in the US is nearly 40 percent higher than in Canada and Australia; but that doesn't mean self-driving vehicles are automatically safer. For every traffic fatality in 2016, humans drove approximately 86 million miles (138.4 million km), yet self-driving vehicles have not come close to accumulating that kind of mileage: Uber's vehicles have driven somewhere north of two million miles (3.2 million km) in total. That does not make vehicles driven by autonomous systems safer than vehicles driven by humans.
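For readers who want to check the comparison themselves, here is a back-of-the-envelope calculation using only the approximate figures cited above (86 million miles per fatality for human drivers, and roughly two million total miles for Uber's fleet); the exact numbers are rough, but the gap is so large that precision hardly matters.

```python
# Rough comparison of miles driven per fatality, using the article's
# approximate figures. All numbers are ballpark estimates, not precise data.
human_miles_per_fatality = 86_000_000   # ~2016 US average, per the article
uber_total_miles = 2_000_000            # "somewhere north of two million miles"
uber_fatalities = 1                     # the Tempe pedestrian death

uber_miles_per_fatality = uber_total_miles / uber_fatalities
ratio = human_miles_per_fatality / uber_miles_per_fatality

print(f"Human drivers: ~{human_miles_per_fatality:,} miles per fatality")
print(f"Uber's fleet so far: ~{uber_miles_per_fatality:,.0f} miles per fatality")
print(f"Human drivers log roughly {ratio:.0f}x more miles per fatality")
```

Of course, a single fatality over two million miles is far too small a sample to estimate a true rate, and that's exactly the point: the data simply cannot support the claim that Uber's system is already safer than human drivers.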
The death of a pedestrian by Uber’s self-driving vehicle is tragic and should never have happened, but now that it has, it needs to be used as a wake-up call to force politicians to enact stricter safety regulations to govern autonomous vehicle testing. Some companies may be putting a lot of consideration into how to keep people safe, but it’s clear that not all are, and those vehicles can’t be allowed on public roads based on big promises from tech CEOs and their marketing departments that don’t reflect the on-the-ground reality.
San Francisco announced an initiative asking companies to demonstrate their autonomous vehicles to city officials before putting them on public roads, but it's voluntary. I'm not the expert to say exactly what the safety regulations should look like, though they should include strict reporting standards and a requirement that companies prove their autonomous systems are safe before those systems have to navigate around other vehicles, pedestrians, and cyclists. What is absolutely clear is that the status quo cannot continue.