Waymo’s Self-Driving Taxi Service Set to Start, but Is It Safe?

Wisner Baum
Nov 29, 2018


Regardless of whether the public is ready, Waymo is pushing ahead with plans to have its self-driving taxis on the road by the end of 2018. That’s good news for people excited about the future of self-driving cars, but for those with concerns about safety, alarm bells are going off all over the place. Current automobile regulations require little in the way of safety certification, and safety advocates worry that Waymo’s self-driving taxis could have disastrous consequences, including preventable fatalities.

Are Waymo Vehicles Safe?

Waymo, which got its start as a Google self-driving car project in 2009, is moving forward with plans to have its self-driving taxi service up and running in Phoenix, Arizona, by December 31, 2018. That’s about two months away. The Waymo taxi service won’t be a pilot project, either; it is intended to be a fully operational, revenue-generating service. The only difference from a conventional taxi service is that no one will be sitting in the driver’s seat.

But what has Waymo done to prove its vehicles are safe? So far, it’s operated its pilot project in Phoenix, but it hasn’t had to go through any strict approval process to put its cars on the road and people in the passenger seats. And according to reports, Waymo won’t have to do anything, really, to prove its vehicles are safe before they go out on the road.

That’s because regulators are hesitant to limit the use of driverless cars. They say too much regulation will slow the process of getting driverless cars on the road, delaying the benefits of autonomous vehicles. With limited regulation, there is little room for oversight at either the state or federal level. That means people could be getting into cars whose safety has never been proven through rigorous testing.

Does Waymo Have to Follow Any Transportation Laws?

Currently, there are transportation laws that Waymo must follow, but even those could theoretically vanish. To put its cars on the road, Waymo must comply with the Federal Motor Vehicle Safety Standards. It does so by using a vehicle that already meets those standards: the Chrysler Pacifica. By adding its technology to Chrysler Pacificas, Waymo satisfies the requirements without doing anything more. Even though the Pacifica is adapted to be driverless, the driverless technology itself is not required to meet any particular standards.

Waymo and other driverless car makers might not even have to meet Federal Motor Vehicle Safety Standards in the future. That’s because existing regulations are outdated when it comes to autonomous vehicles, leaving regulators to debate deregulating driverless cars.

Current laws, for example, require all cars to have a steering wheel and gas and brake pedals. Driverless vehicles don’t need those, and in fact, having them could be dangerous. Consider the consequences of a passenger attempting to take control of a fully autonomous car even without an impending emergency. Furthermore, for companies like Waymo that offer taxi services, leaving the steering wheel and pedals in the vehicle means the driver’s seat must be kept off limits. So the answer isn’t as simple as just leaving in the steering wheel and pedals.

Concerns about steering wheels and pedals aren’t far-fetched. A Waymo Chrysler Pacifica crashed in June 2018 after the operator, who was in the driver’s seat monitoring the vehicle’s systems, fell asleep and accidentally disabled the vehicle’s self-driving system. The Waymo vehicle was on a freeway when the self-driving features were turned off, apparently when the operator’s foot hit the gas pedal. The car warned that the system was disabled and attempted to alert the driver to take control, but the driver, being asleep, did not respond. The car then crashed into a median.

It’s lucky the crash did not have more tragic consequences, but the accident highlights what can go wrong when a vehicle’s operator is incapacitated or accidentally disengages the technology and the car does not come to a safe stop.

In addition to the Waymo crash, there have already been multiple fatal accidents involving self-driving cars, even with drivers behind the wheel to take control. Uber, Tesla, and Lyft partner vehicles have all had mishaps, some with tragic consequences. Those crashes underscore the problem with allowing self-driving vehicles on the road when only the base vehicle, and not the autonomous technology, has been proven safe.

In March 2018, a pedestrian was struck and killed by a self-driving Uber car even though there was a human in the driver’s seat. The car involved was a Volvo XC90 that came with its own standard safety features, but Uber disabled them to prevent “erratic vehicle behavior.” Leading up to the accident, the car’s self-driving technology detected the pedestrian but failed to classify her accurately. At 1.3 seconds before the crash, the system concluded that emergency braking was needed to prevent the collision. Uber, however, had disabled automatic emergency braking, and the car did not stop in time.

This leads to a vital question: How can regulators say a car is safe if its safety features can be turned off to prevent a conflict with the self-driving technology?

Safety Advocates Face Off in the Push for Autonomous Cars

While many safety-conscious citizens are concerned about the lack of regulation involving self-driving vehicles, other safety proponents say slowing progress could mean additional lives lost unnecessarily.

Traffic accidents are a leading cause of preventable death in the U.S., and much of that toll is due to driver error. Self-driving cars eliminate some of the risks of driver error. After all, if there’s no driver behind the wheel, there’s no risk of drunk driving or speeding.

At the same time, even the most enthusiastic supporters of autonomous vehicles agree that the initial testing and transition process for driverless vehicles will result in some additional deaths and injuries.

Study Shows Waiting for Perfect Driverless Vehicles Costs Lives

It’s not just safety advocates arguing the point. A 2017 study by the RAND Corporation found that waiting until self-driving technology is perfect before rolling it out could mean thousands of preventable deaths from human-error crashes.

The study analyzed three scenarios: putting self-driving cars into use when they are only 10 percent better than human drivers, delaying use until they are 75 percent better, and delaying use until they are 90 percent better. It found that introducing the “imperfect” cars sooner would save thousands more lives over the next 15 years, and possibly hundreds of thousands over 30 years.
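
The direction of that finding can be illustrated with a simple back-of-the-envelope calculation. The Python sketch below is not RAND’s model; every input is a hypothetical assumption: a constant baseline of roughly 37,000 U.S. traffic deaths per year, instant full-fleet adoption, and 90-percent-better technology arriving 15 years from now regardless of the deployment choice.

# A minimal back-of-envelope sketch, not RAND's actual model.
# All inputs below are hypothetical assumptions for illustration.

BASELINE = 37_000   # assumed annual U.S. traffic deaths
WAIT = 15           # assumed years until 90%-better AVs are ready
HORIZON = 30        # total years compared

# Path A: deploy 10%-better AVs now, upgrade to 90%-better at year 15.
saved_early = 0.10 * BASELINE * WAIT + 0.90 * BASELINE * (HORIZON - WAIT)

# Path B: keep human drivers until the 90%-better AVs are ready.
saved_late = 0.90 * BASELINE * (HORIZON - WAIT)

print(f"Deploy imperfect AVs early: ~{saved_early:,.0f} lives saved")
print(f"Wait for near-perfect AVs:  ~{saved_late:,.0f} lives saved")
print(f"Cost of waiting:            ~{saved_early - saved_late:,.0f} lives")

Under these toy numbers, waiting costs roughly 55,000 lives over the first 15 years. The point is the direction of the trade-off, not the precise figures.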

While safety studies might suggest a decrease in traffic fatalities caused by distracted or impaired drivers, those studies may not take into account potential defects linked to software or sensors that could affect thousands of vehicles and cause accidents of their own. Fatalities linked to human error might drop, but is it worth it if there is a significant increase in fatalities linked to technical defects?

Not All Safety Groups Are Convinced Autonomous Vehicles Are Ready

Critics of deregulation say that pushing self-driving cars on the road without proving their safety isn’t the answer. They argue doing so opens the door to other catastrophes: accidents linked to vehicles that aren’t safe enough to be on the road. Furthermore, they argue, without adequate safety measures there is a greater likelihood of the public rejecting self-driving cars.

In a March 5, 2018, letter to the Majority and Minority leaders of the U.S. Senate, 28 representatives of a variety of safety advocacy groups, including the Consumer Federation of America, the Center for Auto Safety, and the Emergency Nurses Association, expressed their “strong objections” to the AV START Act (American Vision for Safer Transportation Through Advancement of Revolutionary Technologies), a bill that would deregulate self-driving cars and exempt them from current safety regulations.

In their letter, they argue that the AV START Act was written to solve a problem that has no basis in reality. They note that the issues delaying driverless cars are less about regulation and more about developing adequate technology and addressing operational challenges, including weather and traffic.

“Baseless and exaggerated predictions about the readiness and reliability of driverless car technology are propelling legislation that significantly strips the current federal regulatory system of its appropriate authority and oversight, thereby endangering the safety of everyone — both motorists and non-motorists,” the advocates wrote.

The letter demands changes to the AV START Act such as:

  • Reducing the number of vehicles given exemptions from safety standards.
  • Removing the section allowing autonomous vehicle manufacturers to turn off vehicle systems at their discretion.
  • Establishing minimum performance standards.
  • Ensuring consumers have adequate information about autonomous vehicles.
  • Including Level 2 autonomous vehicles in safety provisions.
  • Addressing the needs of people who are disabled.
  • Removing provisions that prevent states from developing their own regulations.

Public Perception Vital in Accepting Waymo’s Self-Driving Cars

Public perception is also critical to how readily self-driving cars will be accepted. People are less accepting of deaths caused by machine or computer error than of deaths caused by human error. A study conducted by Advocates for Highway & Auto Safety found that 63 percent of respondents were not comfortable with self-driving vehicles receiving mass exemptions from current safety standards, while 75 percent were not comfortable with vehicle equipment such as the steering wheel and pedals being disconnected.

A large majority of respondents, 73 percent, supported the development of safety standards to regulate driverless-car features, and 84 percent supported rules requiring human drivers to remain alert so they can take control of the vehicle when necessary. Meanwhile, 81 percent supported cybersecurity rules to prevent hackers from attacking driverless cars.

With several accidents already making headlines, many people are likely casting a wary eye towards autonomous vehicles. Proving that the vehicles are safe is one way to win acceptance. Simply putting these new vehicles on the road and proclaiming them to be safer than the old vehicles likely won’t convince the general public.

Some safety advocates note that the National Transportation Safety Board (NTSB) is investigating five crashes involving self-driving vehicles. They urge those investigations to be completed before legislation is passed.

If self-driving car makers and technology companies like Waymo cite increased safety as a primary reason to allow their vehicles on the road, they should be required to prove that their vehicles are actually safe, rather than expect the public to believe their claims without proof. They might not have to show their vehicles are perfect, but many believe they should at least be required to prove they are safe.

Sources:

https://arstechnica.com/cars/2018/10/waymo-wont-have-to-prove-its-driverless-taxis-are-safe-before-2018-launch/

https://arstechnica.com/cars/2018/03/congress-debates-allowing-tens-of-thousands-of-cars-with-no-steering-wheel/

https://www.consumerreports.org/autonomous-driving/faster-rollout-self-driving-cars-would-save-lives/

https://mashable.com/article/jaguar-i-pace-electric-waymo/#cvyky.JKXPq3

https://www.forbes.com/sites/bernardmarr/2018/09/21/key-milestones-of-waymo-googles-self-driving-cars/#f8eb03153690

https://www.claimsjournal.com/news/international/2018/10/04/287100.htm

https://www.wgbh.org/news/national-news/2018/10/05/transportation-department-looks-to-clear-the-road-for-cars-without-steering-wheels

https://venturebeat.com/2018/10/04/u-s-will-revise-road-safety-rules-for-fully-self-driving-cars/

http://saferoads.org/2018/03/05/letter-to-senate-leaders-on-driverless-car-bill/

https://jalopnik.com/googles-waymo-self-driving-car-crashed-after-driver-doz-1829520224

https://news.google.com/articles/CAIiEIwmFC8MOJVjHKmD9Ac49ZQqGAgEKg8IACoHCAowi4-MATDXsRUwhPieAw?hl=en-CA&gl=CA&ceid=CA%3Aen

https://jalopnik.com/ubers-autonomous-car-had-six-seconds-to-prevent-fatal-c-1826290552

http://saferoads.org/wp-content/uploads/2018/01/AV-Poll-Report-January-2018-FINAL.pdf

https://www.cnbc.com/2018/06/12/self-driving-car-legislation-in-congress-doesnt-go-far-enough.html

Wisner Baum

Appreciative of new technology advancements but keeping a vigilant eye on corporate shortcuts that put profits over consumer safety.