Waymo Public Shadow Driver Blamed for Accident — Shows this Method is Untenable

Michael DeKort
Published in Predict
3 min read · Nov 11, 2018

Waymo recently blamed one of its public shadow drivers for an accident: the driver took control of the vehicle to avoid a crash he did not trust the AI to handle. According to Waymo, that driver actually caused the accident. John Krafcik, the CEO, stated that their simulations prove this. Of course, Waymo did not provide that proof to the public, neither the simulation data nor the test track data. (I would not trust the information if they released it now. Odds are it would be rigged.)

This situation shows several reasons why this approach can never get remotely close to creating a legitimate driverless vehicle. Just look at this scenario. Waymo is asking a human, who has millions of years of fight-or-flight instinct built into them, to allow a system under development, not a proven system, to do the right thing in a hazardous situation that may cause their injury or demise. How much simulation testing does Waymo share with these drivers to give them confidence? How much of that is validated against the real world? Has anyone proven that the simulation itself works correctly? My experience in this industry tells me the simulation products in this industry have significant fidelity issues. Examples include proper real-time performance (not gaming-grade approximations), as well as vehicle, tire, and road models, especially when their performance envelopes are pushed or the environment is degraded.

To take this example further, AV makers will have to get their drivers, who are not driving professionals, to experience thousands of accident scenarios thousands of times each, and to sacrifice their lives driving through those events as well as they can possibly be driven. Anything less and the AI will not be trained well. This is simply not going to happen. (Now imagine what happens when the public, press, governments, insurers, etc. realize this. Or after the first child or family is killed needlessly. Or after being told they have to accept thousands more.)
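To put rough numbers on the training burden described above, here is a minimal back-of-envelope sketch. The scenario and repetition counts are illustrative assumptions of mine, not figures from Waymo or the article:

```python
# Back-of-envelope estimate of hazardous training drives per AV maker.
# Both inputs are illustrative assumptions, not published figures.
accident_scenarios = 5_000   # assumed number of distinct crash scenarios
repetitions_each = 2_000     # assumed repetitions needed per scenario

drives = accident_scenarios * repetitions_each
print(f"{drives:,} hazardous drives")  # 10,000,000 hazardous drives
```

Even with these conservative assumptions, each AV maker would be asking untrained drivers to perform millions of deliberately dangerous drives.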

Beyond this, it is a myth that public shadow driving is the best or only way to create a fully autonomous vehicle. The fact is that it will be impossible to get anywhere close to a driverless state using this method. First, there is the time and money required: it is not possible to drive and re-drive, stumble and re-stumble upon, all the scenarios necessary to complete the effort. That effort would require one trillion miles to be driven and over $300B to be spent by each AV maker. The other problems involve safety, namely the running of actual accident scenarios to train the AI, as mentioned above, and L3/handover.
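For context, the two headline figures above imply a per-mile cost. This sketch only derives that implied rate from the article's own numbers; the $0.30/mile result is a derived value, not a sourced one:

```python
# Implied cost per mile from the article's estimates:
# ~1 trillion miles driven and ~$300B spent per AV maker.
total_miles = 1e12        # one trillion miles (from the article)
total_cost = 300e9        # $300 billion (from the article)

cost_per_mile = total_cost / total_miles
print(f"${cost_per_mile:.2f} per mile")  # $0.30 per mile
```

At roughly $0.30 per mile, every order-of-magnitude shortfall in miles driven leaves an enormous gap in scenario coverage that money alone cannot close quickly.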

For more on why public shadow driving is untenable, and on how to develop these systems successfully and safely, please see my other articles:

Impediments to Creating an Autonomous Vehicle

· https://medium.com/predict/impediments-to-creating-an-autonomous-vehicle-d3cfee299749

Save Autonomous Vehicles companies from themselves — So they can Save Us

· https://medium.com/predict/save-autonomous-vehicles-companies-from-themselves-so-they-can-save-us-1b76bb52d383

Photo by Eduardo Flores on Unsplash


Non-Tribal Truth Seeker-IEEE Barus Ethics Award/9–11 Whistleblower-Aerospace/DoD Systems Engineer/Member SAE Autonomy and eVTOL development V&V & Simulation