Uber isn’t the only Autonomous Vehicle maker that should fear killing a child

Michael DeKort
Published in Predict · Nov 22, 2018

Julie Bort of Business Insider has written an incredible and very thorough story on the issues inside Uber prior to the death of Elaine Herzberg. It documents all the issues I have been discussing for over a year. Don’t think for a second, though, that this is limited to Uber. It is an industry-wide, systemic problem. (Several articles have appeared documenting issues within Tesla’s engineering ranks. Hopefully more engineers from across the industry will start speaking out.)

Some of the most egregious points from the article “Uber Insiders describe infighting and questionable decisions before its self-driving car killed a pedestrian” (Business Insider, 11-19-2018):

· The engineers knew the system was not ready; they feared it would kill a toddler and that every person had a 12% chance of dying during a ride

· The company used the shadow driver as a scapegoat (something Waymo has done recently as well)

· The system spotted Elaine Herzberg six seconds prior to the accident, but it did not react properly because it was configured not to, all to ensure a smoother ride, especially ahead of an upcoming demo for CEO Dara Khosrowshahi. This included limiting the car’s ability to brake and swerve.

· Employees felt pressure to keep the hype going and to provide a comfortable rather than a safe ride. They were afraid the CEO would cut the program or that they would be seen as failures, and they wanted to protect their enormous salaries of $400k and above.

· There was rampant infighting and miscommunication within the group.

· The system had major flaws: trouble seeing at night, mistaking shadows from moving branches for objects in its path, and being unable to predict the path a person would follow.

· The team did not use enough simulation, and the simulation it had was “utter garbage”.

· Public shadow drivers were asked to look away from the road to track metrics on a smart device.

I have been trying to convince the industry of these issues, and of how to remedy them, for over a year. It is a myth that public shadow driving is the best or only way to create a fully autonomous vehicle. The fact is that it will be impossible to get anywhere close to a driverless state using this method. First, there is the time and money required: it is not possible to drive and redrive, stumble and restumble upon, all the scenarios necessary to complete the effort. That effort would require roughly one trillion miles to be driven and over $300B to be spent by each AV maker. The other problems involve safety, namely the running of actual accident scenarios to train the AI, and L3/handover. Handover cannot be made safe by any monitoring and notification system, because such systems cannot provide the time needed to regain proper situational awareness in critical scenarios.
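As a rough illustration of the scale involved, here is a minimal back-of-envelope sketch. Only the trillion-mile figure comes from the paragraph above; the per-mile cost, fleet size, and average speed are hypothetical assumptions of mine, chosen simply to show why the time and money make the approach untenable.

```python
# Back-of-envelope sketch of the public shadow-driving math.
# ASSUMPTIONS (mine, for illustration only): a fully loaded cost of
# ~$0.30 per test mile and a 1,000-car fleet averaging 40 mph
# around the clock. Only MILES_NEEDED is from the article text.

MILES_NEEDED = 1_000_000_000_000   # one trillion miles (figure from the text)
COST_PER_MILE = 0.30               # hypothetical fully loaded cost, USD
FLEET_SIZE = 1_000                 # hypothetical test fleet
AVG_SPEED_MPH = 40                 # hypothetical average speed
HOURS_PER_YEAR = 24 * 365

total_cost = MILES_NEEDED * COST_PER_MILE
fleet_miles_per_year = FLEET_SIZE * AVG_SPEED_MPH * HOURS_PER_YEAR
years_needed = MILES_NEEDED / fleet_miles_per_year

print(f"Estimated cost: ${total_cost / 1e9:.0f}B")    # ~$300B
print(f"Estimated time: {years_needed:,.0f} years")   # roughly 2,900 years
```

Whatever per-mile cost or fleet size you assume, the orders of magnitude do not change enough to make the approach workable.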

The significant areas of concern are:

· Use of Public Shadow Driving instead of proper simulation 99% of the time

· The simulation being used in the industry has significant shortfalls

· Using a bottom-up Agile approach instead of aerospace/DoD-level systems engineering practices

· Not creating and using an End-State Scenario Matrix (a sketch of what one entry might look like follows this list)
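To make the last item concrete, here is a minimal sketch of what a single entry in an end-state scenario matrix might look like. The fields and the example values are hypothetical illustrations of the idea, not a format taken from any AV maker or standard.

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """One row of a hypothetical end-state scenario matrix."""
    scenario_id: str              # unique identifier
    description: str              # what the vehicle must handle
    environment: dict             # road type, lighting, weather, etc.
    actors: list = field(default_factory=list)  # pedestrians, cyclists, other vehicles
    expected_behavior: str = ""   # the pass/fail behavior the system must exhibit
    validation_method: str = "simulation"  # simulation, closed track, or (rarely) public road
    status: str = "not_run"       # not_run, passed, failed

# Example entry, loosely modeled on the conditions of the Herzberg crash
crossing_at_night = Scenario(
    scenario_id="PED-042",
    description="Pedestrian walking a bicycle crosses mid-block at night",
    environment={"road": "4-lane arterial", "lighting": "dark", "weather": "clear"},
    actors=["pedestrian_with_bicycle"],
    expected_behavior="Detect, classify, and brake to a stop with adequate margin",
    validation_method="simulation",
)
```

The point of such a matrix is that the vast majority of these entries would be exercised and re-exercised in simulation, not stumbled upon on public roads.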

When will this be remedied? Will it take the actual death of a child or a family? Thousands of them? Because that is where we are headed as the scenarios being run progress to the complex and actual accident scenarios. Or, as Elon Musk and the former head of NHTSA Mark Rosekind, now the “Chief Safety Officer” at Zoox, have said, do we need to get used to the deaths so we can save more lives later? What happened to governments staying out of the way so the industry could police itself and create actual best practices? What we have instead is a hype-driven Wild West, an industry that is beyond self-destructive. It would be one thing if all they destroyed was equity. The problem is they are destroying lives needlessly. The most amazing part is that, even setting safety aside, they are pursuing business suicide: the public shadow driving approach can never come close to producing a true autonomous vehicle, and the lives lost trying are unnecessary. Is this what NHTSA, the NTSB, and state and local officials want? Recently NHTSA finally stepped up and stopped EasyMile from using an autonomous school shuttle under development. Will they do something here and follow the FAA model, without the litany of tragedies that finally forced the government to create that model? Or is there a point where the industry has the courage to admit it must flip the paradigm?

More on this can be found in my articles:

The Crash of the Autonomous Vehicle Industry

· https://medium.com/predict/the-crash-of-the-autonomous-vehicle-industry-f71fd26c1ed0

Impediments to Creating an Autonomous Vehicle

· https://medium.com/predict/impediments-to-creating-an-autonomous-vehicle-d3cfee299749

Waymo Public Shadow Driver Blamed for Accident — Shows this Method is Untenable

· https://medium.com/predict/waymo-public-shadow-driver-blamed-for-accident-shows-this-method-is-untenable-4142e0ac6bbb

Former director of NHTSA Dr. Mark Rosekind’s reckless position and track record on Autonomous Vehicles

· https://medium.com/@imispgh/former-director-of-nhtsa-dr-b80decc1ab57

NHTSA saved children from going to school in autonomous shuttles and leaves them in danger everywhere else

· https://medium.com/@imispgh/nhtsa-saved-children-from-going-to-school-in-autonomous-shuttles-and-leaves-them-in-danger-4d77e0db731

Save Autonomous Vehicles companies from themselves — So they can Save Us

· https://medium.com/predict/save-autonomous-vehicles-companies-from-themselves-so-they-can-save-us-1b76bb52d383

Photo by Luke Stackpoole on Unsplash


Non-Tribal Truth Seeker · IEEE Barus Ethics Award · 9/11 Whistleblower · Aerospace/DoD Systems Engineer · Member, SAE Autonomy and eVTOL development V&V & Simulation