The Crash of the Autonomous Vehicle Industry
In the past few weeks several major players in the driverless industry, Waymo and GM among them, have said they now believe it will be decades before a real self-driving vehicle is on the road. The leader of that pack is Aurora's Chris Urmson. He not only came to the same realization over six months ago, he also said that disengagements and miles mean little; what matters is the scenarios learned. (Uber has recently said the same. I wrote the same thing over a year ago.) There has also been a wave of partnerships announced recently, something I predicted months ago. Why would companies risk sharing IP? They need to spread what they now know will be massive costs. In that same article I also stated that bankruptcies are coming. They will come because there is a critical reality those creating these systems still do not see, in spite of their incremental epiphanies: even that 20-to-30-year estimate isn't remotely close. The real answer is that they will never get remotely close to finishing, not much farther than the first base they are on now. The root cause is the approach the vast majority of the industry uses to create these systems.
· Use of public shadow driving instead of proper simulation 99% of the time — It is a myth that you can drive and redrive, stumble and restumble on enough scenarios, enough times, to get close to building a legitimate autonomous vehicle. RAND says 500 billion miles are needed to demonstrate performance 10x better than a human; Toyota said a trillion miles. (To drive those miles in 10 years would cost over $300B.) Then there are the safety issues, those regarding handover and running accident scenarios. Handover cannot be made safe no matter what monitoring and notification system is used, because enough time cannot be provided to regain proper situational awareness in critical scenarios. The other issue is that thousands of accident scenarios will have to be run thousands of times over to train the AI on them, which will cause thousands of needless casualties. (Waymo, Ford, Volvo and Aurora have stated that handover is dangerous and should not be used. Yet each still relies on it, because they do not understand how to do this properly.)
· The simulation being used in the industry has significant shortfalls — Among them are real-time and latency issues, as well as imprecise vehicle, tire and road models. These will lead to significant false confidence, which will not be exposed until analogous real-world tragedies occur. (A related issue is the failure to use full-motion simulators. Motion cues, whether their presence or their absence, are crucial; without them you cannot train or test properly.)
· Using a bottom-up Agile approach — Agile does not work when systems are this large and complex. Failing to add a top-down, integrated systems-engineering approach will waste massive amounts of time and money. The hardest scenarios should be worked on at the same time as the bottom-up effort: usually when the most difficult scenarios work, everything below them does as well, and design and execution flaws are often found early that would otherwise translate into rework at the lower end. Too much rework will cripple you as well. Aerospace and DoD systems engineering practices should be leveraged here.
· Not creating and using an end-state scenario matrix — Both the validators and the builders need to see, now, what done and the most difficult scenarios look like. Test cases or scenarios, especially those related to operating domains, should also be made public and/or covered by a proper third party. FAA practices should be the guide here.
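To make the end-state matrix idea concrete, here is a minimal sketch of what such a structure might look like. The field names, difficulty scale, and example scenarios are my illustrative assumptions, not an industry standard:

```python
# Hypothetical minimal end-state scenario matrix: each row pairs an
# operating domain with a scenario, a difficulty rating, and a
# validation status, so builders and validators can see what "done"
# looks like from day one. All names and values are illustrative.
from dataclasses import dataclass

@dataclass
class Scenario:
    operating_domain: str   # e.g. "urban, night, heavy rain"
    description: str
    difficulty: int         # assumed scale: 1 (routine) .. 5 (hardest)
    validated: bool = False

matrix = [
    Scenario("highway, clear, day", "lane keep with merging traffic", 1),
    Scenario("urban, night, heavy rain", "unprotected left, pedestrian", 5),
]

# "Done" means every scenario, including the hardest, is validated.
remaining = [s for s in matrix if not s.validated]
print(f"{len(remaining)} of {len(matrix)} scenarios still open")
```

Keeping the difficulty-5 rows visible in the matrix from the start mirrors the earlier point: work the hardest scenarios in parallel with the easy ones, rather than discovering them last.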
Getting back to these epiphanies: how is the investment community going to take them? Spending hundreds of millions or billions of dollars per year for a couple of years is one thing. Now they are supposed to do that for 30 years or more? (Usually estimates shrink with experience; they do not grow by an order of magnitude.) What happens when investors learn that those decades actually mean never? Or that the lives lost, and the many more that will be lost, were unnecessary? Including the unavoidable loss of children and families that will occur as these systems train and test their AI on thousands of accident scenarios run thousands of times over? Will the investors, public, insurers, governments and the press conclude these developers and OEMs were inexperienced, withheld the truth, or both? What happens then? Do they plow more money in or bail? With roughly $200B invested in autonomous vehicles, the Nasdaq could take a huge hit when the post-realization tidal wave begins. How long will pride, ego and fear keep these folks from seeing, and admitting, that they have been doing this wrong and have to flip the development paradigm?
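The scale of that spending can be sketched with a back-of-envelope calculation against the trillion-mile and $300B figures cited earlier. The per-mile cost and per-vehicle utilization below are illustrative assumptions I am introducing, not published numbers:

```python
# Back-of-envelope on the public shadow driving numbers cited above.
# COST_PER_MILE and MILES_PER_CAR_PER_YEAR are illustrative
# assumptions, not published figures.

MILES_NEEDED = 1_000_000_000_000    # Toyota's trillion-mile figure
YEARS = 10
COST_PER_MILE = 0.30                # assumed: driver, vehicle, sensors, data
MILES_PER_CAR_PER_YEAR = 100_000    # assumed: ~274 miles/day, every day

total_cost = MILES_NEEDED * COST_PER_MILE
annual_burn = total_cost / YEARS
fleet_size = MILES_NEEDED / (YEARS * MILES_PER_CAR_PER_YEAR)

print(f"Total: ${total_cost / 1e9:.0f}B "
      f"(${annual_burn / 1e9:.0f}B/year, "
      f"fleet of {fleet_size:,.0f} vehicles)")
# -> Total: $300B ($30B/year, fleet of 1,000,000 vehicles)
```

Even under these charitable assumptions, the burn rate lands at tens of billions per year sustained for a decade, which is the arithmetic behind the investor question above.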
(One other comment: John Krafcik, Waymo's CEO, said recently that there are weather conditions these systems will never operate in. I believe this was said to lower the bar and make the massive increase in funds needed look less onerous. While empirically true today, I believe that if the process I described above were used, these systems could reach the point where they function better than a human in any condition the vehicle itself can handle. The vehicle should be the weak link, not the AV system. DoD aircraft fly in some pretty bad conditions. How has that been possible for over 40 years? The right engineering approach and sensors.)
More info here:
SAE Autonomous Vehicle Engineering Magazine-End Public Shadow Driving https://www.nxtbook.com/nxtbooks/sae/ave_201901/index.php
Common Misconceptions about Aerospace/DoD/FAA Simulation for Autonomous Vehicles https://medium.com/@imispgh/common-misconceptions-about-aerospace-dod-faa-simulation-for-autonomous-vehicles-2b3ad84b0aa1
Using the Real World is better than Proper Simulation for Autonomous Vehicle Development — NONSENSE https://medium.com/@imispgh/using-the-real-world-is-better-than-proper-simulation-for-autonomous-vehicle-development-nonsense-90cde4ccc0ce
The Hype of Geofencing for Autonomous Vehicles https://medium.com/@imispgh/the-hype-of-geofencing-for-autonomous-vehicles-bd964cb14d16
My name is Michael DeKort. I am a former systems engineer, engineering manager and program manager for Lockheed Martin. I worked in aircraft simulation, was the software engineering manager for all of NORAD, and worked on the Aegis Weapon System and on C4ISR for DHS.
I am a member of the SAE On-Road Autonomous Driving Validation & Verification Task Force and was recently asked by SAE to lead an effort to establish a new Modeling and Simulation group.
I am a stakeholder for UL4600 — Creating AV Safety Guidelines.
I have also been presented with the IEEE Barus Ethics Award and am on the IEEE Artificial Intelligence & Autonomous Systems Policy Committee (AI&ASPC).