Autonocast Episode Leaves Elaine Herzberg’s Death in Vain
I just listened to Autonocast’s podcast #136, on the one-year anniversary of Elaine Herzberg’s death, and read Ed Niedermeyer’s article on the ten things that have been learned because of it. Unfortunately, it appears Elaine Herzberg’s death has been in vain so far.
While these folks made some good points, they continue to miss the boat completely and wind up enabling more needless deaths rather than stopping them. They continue to make the extremely flawed assumption that using shadow and safety driving to develop these systems is the best or only method to create them. That approach will result in thousands more deaths when the AV makers move from benign scenarios to complex and actual accident scenarios. Until these folks get on the right side of the issue, they will be part of the problem. The solution, of course, is for 99.9% of this development to use proper simulation, meaning aerospace/DoD/FAA-level simulation and systems engineering. (The simulation systems currently being used in the industry have far too many technical flaws to get anywhere near L4.)
Items of Note
· At one point, Alex stated that the Tesla deaths are not relevant because Tesla is not “self-driving”. NONE of the systems out there are self-driving. They are ALL in development, “L3” or not. (L3 just means a member of the public acting as the safety driver is killed needlessly instead of an employee.) The Tesla difference, other than recklessly not using LiDAR, is that Tesla operates in a much more complex geofence, experiencing far more complex and dangerous scenarios than virtually everyone else. Those are scenarios the others are afraid of but cannot avoid forever. When they stop avoiding them and everyone gets to even more complex scenarios, the death rate will go way up.
· They discussed trust being the most valuable commodity. That is correct. 71% of the public don’t trust these systems now, a number that has gone up since last year. Wait until the first child or family dies needlessly, or until the public understands for the first time that thousands will die when the AV makers train their systems on accident scenarios. You can’t build trust while advocating a method that kills people for no reason, a method that can never get close to producing a legitimate autonomous vehicle.
· They posited the Elaine Herzberg tragedy happening to Waymo and asked whether the outcome would be different because Waymo’s credibility (hype) is better. The answer isn’t relevant overall. It is impossible for any AV maker using the public shadow and safety driving approach to avoid killing many, many people. The only reason Waymo hasn’t killed someone yet is that their geofence is much less complex than Tesla’s. (And they use LiDAR and probably don’t turn off critical systems as Uber did.) Think it through: what happens when Waymo or anyone else gets to those complex or accident scenarios?
· Alex Roy said, again, that calling out BS is important? When I recently did that with Alex, he blocked me on LinkedIn and Twitter rather than countering my argument. So much for walking the talk and the courage of one’s convictions. (This after they had me on their podcast to discuss my POV, then refused to air it.)
Some more relevant articles on the subject
SAE Autonomous Vehicle Engineering Magazine — End Public Shadow Driving
- https://www.nxtbook.com/nxtbooks/sae/ave_201901/index.php
Blocked by Alex Roy for pointing out his Hypocrisy and Recklessness
Common Misconceptions about Aerospace/DoD/FAA Simulation for Autonomous Vehicles
The Hype of Geofencing for Autonomous Vehicles
- https://medium.com/@imispgh/the-hype-of-geofencing-for-autonomous-vehicles-bd964cb14d16
Rafaela Vasquez and Elaine Herzberg are both victims in Uber tragedy — There is a better way