The Problem in the Future of Driving

Every time you step into a car you run the risk of getting into an accident.

Your chances of dying in a motor vehicle crash are about 1 in 102. That may be morbid, but it is a fact. You are more likely to die in a car accident than to overdose on opioids or be the victim of a gun assault.

And every year in the United States, about 37,000 people lose their lives in vehicular accidents, roughly 101 per day. And according to the National Highway Traffic Safety Administration, 94 percent of serious crashes are due to human error.

So how do we curtail those numbers? Autonomous cars.

And what is an autonomous car? Have you ever seen I, Robot, the science fiction action movie starring Will Smith about a robot uprising against humanity? If you have not, here is a short clip from the film.

I, Robot car scene.

As you can see in the clip, Will Smith’s character is looking over pictures in a file while behind the “wheel” of a car, paying no attention to the road whatsoever. And this is what some people call the autonomous car, or the future of driving.

The future of driving with completely autonomous vehicles is a long way off, but companies like Waymo, Uber and Tesla are already developing cars that can navigate roads with little to no human assistance. This technology has come with mistakes along the way, mistakes that have unfortunately cost people their lives.

Walter Huang, Joshua Brown and Elaine Herzberg

Those are the names of three people who lost their lives in accidents involving autonomous cars. Walter Huang died in an accident while driving his Tesla Model X in Autopilot Mode. Joshua Brown met the same fate as Mr. Huang, except he was driving a Tesla Model S in Autopilot Mode. And Elaine Herzberg died after being struck by one of Uber’s self-driving cars while crossing the street.

The accidents that took the lives of the people mentioned above are the only deaths involving autonomous cars in the world. But those deaths have impacted the autonomous car industry heavily. People are now second-guessing the technology behind driverless cars and asking how it can be implemented on our roads safely.

Walter Huang

Walter Huang died on Friday, March 23, 2018, around 9:27 a.m. on his commute to Apple headquarters, where he worked as a software engineer. Mr. Huang passed after his Tesla Model X crashed into a barrier dividing Highway 101 and Highway 85 in Mountain View, California. His death gained media attention because of the vehicle he was driving, and then even more once it was discovered he had been using Autopilot Mode.

There are many details behind the crash: the defective barrier that divided the lanes of the two highways, claims from his family that he took his car to the dealer multiple times to report that it veered toward that same barrier while using Autopilot, and Tesla’s statement on the crash, which seemed to lack empathy for Mr. Huang’s death. But the main focus is: Why did the car crash?

That question can be answered with two videos, uploaded to YouTube and Reddit respectively, that show how Tesla’s Autopilot deals with highway dividers.

Shantanu Joshi drives his Tesla in Autopilot Mode to the same spot where Walter Huang crashed.

The video above is from YouTube user Shantanu Joshi, who recorded his Tesla Model S in Autopilot Mode driving towards the same barrier that Walter Huang crashed into. As shown in the video, the Model S starts to veer left, almost making a beeline for the barrier. Joshi then takes control of the steering and centers the vehicle, even before the car’s alert system warns him to do so. This video alone suggests that Tesla’s Autopilot has a problem with lane dividers that separate one highway from another, or at least with that specific lane divider.

But Joshi’s video is not the only one circulating on the internet that shows Autopilot Mode’s difficulty navigating near lane dividers.

The Reddit thread TeslaMotors contains a video of a Tesla vehicle veering towards the barrier of a lane divider.

The video above is yet another showing Autopilot’s trouble steering near a lane divider. In the video, uploaded by Reddit user “beastpilot,” a dash cam recording shows a car driving along a highway and suddenly veering towards the barrier. The video’s authenticity can be questioned: whether the car is really a Tesla, and whether Autopilot was really engaged. But if you pay close attention at the six-second mark, you can hear the alert the car gives as it drives towards the barrier; the same sound is heard in the previous video.

So the two videos shown above help answer the question: Why did Walter Huang’s Tesla Model X crash? The answer, simply put, is that Tesla’s Autopilot sensors can have a problem with the two white lines that mark a split in the road, such as at a lane divider. The Autopilot system interprets those two white lines as a lane of their own and tries to center itself between them, which is exactly what it is programmed to do. Unfortunately, the system does not realize that the space is not a lane but a lane divider with a barrier.
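To make the failure mode concrete, here is a toy sketch of the lane-centering idea described above. This is purely illustrative, not Tesla’s actual Autopilot code; the function and its inputs are invented for the example. It shows how a rule that simply centers a car between the two nearest white lines can steer straight toward a barrier when the diverging lines of a lane divider are mistaken for a lane.

```python
# Illustrative sketch only: a toy lane-centering rule, NOT Tesla's actual
# Autopilot logic. It demonstrates how "center between the two nearest
# white lines" can fail at a lane divider, where the diverging lines
# around the barrier look like a lane of their own.

def steering_offset(left_line_x, right_line_x, car_x):
    """Return how far the car should move to sit midway between two lines.

    All positions are lateral distances in meters.
    Positive means steer right, negative means steer left.
    """
    lane_center = (left_line_x + right_line_x) / 2
    return lane_center - car_x

# Normal lane: lines at 0 m and 3.7 m, car already centered -> no correction.
print(steering_offset(0.0, 3.7, 1.85))   # 0.0

# Lane divider: the exit lane's line (-1.0 m) and the through lane's line
# (1.0 m) diverge around a barrier at 0 m. Centering between them steers
# the car left, directly toward the barrier.
print(steering_offset(-1.0, 1.0, 1.85))  # -1.85
```

The point of the sketch is that the centering rule is doing exactly what it was told to do; the error lies in treating the divider’s lines as lane markings in the first place.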

Joshua Brown (right) and Stanley Watson pose in front of Brown’s Tesla Model S.

Joshua Brown died instantly on May 7, 2016, in Williston, Florida, after his 2015 Tesla Model S collided with a tractor-trailer while Autopilot was engaged. Brown, a 40-year-old Tesla enthusiast and technology buff from Ohio, became the first person to die in a car crash involving a partly autonomous vehicle. The crash occurred when the tractor-trailer made a left turn in front of the Tesla at an intersection of a divided highway, and the Tesla struck the trailer and crossed underneath it.

Bird’s-eye view of the crash, indicating the turning tractor-trailer and the Tesla Model S traveling east.

The details behind the crash include Brown’s reliance on the Autopilot feature (he had his hands on the steering wheel for only 25 seconds of the 37-minute drive) and the trailer’s lack of side guards that could have prevented Brown’s car from going underneath it. But again, the main focus is: Why did the car crash?

The National Transportation Safety Board determined that the probable cause of the accident was a combination of the truck driver failing to yield the right of way and the car driver’s overreliance on Tesla’s automation. Another view of the accident is that Autopilot did not detect the oncoming tractor-trailer, most likely because the white trailer blended into the brightly lit sky, making it appear there was no traffic ahead.

Overall, the Joshua Brown accident can be chalked up to an error in Tesla’s Autopilot feature combined with negligence on the driver’s part; a combination of rarities that resulted in his unfortunate death.

Elaine Herzberg

Elaine Herzberg died on March 18, 2018, around 10 p.m. in Tempe, Arizona, after being struck by a Volvo XC90 running Uber’s autonomous driving program while she crossed a five-lane road with her bicycle. Herzberg was the first pedestrian to be killed by an autonomous vehicle. The vehicle, which was operating autonomously with a human safety driver behind the wheel, was traveling at about 40 miles per hour, and the initial investigation indicated that the car did not slow down as it approached Herzberg.

So why did the car’s autonomous features fail and strike Herzberg, even though Uber’s self-driving technology is supposed to detect pedestrians, cyclists and others to prevent crashes? Most likely because the technology cannot yet accurately predict human behavior, making the cause of the crash the interaction between autonomous technology and humans.

Moving Forward

“The reality is there will be mistakes along the way. A hundred or five hundred or a thousand people could lose their lives in accidents like we’ve seen in Arizona.”

Jim Lentz, CEO of Toyota North America, was quoted in a USA Today article, published after Herzberg’s death, discussing the deaths that will inevitably occur as the autonomous car industry grows. Lentz said, “The reality is there will be mistakes along the way. A hundred or five hundred or a thousand people could lose their lives in accidents like we’ve seen in Arizona.” The quote may seem cold and lacking in emotion for the eventual deaths of many people, but Lentz was simply telling the truth. As mentioned before, people die every day just driving to work or running errands, and those accidents are mainly due to human error. With advancements in autonomous cars, human error could be taken out of the equation, potentially saving some 35,000 lives annually.

John Krafcik, CEO of Waymo.

That said, there are companies in the autonomous car industry that have not been involved in any accidents or deaths. One of them is Waymo, the driverless car division of Google’s parent company, Alphabet. Waymo is ahead of almost all of its competitors, to the point where it is allowed to drive selected Arizona residents in self-driving cars with no one in the driver’s seat, even after the Arizona governor suspended autonomous testing. In an interview with CNN Tech, Waymo CEO John Krafcik boasted that the company has put its self-driving technology through 5 million miles of testing on public roads, 5 billion miles of computer simulation, and 20,000 different tests on private roads.

The reason to mention Waymo and its success is that accidents and deaths involving autonomous cars have been minimal, paling in comparison to accidents involving human-controlled vehicles. Krafcik put it best in his interview when he said, “One and a quarter million people die every year [in car crashes]. It’s like 737 crashes every hour of the day. I think our job is to make things as safe as possible and to do our part to improve the situation that we have right now which is clearly not acceptable.”

The future of driving may seem dim right now, but the fact of the matter is that autonomous cars will save lives and make everyday life far easier.