Driverless Car Technology and the Levels of Automation

Do driverless cars seem like a futuristic dream to you? Science fiction or reality? Something that belongs more in the life of a “techie” or a future ride-sharing company? Well, think again: there are six levels (Level 0 through Level 5) used to describe how automated a self-driving car is. Your current car might be more “self-driving” than you know.

Things were simpler back when automobiles were first sold to the general public. Many early automobiles still resembled horse-drawn carriages; the horse, as engine, was simply removed from the picture. As evidence of that heritage, we still express engine power in terms of “horsepower.” Over time, cars and trucks evolved from vehicles that looked and were built like carriages into vehicles suited to the needs, and speed, of self-powered machines. Design evolution aside, these were still controlled by a human driver.

Autonomous Vehicles Definition

And so, at what point does a car become driverless? Certainly, it is not as simple as swapping a horse for a mechanical engine. Here, one is removing and replacing the brain (or brains, if you count the horse) that intelligently observed, predicted and guided the vehicle.

It is quite an amazing, and almost unbelievable, prospect. And while self-powered cars were quite a leap forward, humanity had decades of slow development that preceded that innovation, and now we are being tasked with accepting a far more complex and challenging development in a short time frame.

To better enable this transition, it was important to break down and define the levels of self-driving automation. While these levels may not matter much to you, as a consumer, when purchasing a vehicle, they are critical to the development of safety standards and to the introduction of laws and regulations.

What Are the Six Levels of Self-Driving Car Automation?

To bring a sense of order to this developing technology, the SAE (Society of Automotive Engineers) formulated a scale of six levels, from 0 to 5, to define just how automated a vehicle is.

LEVEL 0: Literally ZERO automation. At this level the driver is in full control of every aspect of the vehicle. Even an automatic transmission requires input from the driver, such as choosing Park, Reverse or Drive. There is no cruise control or anything of that nature. Just think of any old-school car, be it a Model T Ford or a 1960s Mustang.

LEVEL 1: Some automation enters the picture here–vehicle systems can control speed or steering, sharing control with the driver. A great example is ACC, or Adaptive Cruise Control: the car controls the speed while the driver controls steering and direction, and can take over at any moment. Parking assist is another example of Level 1 automation.

LEVEL 2: This is sometimes called “hands off,” as the car can control steering, acceleration and braking. The driver must still be engaged: Level 2 requires driver intervention when needed, and many systems require the driver to maintain contact with the steering wheel. A number of higher-end cars, from makers such as Tesla, Mercedes, Volvo and Audi, have these advanced assist features. In many of these systems, the car can take over in an emergency, braking earlier than a person would or sensing and avoiding an imminent collision. NOTE: As mentioned earlier, here at Level 2 is where you might find your current car on the scale of driverless tech.

LEVEL 3: Here the car is capable of handling almost all situations and driving circumstances. The car can sense the environment and use that information to perform all functions. However, the driver is still expected and required to be on hand to take over in situations that the car cannot handle. In other words, the car can drive autonomously but requires driver attention and intervention as a fallback.

Here’s where things get interesting. Up to this point, automation required not only that the driver fully observe the environment, but also that the driver be ready to take over as needed. At Level 3, however, the car is in control and the driver can be truly “hands off,” though not yet “eyes off.”

LEVEL 4: Now we enter the “mind off” level of full control by the car. The car can perform all functions, and the driver need not remain attentive but could read a book or fall asleep, leaving the car in full control. However, this level is not complete autonomy, as Level 4 vehicles are restricted as to where they can operate–such as self-driving a truck on a highway but requiring a human driver to take the truck through a town or populated area. In a circumstance where the vehicle cannot execute a task, or enters a restricted area, the vehicle will safely abort the trip and park, awaiting human intervention.

LEVEL 5: This is full and complete automation, marked by the option to physically remove human controls, which is why it is sometimes called the “steering wheel optional” level. At this level there is zero human intervention, and the vehicles may actually have been manufactured without any means for human intervention. A good example would be a fully robotic taxi or transport. In this mode, vehicles are expected to communicate with other vehicles on the road, recognize and properly respond to traffic signals and signs, recognize and avoid pedestrians, and otherwise act in all ways a responsible human driver would under the circumstances, only faster and better. While this may sound somewhat fantastical, much of the underlying technology already exists, pioneered in part by aviation. Achieving Level 5 automation on a large scale is likely years off, and the test fleets already hitting the streets–notably Waymo, which grew out of the Google self-driving car project–currently operate closer to Level 4, within defined areas, than true Level 5.
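The six levels above can be summarized as a simple lookup. The sketch below is purely illustrative (it paraphrases this article, and is not an official SAE data model); the class and function names are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AutomationLevel:
    level: int
    nickname: str          # informal shorthand used in the text
    system_controls: str   # what the vehicle handles
    driver_role: str       # what the human must still do

# Paraphrased from the level descriptions above.
SAE_LEVELS = [
    AutomationLevel(0, "no automation", "nothing", "everything"),
    AutomationLevel(1, "driver assistance", "speed OR steering",
                    "all other tasks; can take over at any moment"),
    AutomationLevel(2, "hands off", "steering, acceleration and braking",
                    "monitor constantly, hands on the wheel"),
    AutomationLevel(3, "hands off, eyes on", "all driving in supported conditions",
                    "take over when the car cannot handle a situation"),
    AutomationLevel(4, "mind off", "all driving within a restricted area",
                    "none while inside the restricted area"),
    AutomationLevel(5, "steering wheel optional", "all driving, everywhere", "none"),
]

def human_fallback_required(level: int) -> bool:
    """Per the text above, Levels 0-3 still need a human ready to
    take over; Levels 4 and 5 do not."""
    return level <= 3
```

For example, `human_fallback_required(3)` returns `True` (the Level 3 driver is the fallback), while `human_fallback_required(4)` returns `False` (the car parks itself and waits instead).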

Should We Slow Down Development to Lower the Risks of a Self-Driving Car Accident?

A very good question that requires a bit of thought.

On one hand, there is a tremendous push to make driverless cars a reality, both for profit and to facilitate sweeping social change. And, as with any new, disruptive technology, there is a race to be first to the finish line and to dominate the emerging market.

On the other hand, there is the question of how strong this push should be and how balanced against the very real concerns for human life and safety.

Should one open the doors to rapid development of driverless technology, acknowledging the risk to human life, or should one slow it down with tight regulations and controls that may, at best, delay development and, at worst, cause the industry to stall, or even suffocate and die?

Interestingly enough, there is a historical example to be found in England, back in the 1830s. At that time, self-powered “steam cars” were gaining popularity and might well have spread across the entire island, were it not for the intervention of special interests, safety concerns and general resistance to new technology. The government was persuaded to pass numerous laws, regulations and steep tariffs that choked the industry, and by 1839 the “steam car” was dead.

What was emerging then as the early dawn of the automobile died an untimely death due to government interference and controls. Where would transportation be today had those early automobiles been given “full steam ahead”?

And yet, with the recent death of pedestrian Elaine Herzberg in Tempe, Arizona, one has to ask: is it worth the risk? Should our streets be used as a laboratory, and our citizens as human guinea pigs, for the development of driverless technology?

It is an old question: does one sacrifice the safety, or lives, of a few for the benefit of the many?

The yearly increases in traffic-related deaths are a fact. The National Highway Traffic Safety Administration (NHTSA) estimated that over 35,000 people were killed in 2015, a sharp upswing from the 32,675 fatalities of 2014. Then in 2016, early NHTSA estimates put fatalities up by roughly another 10% over 2015.

With NHTSA attributing 94% of serious crashes to human error, the argument that “driverless cars could save 300K lives per decade” becomes a strong case for loosening regulations and opening the streets to large-scale testing. This, of course, reopens the age-old philosophical question: is it acceptable to sacrifice the lives of a few to save the lives of the many? If so, whose lives is it acceptable to sacrifice? And is society at large truly the winner?
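A quick back-of-the-envelope check shows where a figure like “300K lives per decade” could come from, using only the NHTSA numbers quoted above. This is illustrative arithmetic, not a rigorous projection (it assumes, unrealistically, that automation eliminates every human-error death):

```python
# Numbers cited in this article.
annual_fatalities_2015 = 35_000   # NHTSA estimate for 2015
human_error_share = 0.94          # NHTSA: share of serious crashes involving human error

# Upper-bound sketch: if every human-error fatality were prevented.
preventable_per_year = annual_fatalities_2015 * human_error_share
preventable_per_decade = preventable_per_year * 10

print(round(preventable_per_year))    # 32900 per year
print(round(preventable_per_decade))  # 329000 per decade, the rough basis for "300K"
```

The claim holds together only as an upper bound; any realistic estimate would be lower, which is exactly why the figure is contested.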

Pittsburgh Mayor Bill Peduto is a strong proponent of bringing driverless cars to our cities, and he is intent upon making Pittsburgh a showcase city for autonomous vehicles. He believes the risks involved are far outweighed by the dangers of human-error accidents. As he put it:

“… progress and risk walk hand-in-hand… When we were testing airplanes, there was a risk. When we were testing automobiles, there was risk. When we’re testing inoculation, there’s risk. It’s inevitable that at some point there was going to be a fatality.”

And Congress seems to agree. In the midst of bitter political division and partisanship, the House and Senate have found a rare spot of bipartisan cooperation and agreement–fast-tracking the development of autonomous vehicles. Earlier this year the House passed the SELF DRIVE Act, and a very similar bill, the AV START Act (American Vision for Safer Transportation through Advancement of Revolutionary Technologies), is currently working its way through the Senate. To speed the technology along, these bills allow multiple safety exemptions for test vehicles (for example, a driverless car does not necessarily need a steering wheel and so could be exempted from the requirement to have one).

In essence, these bills vastly deregulate the introduction of driverless vehicles with the aim of speeding up their development and deployment. Many aspects of the legislation frighten consumer-protection and public-interest groups. For example, one provision strips States of the right to set their own safety standards and regulations, forcing them to accept federal guidelines. Another allows up to 100,000 driverless vehicles to be granted “safety exemptions” so they can hit the roads and accelerate testing, rather than limiting test fleets to smaller numbers.

In a letter to Congress, 21 representatives of public safety and consumer interest groups petitioned for a more balanced bill, one less geared toward deregulation and the rapid deployment of driverless technology.

Who Is Responsible for the Death of Elaine Herzberg? Is It Uber, Whose Car Struck Elaine in Tempe?

The question of responsibility in this case is highly complex. Certainly, in a “driverless” car there is not a driver to blame.

Could Uber be at fault? This seems an obvious choice, but was there negligence or oversight in the rush to bring its technology to market?

One potential issue: Uber was embroiled in a court battle with Waymo, Google’s self-driving spinoff, over allegations that Uber had received and made use of proprietary driverless technology taken by a former employee. Aside from questions about corporate ethics, there is the question of whether the “stolen” technology was complete or missing vital pieces. Critical parts of the disputed technology concerned the sensor systems that allow the vehicle to “see” the environment around it–so perhaps gaps in that technology contributed to Elaine’s death, if the vehicle did not “see” her in the road.

Could fault be found with the State of Arizona, or the City of Tempe, for proactively seeking and encouraging driverless technology companies to come to Arizona as part of an “open for business” strategy? Over the past several years, the State of Arizona has worked hard to cut down on rules and loosen regulations in order to attract driverless tech firms, and it openly encourages the use of its cities for beta testing.

As Henry Jasny, a senior official for a Washington safety advocate group, said, “We are in the Wild West phase of autonomous vehicles, where companies are looking for the state with the least amount of sheriffing going on.”

And what about the driver behind the wheel? The driverless car that struck Elaine did have a safety driver at the wheel as part of the testing protocols. This driver was unable to react in time to prevent the accident and fatality. Tempe Police Chief Sylvia Moir called the accident “unavoidable” after the police investigation of the scene and review of the video captured by the car.

Given that Elaine was crossing well outside the crosswalk, along a section of Mill Avenue where pedestrian crossings are prohibited by posted signage, the conclusion that it was an unavoidable accident may well be supported.

On average, it takes a human being about 1.5 seconds to perceive a danger and react to it. A properly functioning computer can do this much faster. It is uncertain whether a human driver would have seen her crossing mid-street and reacted in time to avoid her, but the onboard camera does show the driver looking down, away from the road, just prior to the collision.
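That 1.5 seconds matters because of how far a car travels before braking even begins. The sketch below computes that distance at a few illustrative speeds (the speeds themselves are assumptions for the example, not details of the Tempe crash):

```python
MPH_TO_MPS = 0.44704  # exact conversion: 1 mph = 0.44704 m/s

def reaction_distance_m(speed_mph: float, reaction_s: float = 1.5) -> float:
    """Distance covered, in metres, during the driver's
    perception-reaction time, before braking starts."""
    return speed_mph * MPH_TO_MPS * reaction_s

for mph in (25, 40, 65):
    print(f"{mph} mph -> {reaction_distance_m(mph):.1f} m before braking starts")
# 25 mph -> 16.8 m, 40 mph -> 26.8 m, 65 mph -> 43.6 m
```

At 40 mph, a human driver covers roughly 27 metres before touching the brake pedal; a computer that reacts in a fraction of that time shortens the gap proportionally, which is the core of the safety argument for automation.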

There is no public information, as yet, giving us the “driver’s” version of events. The driver could claim that the driverless technology had lulled them into complacency, so that they were not on full alert for potential hazards, such as a jaywalking pedestrian. If so, isn’t that very complacency part of the argument for driverless vehicles, knocking the ball back into the tech company’s court?

These are just a handful of the many questions and unresolved layers of responsibility that the industry, in cooperation with state and federal authorities, will need to confront and unravel in the years to come.

One thing is certain: had driverless technology not been involved, Elaine’s death would have been just another statistic among the tens of thousands of fatalities that take place on our roads each year.

Only time will tell whether driverless technology will help us avoid these senseless deaths, most caused by human error, or usher in a new era of fatalities brought on by AI and “gremlins” in the machines.

Another thing is certain: the days of human-controlled transportation will end. It is only a question of when.