Interior of the Mercedes F015 Concept Vehicle

There’s No Legal Protection For Operators Of Autonomous Vehicles. Fortunately, There’s An Easy Fix.

By now, you’ve probably heard that autonomous vehicles will soon be publicly available. Although it could be a few years before a fully autonomous vehicle is ready to hit the streets, that timeline may be shorter than you think. Last week, Tesla Motors announced that it will bring autonomous driving on highways to the Model S this summer via a software update.

The National Highway Traffic Safety Administration (“NHTSA”) has left the task of regulating autonomous vehicles to the states until it promulgates its own regulations. The NHTSA has defined five levels of vehicle automation (Level 0 through Level 4). Level 4 (full self-driving automation) means:

The vehicle is designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip. Such a design anticipates that the driver will provide destination or navigation input, but is not expected to be available for control at any time during the trip. This includes both occupied and unoccupied vehicles. By design, safe operation rests solely on the automated vehicle system.

Thus far, California, Nevada, Florida, Michigan, and the District of Columbia have authorized Level 3 automation. However, for the most part, these states have not reconciled their autonomous vehicle laws with the existing rules of the road (e.g., following too closely, driving under the influence, distracted driving). Florida’s law excluding vehicles in autonomous mode from its ban on texting while driving is one of the few examples of reconciliation.

At the time the NHTSA issued its guidelines, “[it was] not aware of any systems intended for wide scale deployment currently under development for use in motor vehicles that are capable of Level 4 automation.” Since then, Google has unveiled an autonomous vehicle prototype that has no steering wheel or brake pedal. And Mercedes’ F015 prototype features swivel/reclining chairs and 4K screens for viewing media. Clearly, these vehicles were designed with Level 4 automation in mind.

I presented before the Georgia House Study Committee on Autonomous Vehicles in September. I hope my home state of Georgia authorizes Level 4 automation or, at a minimum, passes a comprehensive set of laws reconciling autonomous vehicles with Georgia’s Uniform Rules of the Road. However, that is wishful thinking. Simply put, no legislator wants to be the one who permitted driving under the influence in autonomous mode.

Thus, autonomous vehicle law will likely be shaped by courts. What’s most intriguing is that the federal government essentially controls only how cars are made, not how they are driven. As Elon Musk suggested, the federal government could ban human-driven vehicles and require that all vehicles be autonomous, but the states would be in charge of pretty much everything else.

As you can imagine, a state-by-state scheme for autonomous vehicles would make interstate driving a nightmare. Let’s say you live in a state that permits Level 4 automation, meaning no human operator is necessary (currently, there are no Level 4 states). Operators in a Level 4 state would be responsible only if they tampered with the vehicle’s hardware or software. That means a driverless Uber could pick you up and drop you off at your destination, or you could send your 5-year-old to his grandparents’ for the weekend. However, that driverless Uber or the vehicle carrying a 5-year-old could not cross into a neighboring state that has anything short of Level 4. Thus, non-Level 4 states are the problem. A very big problem.

Since the beginning of time, there have been debates about the spirit of the law (what harm was the law meant to address?) versus the letter of the law (what does the text say?). These ideological differences will be at the core of autonomous vehicle litigation. The following legal battles will occur in non-Level 4 states:

Unmanned driving: No harm in having unmanned vehicles because they were made to be self-driving v. alert human must be present in case something goes wrong

Distracted driving: No need to be focused while in autonomous mode because the vehicle was made to be self-driving v. alert human must be present in case something goes wrong

Driving under the influence: Intoxicated human driver poses no harm in autonomous mode because the vehicle was made to be self-driving v. alert human must be present in case something goes wrong

Notice the trend? I would argue the spirit of the law protects individuals who operate an autonomous vehicle like an autonomous vehicle, but I would not stand a chance against a judge who followed the letter of the law (“If the statute says a human must be able to take over, that’s what the legislators intended.”). And because these legal battles will be over state law, there’s little persuasive value to a favorable opinion in another state.

Thus, the only way to know whether operating an autonomous vehicle is lawful in a non-Level 4 state is to break the law. It is going to take people sending an unmanned Model S down the highway to find out whether it can be done. Ironically, the states that have authorized autonomous vehicles require a human operator, while states that have no autonomous vehicle laws are silent because human operation is the default. Thus, an unmanned Model S could be unlawful in Florida but not in Georgia, because you cannot violate a law that does not exist.

With regard to distracted driving, people will have to make bold pronouncements that they are not paying attention to the road in autonomous mode (e.g., reading a newspaper or taking a nap in the driver’s seat). Issues of distracted driving would primarily come up in states that have no laws authorizing autonomous vehicles.

The most interesting, but least likely, judicial battle will surround driving under the influence. Because autonomous vehicles will maintain their lane, the only practical way to identify an intoxicated driver would be if a non-autonomous vehicle collided with an autonomous vehicle. Driving under the influence is a strict liability crime, so it does not matter if the driver of the autonomous vehicle was not at fault. Assuming the driver established the vehicle was in autonomous mode, the question becomes whether the driver was in control. I was only able to find one DUI case where there was a dispute regarding who was in control of the vehicle.

In Bodner v. State, 752 A.2d 1169, 1172–74 (Del. 2000), the defendant called the police after the vehicle she was in stalled on a railroad track. The defendant acknowledged that she was intoxicated, but claimed someone else from the party had been driving. The Delaware Supreme Court reversed her conviction because the trial court gave an improper instruction on what constitutes actual physical control of a vehicle. The court held that the trial court should give the following instruction on remand:

In considering whether or not the defendant was in physical control of the motor vehicle while under the influence of alcohol, you may consider defendant’s location in or by the vehicle, the location of the ignition keys, whether the defendant had been a passenger in the vehicle before it came to rest, who owned the vehicle, the extent to which the vehicle was operable, and if inoperable, whether the vehicle might have been rendered operable without too much difficulty so as to be a danger to persons or property. You may consider these as well as any other facts or circumstances bearing on whether or not the defendant was then in physical control of a motor vehicle which was or reasonably could become a danger to persons or property while the defendant was under the influence of alcohol.

Thus, pursuant to the Bodner analysis, driving under the influence likely would be unlawful in an autonomous vehicle.

Unlike using civil disobedience to change a law, this appears to be one of the rare times when civil disobedience is necessary merely to clarify the law. That is a dire waste of judicial resources. Every state is going to pass autonomous vehicle legislation in the coming years. Let’s get it right the first time and authorize Level 4 automation.

*Photo credit: Engadget
