A Mercedes Car
Photo by Aaron Huber on Unsplash

Who’s Responsible When Cars Go “Eyes Off”?

Cassandra Burke Robertson
6 min read · Aug 27, 2023

With the launch of its “Drive Pilot” autonomous feature in late 2023, Mercedes-Benz has made history as the first automaker to introduce a Level 3 “eyes off/hands off” car. The car is designed to handle all aspects of driving under specific conditions without human intervention. The feature is scheduled for availability on 2024 S-Class and EQS Sedan models in California and Nevada, with the first units expected to reach customers later this year.

What Happens When Things Go Wrong?

Mercedes-Benz suggests that “well-established legal systems for determining responsibility and liability on roads and highways” will sort out who’s financially responsible for accidents. The practical implications of this claim, however, remain ambiguous.

Mercedes automobile
Photo by Dhiva Krishna on Unsplash

Individual Liability: The Driver’s Seat

In a conventional vehicle, liability for road accidents falls primarily on the driver. The law assesses this under a negligence framework with four elements, all of which must be met to establish liability: a duty of care, breach of that duty, harm caused to another, and resulting damages. Common mistakes like speeding or failing to yield often indicate negligence.

Chart summarizing the levels of automation

In Level 3 autonomous cars, drivers are encouraged to disengage from active monitoring. As Mobileye CEO Amnon Shashua has explained,

When you have an eyes-off system, there is no need to expect a driver to do anything. It means that, say you have a hands-off/eyes-off system designed for a highway ODD. You punch in an address, once you get onto a highway, you can let go — legally. That’s the whole idea of eyes-off. You can go to sleep. You can do whatever you like.

If the person in the driver’s seat has no duty to actively monitor the autonomous system, they cannot be faulted when that system fails. Therefore, it is likely that drivers will be held liable only for misuse of the system, such as overriding safety features or using autonomous mode inappropriately.

Manufacturer Liability: The Automaker’s Role

People may assume that if the driver is not liable for harm, then the manufacturer must be. That is not necessarily true.

For automakers like Mercedes-Benz, liability rules shift from individual negligence to whether the product had a defect — design, manufacturing, or marketing — at the time it was sold. A defect could be a faulty brake system, an airbag that fails to deploy, or even misleading marketing that falsely claims certain safety features.

Even in conventional cars, the question of defect is frequently litigated. No matter what the alleged defect is, it’s not enough just to show that the product failed. To prove that a product was defective, the plaintiff will generally have to establish either that the product was “dangerous beyond an ordinary consumer’s expectation,” or that “the manufacturer did not produce a less dangerous alternative that was economically feasible.”

The notion of a “defect” is likely to be hard to establish for autonomous and semi-autonomous vehicles. It’s possible that, overall, a vehicle with autonomous features can be operated more safely than one driven by an average human driver.

Nonetheless, the general public understands that there are likely to be “edge cases” the autonomous system is unprepared to handle. Just this month, for example, one robotaxi in San Francisco drove into wet concrete and another was hit by a fire truck.

Photo by Connor Betts on Unsplash

Even if an autonomous vehicle causes an accident, it does not automatically imply that the car or the system was “dangerous beyond an ordinary consumer’s expectation.” The existence of edge-case risk is within the scope of consumer expectation. Even with such risk, the vehicle may still be safer than the average human driver.

Nor is it likely that the manufacturer could be faulted for failing to produce “a less dangerous alternative that was economically feasible.” These vehicles employ sophisticated algorithms, machine learning models, and sensor arrays to interact with a dynamic environment. Each iteration of the hardware and software offers tremendous improvement over earlier versions. This rapid development makes it nearly impossible for a plaintiff to show how a manufacturer could have offered a better product at the time of sale.

Photo by Tingey Injury Law Firm on Unsplash

Liability Gaps

Mercedes-Benz claims that “well-established legal systems for determining responsibility and liability on roads and highways” are sufficient to handle increased levels of autonomy.

In a recent law review article, however, I explain that a liability gap is created when the system wasn’t misused by the driver and didn’t meet the legal standard for a “defect” but nevertheless caused an accident. This gap may leave innocent bystanders without recourse, forced to carry their own medical or repair costs.

Is There a Better Way?

One common policy analysis tool is to ask which party is best-situated to avoid the harm — that is, who is the “least-cost avoider”? When it comes to automated vehicles, the manufacturers may be better positioned to bear the costs of accidents arising from the vehicles’ use than are other road users. Pooled risk — like no-fault insurance — may also help ensure that injured individuals do not have to bear the costs alone.

In recent years, scholars have analyzed how liability might be better allocated in accidents involving autonomous vehicles. Each offers an intriguing proposal that policymakers should take note of:

  • Professors William Widen and Philip Koopman have suggested one way to ensure that manufacturers internalize the cost of accidents. They propose the creation of a new legal category: a “computer driver” that would bear responsibility for driving behavior that would be considered negligent if done by an “attentive and unimpaired” human driver. The computer driver’s responsibility would be imputed to the manufacturer.
  • Professor Mark Geistfeld has argued that manufacturers need not bear the entire cost of crashes. Instead, he has recommended federal regulation to ensure a baseline of safety, combined with increased reliance on no-fault insurance to ensure that harms are compensated even without an allocation of fault.
  • Professor Tracy Hresko Pearl has proposed a federal “no-fault victim compensation fund” that could protect road users, avoid unnecessary litigation, and give courts the time and space to develop legal doctrine applicable to autonomous vehicles.

The Road Ahead

As we venture into an era where vehicles with autonomous features like Mercedes-Benz’s “Drive Pilot” increasingly share the road with human drivers, questions of liability and responsibility take on new dimensions. Although Mercedes has suggested that current law is sufficient to address potential issues of liability for Level 3, the legal frameworks that served us well in the past don’t easily address the complexities introduced by these advanced technologies. Despite manufacturers’ assurances, gaps in current laws suggest that revisions are needed to fairly assess both human and machine behavior in accidents.


Cassandra Burke Robertson

John Deaver Drinko—BakerHostetler Professor of Law and Director, Center for Professional Ethics at the Case Western Reserve University School of Law.