Liability and Responsibility for Autonomous Vehicles

Kenny Wolf · Published in Geek Talk · Jun 11, 2024 · 5 min read

Autonomous vehicles may be part of our future and therefore part of our everyday lives.

The promise is great. According to statistics [1], 90% of all accidents are due to human error (excessive speed, alcohol and drug consumption or simply carelessness). Autonomous vehicles promise to minimize or, in some cases, completely eliminate this risk.

However, a number of regulatory and ethical hurdles must first be overcome before autonomous vehicles can become part of everyday life.

We as a society must decide how we want to distribute liability and responsibility between self-driving cars, drivers, car manufacturers and insurance companies.

In this article, I outline some key points and suggestions for a possible future with self-driving vehicles.

Define keywords

In order to have an informed discussion on this topic, the two key terms of the question must first be concretely defined.

In this article, I define them as follows:

Liability

Definition: Liability refers to the legal obligation to compensate for damage or loss. It is about being held financially or legally accountable.

Responsibility

Definition: Responsibility refers to the duty to answer for one’s own actions or the actions of others. It is about moral or ethical obligations.

Liability

Before we get carried away with hypothetical future scenarios, let’s take a look at the current legal situation.

Vienna Agreement

In Switzerland, both national and international laws and standards must be complied with.

According to the Vienna Convention [3], for example, the vehicle must always be under the control of the person driving. This agreement was extended in 2016 to include electronic systems. It stipulates that automated assistance systems can always be overridden or switched off by the person driving.

This speaks against a fully autonomous driving experience.

With the current regulations, there must always be a person at the wheel who is able to override and steer the vehicle. This leads us to another point: the driver’s ability. For a vehicle to be steered correctly, the person at the wheel must have a driving license and the associated skills.

Depending on the level of technology, a special driving license may be required for self-driving vehicles, as they may have a complicated functional interface.

In the case of fully automated vehicles, i.e. without a driver, the legal situation would have to be reviewed.

Swiss law

The current Swiss legal system already has laws and regulations for keeping and driving a motor vehicle.

These clearly set out the liability in various scenarios.

Reference to the legislation on liability for motor vehicles, Art. 58 SVG [2]:

  • If a person is killed or injured or property damage is caused by the operation of a motor vehicle, the keeper is liable for the damage.
  • If a traffic accident is caused by a motor vehicle that is not in use, the keeper is liable if the injured party proves that the keeper or persons for whom he is responsible are at fault, or that the faulty condition of the motor vehicle contributed to the accident.
  • At the discretion of the judge, the keeper is also liable for damages resulting from assistance rendered after accidents involving his motor vehicle, provided he is liable for the accident or the assistance was rendered to himself or the occupants of his vehicle.
  • The keeper is responsible for the fault of the vehicle driver and any assisting persons in the same way as for his own fault.

In my opinion, this legal situation is not sufficient for fully autonomous vehicles. In that case, strict (causal) liability or, at most, the car manufacturer’s product liability would apply.

Blackbox

In the event of breakdowns or accidents, it is difficult to prove fault despite the legal situation, especially in Switzerland, where the presumption of innocence applies.

In order to simplify traceability, automated systems would have to have recording devices (so-called black boxes), which are also used in aviation. These black boxes record in detail which actions were carried out by humans and which by the machine.
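To make this idea concrete, here is a minimal sketch of what such a recording device could look like in software. This is my own illustrative assumption, not an actual automotive standard: an append-only event log that tags each control action with its actor, so an investigation can later separate human from machine actions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical "black box" sketch: an append-only log that records
# whether each control action came from the human driver or the
# automated system. All names here are illustrative assumptions.

@dataclass
class DrivingEvent:
    timestamp: str
    actor: str   # "human" or "machine"
    action: str  # e.g. "steer", "brake", "override"
    detail: str

@dataclass
class BlackBox:
    events: list = field(default_factory=list)

    def record(self, actor: str, action: str, detail: str = "") -> None:
        # Only the two actor types from the article are allowed.
        if actor not in ("human", "machine"):
            raise ValueError("actor must be 'human' or 'machine'")
        self.events.append(DrivingEvent(
            timestamp=datetime.now(timezone.utc).isoformat(),
            actor=actor, action=action, detail=detail))

    def actions_by(self, actor: str) -> list:
        """All logged events for one actor, e.g. for an accident inquiry."""
        return [e for e in self.events if e.actor == actor]

box = BlackBox()
box.record("machine", "steer", "lane keeping active")
box.record("human", "override", "manual braking")
print(len(box.actions_by("human")))   # 1
print(len(box.actions_by("machine"))) # 1
```

A real system would additionally need tamper-proof storage and sensor data, but the core idea is the same: every action carries an actor label.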

This could regulate verifiability in most cases.

Responsibility

Now we come to the second part of this discussion, where we address the ethics and morality of this question.

Ethics Guidelines

There is the hotly debated trolley problem, in which a vehicle with one or more occupants inevitably collides with either pedestrians or another vehicle. An algorithm can calculate the probability of survival for various scenarios and decide who survives based on programmed morality.

Should one life be traded for five?

Should younger people be favored over older ones? These questions turn out to be extremely difficult and emotional. Fortunately, the EU Commission ruled out such criteria in 2020: its published ethics recommendations state that “discrimination based on age, gender or physical condition is prohibited”.

No moral algorithm — Split the Risk

A team of researchers from Munich [1] has programmed an algorithm that is based on the above-mentioned EU guidelines.

This is an elegant approach, as it does not decide on morality/ethics but calculates and shares the risk of all road users in a situation. In this way, the risk is distributed and no decisions are made about the people themselves.

An example from the research team:

When the car approaches an oncoming truck on the highway, it increases its distance from the center line, i.e. moves away from the truck.

If the car wants to overtake a cyclist, the car increases the distance to the bicycle, again away from the bicycle.

If both scenarios come together, the car has various options, and those that avoid “either-or decisions” are evaluated first.

  • The car slows down to let the truck pass first and then overtakes the bicycle.
  • If this is not possible, only the distance to the bicycle is increased, i.e. the car moves closer to the truck, because a collision would be far more dangerous for the cyclist.
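The decision logic above can be sketched in a few lines. This is my own simplification, not the Munich team’s actual algorithm: each maneuver option assigns a collision probability to every road user, weighted by that user’s vulnerability (a cyclist is harmed far more by a collision than a truck), and the planner picks the option with the lowest worst-case individual risk. All weights and probabilities are invented for illustration.

```python
# Vulnerability weights (assumed values): how badly a collision
# would affect each type of road user.
VULNERABILITY = {"car": 1.0, "truck": 0.3, "cyclist": 3.0}

def option_risk(collision_probs: dict) -> float:
    """Worst individual risk of one maneuver option.

    collision_probs maps road-user type -> probability of a collision
    with that user under this maneuver.
    """
    return max(p * VULNERABILITY[user] for user, p in collision_probs.items())

def choose(options: dict) -> str:
    """Pick the maneuver whose worst individual risk is smallest."""
    return min(options, key=lambda name: option_risk(options[name]))

options = {
    # Slow down, let the truck pass first, then overtake the cyclist.
    "wait_then_overtake": {"car": 0.01, "truck": 0.01, "cyclist": 0.01},
    # Overtake now: more distance to the cyclist, less to the truck.
    "overtake_now": {"car": 0.05, "truck": 0.05, "cyclist": 0.02},
}
print(choose(options))  # wait_then_overtake
```

The key design point matches the article: no road user’s life is valued over another’s; only situational risk, scaled by physical vulnerability, enters the calculation.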

Conclusion

We already have a solid legal system in the area of liability, which provides a good basis for future developments in the field of autonomous vehicles.

These will probably be developed further depending on the state of the technology and social aspects. The legal hurdles remain significant, but at least we are not starting from scratch.

In my opinion, there are bigger hurdles when it comes to responsibility. Although the trolley problem is controversial, it cannot be completely ignored; the question of responsibility remains decisive. I think the Munich research team’s approach of leaving ethics and morality out of the equation and deciding only on risk is a step in the right direction. But here too, at the end of the day, risk calculations decide over life and death in the worst-case scenario.

This still leaves open discussions on how exactly these calculations should be carried out and in whose favor.

This article is a discussion paper for the Interaction Design module at the Bern University of Applied Sciences.

References:

[1] Autonomous driving: Should cars be allowed to decide over life and death?

[2] SATW series “AI and law”: Self-driving cars

[3] Steerless — and everything under control? Self-driving vehicles in the Swiss transport system
