If a self-driving car kills someone, who is guilty?


Self-driving cars are the near future; we all know that. It is exciting that something like this is possible, but technology has a dark side.

Let’s get straight to the point. There is a big ethical question: in some circumstances, who should be killed?

Tunnel problem

You are traveling along a single lane mountain road in an autonomous car that is fast approaching a narrow tunnel. Just before entering the tunnel a child attempts to run across the road but trips in the center of the lane, effectively blocking the entrance to the tunnel. The car has two options: hit and kill the child, or swerve into the wall on either side of the tunnel, thus killing you. How should the car react? (Robohub)

In this situation, both outcomes will certainly result in a death. But whose death? Who is valuable enough that their survival justifies the death of the other? Computers can’t think; they can only follow orders, so if killing someone is necessary to follow the order, they will do it.

Responsibility?

It’s clear that the person in the car will be labeled as more valuable. If that were not the case, the driver’s family could sue the company, and no company will take that risk.

Who should be responsible for the outcome? Who gives the order? The company that makes these cars? The programmers who actually wrote the source code that decides?

Protect me, screw them

I have read a few articles about putting the responsibility on the driver: the person who buys the car should decide who gets protected, the driver or the pedestrian. It’s not an easy decision; if someone is killed, you are guilty of it because you gave the order.

Some people say they would never sit in a car knowing that it treats them as more valuable than others on the road. This is morality, this is ethics, and as humans we can’t run away from our feelings. Who am I to play God? Who am I to decide who should die? But of course, some people will have no problem with that; for them it could be an easy decision: protect me, screw them.

Programmer’s responsibility

It’s the same as in a war: if a soldier kills someone, he can say “It was my job, I just followed orders.” Now imagine a programmer who wrote the if-else condition that decides to go straight and kill a pedestrian. How do you think he will feel, knowing that he wrote the piece of code that actually decides to kill someone?

Well, a programmer could say the same: “It was my task, I just followed the order.” I don’t know how I would feel if I saw the news that a car running my software decided to kill a pedestrian because of a condition in my code. I’m glad I don’t write software for the automotive or military industry.
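Purely as an illustration, and nothing more: this is not real autonomous-vehicle code, and every name in it is invented. It is a minimal sketch of how terrifyingly small such a life-or-death policy could look once it is reduced to a conditional.

```python
# Hypothetical sketch only: invented names, not any real vehicle's code.
# It shows how a policy choice could collapse into a single if-else.

PROTECT_OCCUPANT = True  # a flag someone, somewhere, had to decide on


def choose_maneuver(obstacle_is_person: bool, swerve_kills_occupant: bool) -> str:
    """Return 'brake_straight' or 'swerve' for the tunnel scenario."""
    if not obstacle_is_person:
        # No person in the lane: braking in a straight line is the safe default.
        return "brake_straight"
    if swerve_kills_occupant and PROTECT_OCCUPANT:
        # Going straight sacrifices the pedestrian to protect the occupant.
        return "brake_straight"
    return "swerve"


print(choose_maneuver(obstacle_is_person=True, swerve_kills_occupant=True))
# -> 'brake_straight'
```

A handful of lines like these would carry the entire moral weight this article is about; that is exactly the point.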

Self-driving cars are already much safer than human drivers, and it has been projected that they could eliminate 90% of traffic fatalities. But in situations where someone has to die, who should give the order?

Please answer the question “Tunnel problem: who should die?” in this questionnaire: http://bit.ly/2igJKLL