Ethics and AI

Alli Steen
Published in RE: Write
2 min read · Oct 9, 2016

I just played This Game

Up until recently, I always had this fantasy in the back of my mind that a future of self-driving cars and police bots would make the world a more fair and just place. With an impartial machine to judge between right and wrong, one that wouldn’t be clouded by emotion, things might be okay. But then I watched Psycho-Pass, which I wrote about a few weeks ago, and played Moral Machine earlier today. Moral Machine was really the vehicle that drove home [haha puns!] the idea that maybe there is no such thing as impartial programming, and that our mechanical overlords may be just as flawed as we are.

So, is there such a thing as impartial programming? Can a team or a person writing the algorithms for self-driving cars keep out their biases or opinions? Even if they’re following a set of instructions made by someone else, what about that person’s biases? Self-driving cars are already out there, but as the technology progresses, I wonder if all self-driving cars will have to be programmed to a standard of either “protect the most lives at the potential expense of the passenger” or “protect the passenger at all costs despite how many others may be killed”. I also wonder what kinds of people would choose each programming style and which companies would roll them out. This is assuming there won’t be a single standard, but even a standard would have to prioritize one of those things over the other, and who is to say which one is really better?
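Just to make those two styles concrete, here’s a toy sketch of what the competing priorities might look like written out as code. Everything in it is hypothetical — the policy names, the choose_action function, the way outcomes are counted — real autonomous-vehicle software is nothing like this simple, and I’m not claiming any company does it this way.

```python
# A toy sketch of two hypothetical crash-response policies.
# This is illustration only, not how real self-driving software works;
# it's just the two priorities from above written out literally.

def choose_action(actions, policy):
    """Pick an action from a list of possible outcomes.

    Each action is a dict like:
      {"name": "swerve", "passenger_deaths": 1, "pedestrian_deaths": 0}
    """
    if policy == "protect_most_lives":
        # Minimize total deaths, even at the passenger's expense.
        return min(actions, key=lambda a: a["passenger_deaths"]
                                          + a["pedestrian_deaths"])
    elif policy == "protect_passenger":
        # Minimize passenger deaths first; totals only break ties.
        return min(actions, key=lambda a: (a["passenger_deaths"],
                                           a["pedestrian_deaths"]))
    raise ValueError(f"unknown policy: {policy}")

# Roughly one of Moral Machine's dilemmas: stay the course and hit
# pedestrians, or swerve into a barrier and sacrifice the passenger.
dilemma = [
    {"name": "stay",   "passenger_deaths": 0, "pedestrian_deaths": 3},
    {"name": "swerve", "passenger_deaths": 1, "pedestrian_deaths": 0},
]

print(choose_action(dilemma, "protect_most_lives")["name"])  # swerve
print(choose_action(dilemma, "protect_passenger")["name"])   # stay
```

Even in a toy like this, somebody had to decide what counts and how to score it, which is kind of the whole point: the bias doesn’t disappear, it just moves into the scoring function.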

I brought this topic up with my friend, and this is what he had to say:

“…I think it highlights a lot of flaws with the idea of quantitative morality. Like most people will tell you that the car should injure the fewest amount of people but what if they’re the driver? I think that for the cars to be commercially viable it has to protect the driver at all cost.”- Arthur

So there’s also that to think about: would you buy a car if you knew it might kill you? Most likely, car companies will program in favor of their passengers, but we’ll see.
