Should a Self-Driving Car Kill the Baby or the Grandma? Depends on Where You’re From.

The infamous ‘trolley problem’ was put to millions of people in a global study, revealing how much ethics diverge across cultures

MIT Technology Review


Illustration: Simon Landrein

By Karen Hao

In 2014, researchers at the MIT Media Lab designed an experiment called Moral Machine. The idea was to create a game-like platform that would crowdsource people’s decisions on how self-driving cars should prioritize lives in different variations of the “trolley problem.” In the process, the data generated would provide insight into the collective ethical priorities of different cultures.

The researchers never predicted the experiment’s viral reception. Four years after the platform went live, millions of people in 233 countries and territories have logged 40 million decisions, making it one of the largest studies ever done on global moral preferences.

A new paper published in Nature presents the analysis of that data and reveals how much ethical preferences diverge across cultures, and how those differences track economics and geography.

The classic trolley problem goes like this: You see a runaway trolley speeding down the tracks, about to hit and kill five people. You have access to…
