Letting the World Drive Your Car

Oliver Ash
4 min read · Jun 27, 2019


The future, as imagined in a 1957 advertisement (Source: America’s Electric Light and Power Companies ad)

We live in an unprecedented age of automated technology, yet many sense that we are on the brink of a new automation revolution. Self-driving vehicles are widely expected to be the next arrival in this wave, with many predicting that fully automated vehicles will be on the road within the next 1–5 years.

One distinctive characteristic of this new wave of automation is that machines will be making what many regard as human “decisions”. When there is a human in the driver’s seat and an incident occurs, they can tell us a story about why they did what they did, and we can choose to blame or praise them based on our understanding of the events and their account. When the vehicle is driverless, the human responsible is a programmer far away in time and space from the vehicle in motion.

How should the programmer instruct the vehicle? Who should decide how the vehicles are to act?

What would happen if we took a vote?

An example scenario from MIT’s Moral Machine game. (Source: Edmond Awad et al., 2018)

This is the question that MIT’s Moral Machine attempts to answer. On their website, users are faced with a number of scenarios in which they have to choose the best course of action for the car to take. Choices are made between saving people and pets, men and women, old and young.

The results of over 11 million iterations of the game have been compiled into a number of public datasets. The following is what I discovered in my explorations of those results.

According to the world:

Despite some variation between groups, there is a clear average preference for certain characters. Humans (as opposed to pets), the young, women, people with high social status, and those who are ‘fit’ had the highest probability of being saved by players of the game. See the Moral Machine website or the study’s documentation for more about how these attributes are represented by characters in the game.

(Note: in the following visualizations, “Percentage Saved” indicates the percentage of characters with a given attribute that were saved, while the remaining percentage indicates the percentage of characters with the opposite attribute that were saved. For example, in the following chart, East Asian respondents saved women 55.6% of the time, meaning that they saved men 44.4% of the time.)

The chart indicates the percentage of the time characters with a given attribute (see legend) were saved when the user had the choice to save them.
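As an aside, here is a minimal sketch of how a “Percentage Saved” figure like this could be computed from the public responses export. The file name and the `Saved` and `AttributeLevel` columns are assumptions about the dataset’s layout made for illustration, not the study’s documented schema.

```python
import pandas as pd

# Hypothetical layout: one row per decision and character group, with a 0/1
# flag for whether the respondent chose to save that group and a label for
# the attribute being compared (e.g. "Female" vs. "Male").
responses = pd.read_csv("SharedResponses.csv")  # assumed file name

# Percentage of the time each attribute level was saved when it appeared.
pct_saved = (
    responses.groupby("AttributeLevel")["Saved"]
    .mean()
    .mul(100)
    .sort_values(ascending=False)
)
print(pct_saved.round(1))
```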

According to gender:

While on average the two genders agreed about who should be saved, women tended to show a relatively stronger preference for saving women and pets.

Reported gender indicates the gender reported by the user in a form provided after playing the game.

According to education level:

Higher education level is correlated with a lower tendency to save humans over pets and a higher tendency to save those with higher social status.
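The same calculation can be sliced by respondent demographics (reported gender, education level, and so on) by joining the responses to the post-game survey data before grouping. Again, the file names and the `UserID` and `Education` columns below are assumed for illustration.

```python
import pandas as pd

responses = pd.read_csv("SharedResponses.csv")  # assumed file name
survey = pd.read_csv("SurveyData.csv")          # assumed file name, one row per user

# Attach each respondent's self-reported education level to their decisions,
# then compute the percentage saved per attribute within each education group.
merged = responses.merge(survey[["UserID", "Education"]], on="UserID", how="inner")

by_education = (
    merged.groupby(["Education", "AttributeLevel"])["Saved"]
    .mean()
    .mul(100)
    .unstack("AttributeLevel")
)
print(by_education.round(1))
```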

So, who should choose?

Thanks to the data collected by the Moral Machine game, we have a better idea of how different groups might prefer driverless cars to be programmed. Most of these groups broadly agree about the choices as the game’s creators formulated them, but there are some interesting differences in the strength of their preferences. There are still more interesting ethical dilemmas to be explored, and the data contains further insights regarding them.

In the end, we are still left with the question of who should choose how driverless cars should drive. You might not be interested in letting the world choose how your car should drive, but someone has to.

For now, don’t forget you’re still the driver.

I’ll be adding some more detailed charts to my blog.

For more detailed information about the study and its conclusions, see the article published in Nature. The data contains much more information than I could cover here and is worth exploring further for those who have the interest.
