Massive Study Shows People Think Autonomous Cars Should Kill Criminals Before Dogs, Among Other Revelations

A game-like survey called the Moral Machine has been polling over two million people from 233 countries and territories over the past two years to get their input on how autonomous vehicles should behave, morally, in life-or-death situations. The results of the massive poll have now been published in the journal Nature, and while many of them are about what you'd expect, there are some surprises.

The Moral Machine is a project of the MIT Media Lab, which made this nice little explanatory video that gives a good idea of the scope of the project and the types of questions being asked:

In some ways, the whole project is a massive expansion of the famous Trolley Problem, which deals with the ethics of deciding whether to sacrifice the people inside a vehicle or the people outside of it.

The Moral Machine presented situations like this for participants to judge:

Here, you have to decide whether to kill a family of five with three children and two adults, or five people, four of whom are elderly. It forces uncomfortable choices, asking you to weigh lives based on factors like gender and age.

Sometimes, it can seem just plain cruel:

Kill a dog or a baby? Come on!

The Moral Machine team, led by Edmond Awad, found that generally people all over the world chose to save humans over animals, youth over age, law-abiders over lawbreakers, and, if possible, more lives over fewer.
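For the curious, here's a rough sketch of what those aggregate preferences might look like if you naively turned them into code. To be clear, this is a made-up toy, not anything the MIT team or any carmaker actually uses, and the weights are completely arbitrary; it just shows how the study's reported preferences (humans over animals, more lives over fewer, young over old, lawful pedestrians over jaywalkers) could be bolted together into a crude "who gets spared" score.

```python
# Hypothetical toy only: encode the study's reported aggregate preferences
# as arbitrary weights and use them to pick which group a car would spare.
from dataclasses import dataclass

@dataclass
class Character:
    species: str   # "human" or "animal"
    age: str       # "young", "adult", or "elderly"
    lawful: bool   # crossing with the light?

def group_score(group: list[Character]) -> float:
    """Higher score = the group survey respondents were more inclined to spare."""
    score = 0.0
    for c in group:
        score += 2.0 if c.species == "human" else 0.5   # humans over animals
        score += 1.0                                     # more lives over fewer
        score += 0.5 if c.age == "young" else 0.0        # youth over age
        score += 0.25 if c.lawful else 0.0               # law-abiders over jaywalkers
    return score

def spare(group_a: list[Character], group_b: list[Character]) -> str:
    """Return which group the toy model would choose to spare."""
    return "A" if group_score(group_a) >= group_score(group_b) else "B"

if __name__ == "__main__":
    stroller = [Character("human", "young", True)]
    cat = [Character("animal", "adult", False)]
    print(spare(stroller, cat))  # prints "A" -- the baby in the stroller wins out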

There were interesting regional differences: Latin American countries valued youth over age, while Asian countries tended to do the opposite, and poorer countries tended to give much less of a shit about jaywalking than richer countries.

There were some other surprises. Look at this chart:

This is basically a chart showing who people are least inclined to kill. It looks like a stroller with a baby gets the top spot, while cats, despite their massive success in internet-distributed videos, are dead last, literally. It’s also worth noting that dogs get preference over human criminals, and if you’re an old, large, or homeless man or woman, you may want to avoid the streets once autonomous cars become common.

Also, so far, no carmaker has announced plans to let their cars know your profession, athletic abilities, or whether or not you have a home.

[Image: global distribution of Moral Machine participants]

In total, almost 40 million individual moral decisions were made in the program, and there are lots of fascinating patterns to be seen in the data. The data itself isn’t enough to establish a flawless, usable moral code for autonomous vehicles, but it’s at least a good start, giving us a sense of what we, globally and generally, want these machines to do.

Hopefully, though, they won’t be constantly choosing whether or not to kill babies or puppies.

About the author

Jason Torchinsky

Senior Editor, Jalopnik • Running: 1973 VW Beetle, 2006 Scion xB, 1990 Nissan Pao, 1991 Yugo GV Plus • Not-so-running: 1973 Reliant Scimitar, 1977 Dodge Tioga RV (also, buy my book!)