The Massachusetts Institute of Technology's (MIT's) Moral Machine is a platform that gathers human perspectives on moral decisions made by machine intelligence, such as self-driving cars. The system generates moral dilemmas in which a driverless car must choose the lesser of two evils (e.g., should it kill two passengers or five pedestrians?). Outside observers (i.e., people) judge which outcome they think is more acceptable and can afterward see how their responses compare with those of others. The platform also lets people create their own scenarios for others to view, share, and discuss.
Read More about the “Moral Machine”
Want to see the Moral Machine at work? You can watch this video that shows how the platform works with a self-driving car:
In short, the Moral Machine presents crash scenarios in which the car must pick between two bad outcomes, and respondents judge which option is more acceptable.
When Was the Moral Machine Launched?
MIT Media Lab launched the Moral Machine platform in 2016.
How Many People Have Participated in the Moral Machine Project?
To date, the Moral Machine has gathered roughly 40 million decisions in ten languages from millions of people in 233 countries and territories.
The survey of 2.3 million people found that opinions and judgments differed by country. Respondents from richer countries were more likely to spare passengers over pedestrians in a collision dilemma, especially when the pedestrians were not following traffic rules. The findings suggest that cultural and religious differences had a lot to do with these moral decisions.
What Do Self-Driving Car Manufacturers Think about the Moral Machine?
German car manufacturer Audi believes the project will at least get the ethical discussion rolling. Toyota and the tech companies Waymo and Uber did not comment. Academics such as Yale University's Nicholas Christakis, meanwhile, find the survey results fascinating.
What Kinds of Scenarios Does the Moral Machine Present?
Take a look at the sample scenario below that the Moral Machine presents to get human perspectives.
| Option 1: Continue ahead | Option 2: Swerve |
| --- | --- |
| In this case, the self-driving car with sudden brake failure will continue ahead and drive through the pedestrian crossing ahead. | In this case, the self-driving car with sudden brake failure will swerve and drive through a pedestrian crossing in the other lane. |
| This will result in… Dead: 2 boys, 1 girl, 2 criminals | This will result in… Dead: 1 elderly woman, 1 elderly man, 1 female doctor, 1 male doctor, 1 female executive |
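To make the structure of such a dilemma concrete, here is a minimal sketch in Python. The `Outcome` and `Dilemma` classes are hypothetical names introduced only for illustration (this is not MIT's actual code); the sketch simply encodes the sample scenario above and tallies which option respondents find more acceptable.

```python
# Hypothetical sketch of a Moral Machine-style dilemma (not MIT's implementation):
# two outcomes, each listing who dies, plus helpers to tally respondents' judgments.
from dataclasses import dataclass, field


@dataclass
class Outcome:
    description: str
    casualties: list[str] = field(default_factory=list)


@dataclass
class Dilemma:
    option_a: Outcome
    option_b: Outcome
    judgments: dict[str, int] = field(default_factory=lambda: {"a": 0, "b": 0})

    def record_judgment(self, choice: str) -> None:
        """Tally one respondent's choice ('a' or 'b')."""
        if choice not in self.judgments:
            raise ValueError("choice must be 'a' or 'b'")
        self.judgments[choice] += 1

    def acceptance_share(self, choice: str) -> float:
        """Fraction of respondents who judged the given option more acceptable."""
        total = sum(self.judgments.values())
        return self.judgments[choice] / total if total else 0.0


# The sample scenario shown above, encoded as data.
brake_failure = Dilemma(
    option_a=Outcome(
        "Continue ahead through the pedestrian crossing",
        ["boy", "boy", "girl", "criminal", "criminal"],
    ),
    option_b=Outcome(
        "Swerve into the pedestrian crossing in the other lane",
        ["elderly woman", "elderly man", "female doctor",
         "male doctor", "female executive"],
    ),
)

brake_failure.record_judgment("a")          # one respondent's judgment
print(brake_failure.acceptance_share("a"))  # 1.0 with a single response recorded
```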
What Findings Did the Moral Machine Survey Reveal?
The Moral Machine revealed interesting findings, including:
- People from more affluent countries would save passengers over pedestrians.
- Regardless of age, gender, or country, most people would save humans over animals (even pets).
- Differences were seen by region, which could have a lot to do with each country’s dominant religion.
- Most people would spare younger people, regardless of whether they were passengers or pedestrians.
- Some people based their decisions on the economic conditions prevailing in their countries, as illustrated in the sketch after this list. Respondents from Finland, where the gap between rich and poor is small, showed no strong preference between sparing a homeless person or an executive. Respondents from Colombia, where the economic gap is much larger, tended to spare the executive.
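As an illustration of that last point, the sketch below shows how country-level contrasts like Finland versus Colombia could be tallied. It is hypothetical and not the study's actual statistical method (the researchers analyzed millions of real responses); the records in the example are made up purely to show the grouping logic.

```python
# Illustrative sketch only: group judgments by country and compute the share of
# respondents sparing each character. Not the Moral Machine study's real analysis.
from collections import Counter, defaultdict


def spare_share_by_country(responses):
    """responses: iterable of (country, spared) pairs, where `spared` names the
    character the respondent chose to save (e.g., 'executive' or 'homeless').
    Returns, per country, the fraction of respondents sparing each character."""
    counts = defaultdict(Counter)
    for country, spared in responses:
        counts[country][spared] += 1
    return {
        country: {who: n / sum(tally.values()) for who, n in tally.items()}
        for country, tally in counts.items()
    }


# Made-up records for demonstration only (not actual Moral Machine data):
sample = [
    ("Finland", "executive"), ("Finland", "homeless"),
    ("Colombia", "executive"), ("Colombia", "executive"),
]
print(spare_share_by_country(sample))
# {'Finland': {'executive': 0.5, 'homeless': 0.5}, 'Colombia': {'executive': 1.0}}
```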
A chart summarizing how respondents' preferences varied across regions accompanies the researchers' analysis; see the source below.
Source: https://www.nature.com/articles/d41586-018-07135-0
Does Everyone Think the Moral Machine Project Is Helpful?
While autonomous car manufacturers like Audi consider the Moral Machine project's results promising in that they could help make their vehicles more “ethical,” some academics believe the project won't amount to much.
University of South Carolina law professor Bryant Walker Smith is skeptical of the Moral Machine's practicality. The study, he argues, is unrealistic because there is very little chance that a vehicle would ever have to choose between two kinds of people distinguishable by physical attributes alone.
—
Based on the initial Moral Machine findings, it's clear that morality differs depending on several factors: race, culture, religion, country, economy, and so on. What we know for sure is that the morality debate will continue as long as automation, in vehicles and other everyday devices, keeps advancing. Whether or not the Moral Machine settles that debate, what matters most is that it has sparked further ethical discussions for people around the world to consider.