Moral Machine test – Who should the self-driving car kill?

Autonomous, self-driving cars will have to decide, in the event of an unavoidable accident, whom to protect and whom to sacrifice. Many variables and ethical questions come into play. Who should the car protect? Younger people over older people? Humans over animals? Passengers over pedestrians? Fit people over unfit people? Should it matter whether someone is ignoring the traffic signals?

In this game, you decide who lives in each scenario, and you can also create your own scenarios.

Try it here: http://moralmachine.mit.edu/