#coding – When Coding Decisions Determine Life or Death

The Moral Machine – What Would Uber Do?

Are you familiar with “The Trolley Problem”? It’s an ethical thought experiment built around a central moral dilemma.

An observer must decide whether to throw a switch that diverts a trolley when either choice will harm innocent bystanders. If he intervenes, more lives are saved [practical] – but he has taken an active role in deciding who lives or dies [immoral]. If he leaves the switch alone, he remains morally uninvolved – but more lives are certainly lost. There is no unequivocal “right choice” – especially in variations where the scenario is more complex.

This problem has become relevant again now that coders must decide what an autonomous vehicle will do when faced with multiple lethal options. Should it always choose the action with the fewest potential deaths? Protect a vulnerable cyclist over a rugged SUV? Strictly obey the rules of the road regardless of the consequences? Or always protect its owner/driver above all others?

Human drivers often react reflexively or emotionally. But an autonomous car must be programmed, in advance, with how to react. It’s akin to rewriting Isaac Asimov’s ‘Three Laws of Robotics’ for the era of the self-driving car.
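To make the idea concrete, here is a minimal, hypothetical sketch of what such a pre-programmed policy could look like as code. The maneuver names, casualty estimates, and weights are invented purely for illustration – no carmaker publishes its real decision logic this way – but the structure shows how each of the dilemmas above becomes a number someone has to choose in advance.

```python
# Hypothetical sketch: a rule-based scoring policy for choosing between
# unavoidable-collision maneuvers. All names and weights are illustrative.

from dataclasses import dataclass


@dataclass
class Maneuver:
    name: str
    expected_casualties: float   # estimated deaths/injuries if this maneuver is taken
    harms_vulnerable_user: bool  # a pedestrian or cyclist is in the impact path
    harms_occupant: bool         # the car's own passengers are in the impact path
    breaks_traffic_law: bool     # e.g., crossing a solid line or mounting a curb


def score(m: Maneuver) -> float:
    """Lower is better. Each weight encodes one answer to the dilemmas above."""
    penalty = m.expected_casualties * 10.0               # minimize total harm first
    penalty += 5.0 if m.harms_vulnerable_user else 0.0   # extra protection for cyclists/pedestrians
    penalty += 3.0 if m.harms_occupant else 0.0          # raise this to prioritize the owner instead
    penalty += 1.0 if m.breaks_traffic_law else 0.0      # mild penalty for illegal maneuvers
    return penalty


def choose(maneuvers: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the lowest penalty score."""
    return min(maneuvers, key=score)


if __name__ == "__main__":
    options = [
        Maneuver("brake straight ahead", expected_casualties=2,
                 harms_vulnerable_user=False, harms_occupant=False, breaks_traffic_law=False),
        Maneuver("swerve toward cyclist", expected_casualties=1,
                 harms_vulnerable_user=True, harms_occupant=False, breaks_traffic_law=True),
        Maneuver("swerve into barrier", expected_casualties=1,
                 harms_vulnerable_user=False, harms_occupant=True, breaks_traffic_law=True),
    ]
    print(choose(options).name)
```

The unsettling part isn’t the code – it’s that every constant in that scoring function is a moral judgment someone signed off on long before the crash.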

The brainiacs at MIT have created a software model to test what you would do if faced with these coding/driving dilemmas. Just open the Moral Machine website and click the “Start Judging” button.

Your decisions are analyzed, quantified, and compared to MIT’s growing database of responses. I warn you: Contemplating the consequences of your deadly decisions is very unsettling. But your results will be surprising. And enlightening… Choose wisely and with your heart. Or just call an Uber.
