
Wednesday, January 18, 2017

For driverless cars, a moral dilemma: Who lives or dies?

Imagine you're behind the wheel when your brakes fail. As you speed toward a crowded crosswalk, you're confronted with an impossible choice: veer right and mow down a large group of elderly people, or veer left into a woman pushing a stroller.

Now imagine you're riding in the back of a self-driving car. How would it decide?

Researchers at the Massachusetts Institute of Technology are asking people worldwide how they think a robot car should handle such life-or-death decisions. Their goal is not just to develop better algorithms and ethical tenets to guide autonomous vehicles, but to understand what it will take for society to accept such vehicles and use them.

Their findings present a dilemma for car makers and governments eager to introduce self-driving vehicles on the promise that they'll be safer than human-controlled cars. People prefer a self-driving car to act in the greater good, sacrificing its passenger if it can save a crowd of pedestrians. They just don't want to get into that car.
