Although self-driving cars will be "safer, cleaner, and more fuel-efficient than their manual counterparts," they will require some ethical programming for the event of an unavoidable accident: should the car prioritize the lives of its occupants, or should it sacrifice them if doing so would minimize the loss of life among many others? And would people buy a car that might kill them in order to save others?
Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while it is driving you along, an unfortunate chain of events sends the car toward a crowd of 10 people crossing the road. It cannot stop in time, but it can avoid killing the 10 people by steering into a wall. That collision, however, would kill you, the owner and occupant. What should it do?
One way to approach this kind of problem is to act in a way that minimizes the loss of life. By this way of thinking, killing one person is better than killing 10.
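To make that rule concrete, here is a minimal sketch, in Python, of what a "minimize loss of life" policy might look like. Everything in it is a hypothetical illustration: the `Maneuver` type, the casualty estimates, and the `choose_maneuver` function are assumptions for this example, not anything a real vehicle actually uses.

```python
# Hypothetical sketch of a utilitarian "minimize loss of life" policy.
# The maneuvers and casualty figures are illustrative assumptions only,
# not a real vehicle's decision model.
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_deaths: int  # estimated fatalities if this maneuver is taken

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Pick the maneuver with the fewest expected fatalities."""
    return min(options, key=lambda m: m.expected_deaths)

if __name__ == "__main__":
    options = [
        Maneuver("continue into the crowd", expected_deaths=10),
        Maneuver("swerve into the wall", expected_deaths=1),  # kills the occupant
    ]
    print(choose_maneuver(options).name)  # -> "swerve into the wall"
```

Under this rule the car always swerves, because one death is fewer than ten; the entire ethical controversy is hidden inside the single assumption that lives are interchangeable counts to be minimized.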
But that approach may have perverse consequences. If fewer people buy self-driving cars because those cars are programmed to sacrifice their owners, then more people are likely to die overall, because ordinary, human-driven cars are involved in far more accidents. The result is a Catch-22: the programming that saves the most lives in a single crash may cost more lives in aggregate.