Although self-driving cars promise to be "safer, cleaner, and more fuel-efficient than their manual counterparts," they will require some ethical programming in the event of an unavoidable accident: should the car prioritize the lives of its occupants, or should it sacrifice them if doing so minimizes the loss of life among many others? And would people buy a car that could possibly kill them in order to save others?
Here is the nature of the dilemma. Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?
One way to approach this kind of problem is to act in a way that minimizes the loss of life. By this way of thinking, killing one person is better than killing 10.
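The minimize-loss-of-life rule can be sketched as a simple decision procedure. This is a hypothetical illustration, not a real automotive API: the maneuver names and death-toll estimates are invented for the scenario in the text, and a real system would have to reason under deep uncertainty rather than with known counts.

```python
def choose_maneuver(options):
    """Pick the maneuver with the fewest expected fatalities.

    `options` maps a candidate maneuver to its estimated death toll,
    counting occupants and pedestrians alike (a purely utilitarian
    policy makes no distinction between the two).
    """
    return min(options, key=options.get)


# The scenario above: continue into the crowd (10 deaths) or
# swerve into the wall (1 death, the occupant).
decision = choose_maneuver({"continue": 10, "swerve_into_wall": 1})
print(decision)  # a utilitarian policy swerves, sacrificing the occupant
```

Under this rule the car always sacrifices its single occupant to spare the larger group, which is exactly the outcome the next paragraph suggests buyers may refuse to accept.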
But that approach may have other consequences. If fewer people buy self-driving cars because they are programmed to sacrifice their owners, then more people are likely to die because ordinary cars are involved in so many more accidents. The result is a Catch-22 situation.