As robots become more common, sport ever more sophisticated artificial intelligence, and gain the ability to perform tasks without supervision, Jerry Kaplan of Humans Need Not Apply fame explains the importance of giving them a moral code to abide by — especially if we start treating them as if they were alive.
I’m actually working on a project on this at Stanford. I don’t want robots pushing ladies off the sidewalk as they’re moving — that’s bad. And so that’s a design problem. The sidewalk isn’t designed for robots. We need to program the robot so that it obeys social conventions, gives priority to people, and is able to deal with moral challenges. Not too many people in the field of robot building are thinking about or worrying about this issue.
Think of it in terms of people and animals. Animals will take actions independent of their owners, and you have a certain level of responsibility to control an animal, but it is not as absolute as you might think. Your dog can go bite somebody and your liability is limited to certain kinds of things. They actually have a legal term for this now — it’s called the first bite theory. Once it has bitten somebody, now you are liable if it takes a second bite.