“They don’t drive like people. They drive like robots.”

Why self-driving cars get into accidents

#Driving

Tue, Oct 17th, 2017 11:00 by capnasty NEWS

According to the Seattle Times, self-driving cars get rear-ended because they drive really weirdly: while robots "obey the letter of the law," humans "violate the rules in a safe and principled way," which causes problems when the two share the road.

What researchers have found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that these vehicles are overly cautious: they creep out from stop signs after coming to a complete stop and mostly obey the letter of the law, unlike humans.

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy.

