“They don’t drive like people. They drive like robots.”

Why self-driving cars get into accidents

#Driving

Tue, Oct 17th, 2017 11:00 by capnasty NEWS

According to the Seattle Times, self-driving cars get rear-ended because they drive in ways human drivers don't expect: robots "obey the letter of the law," while humans "violate the rules in a safe and principled way," causing problems when the two share the road.

What researchers have found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law, unlike humans.

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy.
