“They don’t drive like people. They drive like robots.”

Why self-driving cars get into accidents

#Driving

Tue, Oct 17th, 2017 11:00 by capnasty NEWS

According to the Seattle Times, self-driving cars get rear-ended because they drive in ways human drivers don’t expect: while robots "obey the letter of the law," humans "violate the rules in a safe and principled way," a mismatch that causes problems when the two share the road.

What researchers have found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that these vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law, unlike humans.
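The mismatch is easy to picture as two decision rules. Below is a minimal, hypothetical sketch in Python; the function names and thresholds are invented for illustration and are not drawn from any actual autonomous-driving stack:

```python
# Illustrative only: contrasts a letter-of-the-law policy with a human-like
# one at a stop sign. All names and numeric thresholds are hypothetical.

from dataclasses import dataclass


@dataclass
class Gap:
    """Time gap (seconds) to the nearest cross-traffic vehicle."""
    seconds: float


def robot_can_proceed(speed_mps: float, gap: Gap) -> bool:
    """Letter-of-the-law policy: full stop, then go only on a generous gap."""
    fully_stopped = speed_mps == 0.0   # insists on a complete stop
    generous_gap = gap.seconds >= 6.0  # waits for a very safe opening
    return fully_stopped and generous_gap


def human_can_proceed(speed_mps: float, gap: Gap) -> bool:
    """Human-like policy: a slow roll and a tighter gap are acceptable."""
    nearly_stopped = speed_mps < 1.0   # tolerates a rolling stop
    workable_gap = gap.seconds >= 3.0  # accepts a smaller opening
    return nearly_stopped and workable_gap


if __name__ == "__main__":
    # Same situation, different decisions: the robot waits, the human goes,
    # and a human driver behind the robot may not expect the hesitation.
    speed, gap = 0.5, Gap(seconds=4.0)
    print("robot proceeds:", robot_can_proceed(speed, gap))  # False
    print("human proceeds:", human_can_proceed(speed, gap))  # True
```

In the same situation the robot hesitates while the human goes, which is exactly the moment a trailing human driver misjudges and rear-ends the robot.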

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer nuTonomy.

