“They don’t drive like people. They drive like robots.”

Why self-driving cars get into accidents

#Driving

Tue, Oct 17th, 2017 11:00 by capnasty NEWS

According to the Seattle Times, self-driving cars get rear-ended because they drive really weird: while robots "obey the letter of the law," humans "violate the rules in a safe and principled way," causing problems when the two share the road.

What they’ve found is that while the public may most fear a marauding vehicle without a driver behind the wheel, the reality is that the vehicles are overly cautious. They creep out from stop signs after coming to a complete stop and mostly obey the letter of the law — unlike humans.
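To make the mismatch concrete, here is a minimal sketch of the kind of rule-literal stop-sign behavior described above. Every name and threshold in it is hypothetical, invented for illustration; it is not drawn from any actual self-driving stack:

```python
# Hypothetical illustration of "letter of the law" stop-sign behavior.
# All thresholds and names are invented for this sketch; they do not
# come from any real autonomous-driving system.

from dataclasses import dataclass


@dataclass
class Observation:
    speed_mps: float      # ego vehicle speed, in meters per second
    nearest_gap_s: float  # time gap to nearest cross traffic, in seconds


def cautious_stop_sign_policy(obs: Observation, stopped_time_s: float) -> str:
    """Return the next action at a stop sign for a rule-literal robot driver.

    The robot insists on a complete stop held for a moment, then creeps
    forward only when the cross-traffic gap is very large. A human driver
    would typically accept a much smaller gap, which is why the robot's
    hesitation can surprise the driver behind it.
    """
    FULL_STOP_SPEED = 0.05  # m/s: anything faster counts as "still moving"
    MIN_STOP_HOLD = 1.0     # s: hold the full stop at least this long
    SAFE_GAP = 6.0          # s: conservative robot gap; humans take far less

    if obs.speed_mps > FULL_STOP_SPEED:
        return "brake"      # not fully stopped yet
    if stopped_time_s < MIN_STOP_HOLD:
        return "hold"       # complete stop, by the letter of the law
    if obs.nearest_gap_s < SAFE_GAP:
        return "creep"      # inch forward and wait for a bigger gap
    return "go"


if __name__ == "__main__":
    # A 4-second gap: most humans would go; the cautious robot keeps creeping.
    print(cautious_stop_sign_policy(Observation(0.0, 4.0), stopped_time_s=2.0))
```

Under these made-up numbers, the robot waits for a gap roughly twice what a human would accept, which is exactly the sort of hesitation that gets it rear-ended.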

Smoothing out that interaction is one of the most important tasks ahead for developers of the technology, says Karl Iagnemma, chief executive officer of self-driving software developer NuTonomy.
