"The biggest challenges facing automated cars is blending them into a world in which humans don’t behave by the book."

The problem with Google's self-driving car is human drivers

#Driving

Thu, Sep 3rd, 2015 11:00 by capnasty NEWS

The problem with Google's self-driving car is that it's too safe: when faced with human drivers who don't quite follow the rules by the book, not only does it not know what to do, but its caution can lead to human-caused accidents. As Dmitri Dolgov, head of software for Google’s Self-Driving Car Project, put it, “human drivers needed to be less idiotic.”

Google’s fleet of autonomous test cars is programmed to follow the letter of the law. But it can be tough to get around if you are a stickler for the rules. One Google car, in a test in 2009, couldn’t get through a four-way stop because its sensors kept waiting for other (human) drivers to stop completely and let it go. The human drivers kept inching forward, looking for the advantage — paralyzing Google’s robot.

It is not just a Google issue. Researchers in the fledgling field of autonomous vehicles say that one of the biggest challenges facing automated cars is blending them into a world in which humans don’t behave by the book.
