Studies Show Flaws in Driverless Cars’ Technologies

Studies reveal that autonomous cars have a crash rate double that of vehicles with human drivers. The technology that is supposed to lead to a world without accidents is, apparently, flawed.
Are those who say that a driverless car is a bad idea because it lacks something fundamental, the human factor, right? Well, in light of the new studies, they apparently are.

The reason self-driving cars recorded such unexpected results is that they obey the law all the time. We are not saying they should break it, or that anyone else should, but they need to be taught to apply the law with judgment, depending on the scenario. And that... well, it’s kind of a human thing.

To be more specific, the problem occurs when a car is merging onto a packed highway where traffic is moving well above the speed limit. Although there have been only minor scrapes so far, the programmers of autonomous cars face a little dilemma: should they teach the cars to bend the rules or to stick to the law?

Raj Rajkumar, co-director of the GM-Carnegie Mellon Autonomous Driving Collaborative Research Lab, chose to have his cars obey the law to the letter.

During a test last year, Rajkumar inadvertently exposed the flaws of this approach. He invited members of the US Congress to ride in a fully autonomous car that worked perfectly until it had to merge onto a congested stretch of road and swing across three lanes of traffic in just 150 yards. The car’s sensors detected the traffic but didn’t know whether to trust the other drivers to make room for the maneuver, so the human had to take control of the situation.

The study also revealed that, in almost all instances, the driverless cars got hit from behind by aggressive drivers who are not used to motorists that actually follow the rules.

Programmers have to overcome another problem as well, a more important one this time: figuring out how to teach the cars to make life-or-death decisions in an accident. Bloomberg offers a telling example: should an autonomous car sacrifice all its occupants by driving off a cliff to avoid hitting a bus full of children? That’s a question most of us don’t know how to answer, let alone self-driving cars.

Google seems to be the first company doing something in this regard: it is working to make its vehicles act more like humans, so they fit naturally into traffic.

Even though the cars have already been programmed to behave in more familiar ways, humans are still surprised by their reflexes, for example when an autonomous car comes to an abrupt stop after sensing a pedestrian on the sidewalk close to it.

Nevertheless, fully autonomous cars still have more to accomplish before they work perfectly and, until then, the human remains the most advanced “machine.”