Google’s Self-Driving Cars Are Not So “Self-Drivable” After All, Report Says

Photo: Google
Everyone knows that California is the driverless cars’ Holy Land, and almost every carmaker is testing the technology on the state’s public roads. A 32-page report compiled by the California-based Consumer Watchdog has now revealed more glitches in the system.
Apparently, while traveling on California’s public roads, Google’s test drivers had to intervene 13 times between September 2014 and November 2015 to stop their self-driving cars from crashing.

This is not an isolated case, as six more companies testing the technology also revealed data about their driverless cars not behaving as they should. While the American company plans to build cars without any steering wheel or pedals, the test data makes it pretty clear that it should probably not do that.

John Simpson, privacy project director of Consumer Watchdog, voices this concern: “How can Google propose a car with no steering wheel, brakes or driver? Release of the disengagement report was a positive step, but Google should also make public any video it has of the disengagement incidents, as well as any technical data it collected, so we can fully understand what went wrong as it uses our public roads as its private laboratory.”

The report says that, in the specified period, Google operated its cars in autonomous mode for 424,331 miles, and there were 272 cases in which the cars’ own software detected a failure and the driver had to take control. There were also 69 more events in which the driver took control without being prompted because they perceived a safety threat.

In 13 of these driver-initiated interventions, computer simulations indicated there would have been a crash had the driver not taken control. Two of these cases would have involved hitting a traffic cone, while the other 11 would have been more serious.

The vehicle disengagement reports filed by the other carmakers with the California Department of Motor Vehicles (DMV) show that Tesla’s autonomous cars are the safest, as their drivers reportedly never had to intervene.

Nissan drivers intervened 106 times in 1,485 miles, either to avoid being rear-ended when the car braked too hard or to avoid a crash when it braked too slowly.

Mercedes-Benz drivers intervened 1,051 times in 1,739 miles, and in 59 of these cases drivers took control of the vehicle without being prompted, because they felt uncomfortable with the software’s behavior.

Delphi drivers stepped in 405 times in 16,662 miles. Twenty-eight of these cases were precautionary, because the car detected nearby pedestrians or cyclists, and 212 were due to road markings or traffic lights.

Volkswagen drivers intervened 260 times in 14,945 miles, but the German company didn’t specify the reasons. Bosch drivers intervened 625 times in 935 miles, all of these being planned tests.
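
Because the companies covered very different distances, the raw intervention counts are hard to compare directly. The short Python sketch below is our own back-of-the-envelope illustration, not part of the DMV filings: it normalizes the figures quoted in this article to interventions per 1,000 autonomous miles (Tesla is left out, since its test mileage is not given here).

# Normalize the disengagement figures quoted above to a per-mile rate
# so the companies can be compared on equal footing.

reports = {
    # company: (disengagements, autonomous miles), as quoted in this article
    "Google":        (272 + 69, 424_331),
    "Nissan":        (106, 1_485),
    "Mercedes-Benz": (1_051, 1_739),
    "Delphi":        (405, 16_662),
    "Volkswagen":    (260, 14_945),
    "Bosch":         (625, 935),
}

for company, (disengagements, miles) in reports.items():
    rate = disengagements / miles * 1_000
    print(f"{company:13s} {rate:7.1f} interventions per 1,000 miles")

On these numbers, Google sits below one intervention per 1,000 miles, while Mercedes-Benz and Bosch are in the hundreds, which is why the raw counts alone can be misleading.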

At the end of last year, California regulators proposed a new set of draft rules for self-driving cars, stating, among other things, that a fully licensed driver would have to be behind the wheel of any autonomous car.

John Krafcik, president of Google’s self-driving car project, disagrees with the DMV and says that allowing humans to intervene could make a crash more likely, arguing the car “has to shoulder the whole burden,” as the BBC reports.

Prof. David Bailey, from Birmingham’s Aston Business School, states that fully autonomous cars without manual controls will not hit the streets in the near future: “For a long period, you will see autonomous vehicles and human-driven cars share the road. That makes the situation more complicated, which makes a strong argument for letting people be able to take back control. From the point of view of people’s acceptance and confidence in the technology, that will be needed anyway.”