Scary Tesla on Autopilot Crash Shows Overreliance Can Be Fatal

Tesla using Navigate on Autopilot crashed into a disabled Japanese sedan, showing people still rely too much on the ADAS
Photo: via GreenTheOnly
We wrote a while ago that none of the Tesla issues we have covered so far ever seem to get fixed: people just get bored with the repetition and move on to the next problem. That is also the impression Philip Koopman has after checking a video that clearly shows Navigate on Autopilot keeps getting involved in easily avoidable crashes, some of them nearly fatal, such as the one you can see in the tweet below and in our gallery.
The footage was disclosed by the white-hat hacker GreenTheOnly, who recovers these videos from computers he buys on eBay and similar classified ad websites; sometimes, people send him the hardware to study. Sadly, he apparently did not disclose where or when the crash happened.

We just know that what appears to be either a Honda Accord or a Nissan Altima was involved in a previous crash and could not be moved out of its lane. God knows why someone was trying to open the rear right door (hopefully only to pick up personal belongings, not a child in the back seat). The unidentified Tesla hit the rear of the Japanese sedan and miraculously did not hurt or kill the person who was opening that door.

According to GreenTheOnly, the car was on Navigate on Autopilot until about two seconds before the crash. Automatic Emergency Braking (AEB) then activated but did not prevent the collision. When AEB kicked in, it disengaged Navigate on Autopilot, which may allow Tesla to say the system was not technically engaged when the crash occurred.

Philip Koopman shared the video on his LinkedIn page to urge authorities to put an end to this. They should make sure that Tesla vehicles have a proper driver monitoring system in place, that autonowashing is prevented, and that advanced driver assistance systems such as Autopilot and Full Self-Driving “account for reasonably foreseeable misuse.”

In his text, Koopman warns that Tesla fans will blame the driver. However, “casting blame accomplishes nothing” apart from creating another explicit example of a moral crumple zone. As Madeleine Clare Elish defined the concept, it “protects the integrity of the technological system, at the expense of the nearest human operator.”

The Carnegie Mellon professor and autonomous vehicle (AV) safety expert goes even further in his analysis. He warns that “the car is driving and the humans are along for the ride, no matter what disclaimers are in the driver manual and/or warnings.” That is a clear reference to what Tesla does: while it has disclaimers warning that Autopilot and FSD do not make the cars autonomous, the company and Elon Musk have repeatedly said the vehicles drive themselves. Koopman simply agrees with them.

In his opinion, that will keep happening for as long as vehicles lack “a driver monitoring system and engagement model that provide real-world results.” In other words, one that really makes sure the driver has enough time to regain control of the vehicle and prevent crashes such as the one the videos below show.

Koopman warns that the combination of “automated vehicles with a highly problematic approach to safety” is a dangerous situation in which “Tesla is simply the most egregious (example) due to poor driver monitoring quality and scale of deployed fleet.” The company currently has almost 300,000 FSD customers and many more with Navigate on Autopilot. Both systems are under investigation by the National Highway Traffic Safety Administration (NHTSA).


About the author: Gustavo Henrique Ruffo

Motoring writer since 1998, Gustavo wants to write relevant stories about cars and their shift to a sustainable future.