NHTSA Upgrades Autopilot Investigation to Engineering Analysis, Finds Disturbing Signs

NHTSA updates preliminary evaluation on emergency vehicle crashes to an engineering analysis
Photo: FHP/NHTSA/edited by autoevolution
Tesla became a $1 trillion company based on promises: its revolutionary battery – the 4680 – would give it an edge over its competitors, it would have 1 million robotaxis on the road by 2020, and its cars would be the safest ones ever made. The National Highway Traffic Safety Administration (NHTSA) may have blown up this last promise with a pretty serious discovery about Autopilot.
Remember the investigation into Tesla cars that hit emergency vehicles while using the system? NHTSA has upgraded it from a Preliminary Evaluation (PE) to an Engineering Analysis (EA), the step that precedes a recall. More specifically, PE21-020 turned into EA22-002. To the initial 11 cases that launched the PE, NHTSA added six more crashes and excluded one, ending up with 16. In its detailed summary, the safety regulator says this:

“The agency’s analysis of these sixteen subject first responder and road maintenance vehicle crashes indicated that Forward Collision Warnings (FCW) activated in the majority of incidents immediately prior to impact and that subsequent Automatic Emergency Braking (AEB) intervened in approximately half of the collisions. On average in these crashes, Autopilot aborted vehicle control less than one second prior to the first impact.”

The way NHTSA describes this, Autopilot may have been designed to disengage whenever it senses an imminent impact risk. That would be a way for the system to always present flawless numbers and never take the blame for anything. Elon Musk has frequently said that Autopilot was not turned on when discussing wrecks involving Tesla’s cars. What if it was on until “less than one second prior to the first impact”? What if it aborted vehicle control in these circumstances by design? Multiple outlets asked the EV maker about this, and it did not bother to answer.
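To put that figure in perspective, here is a back-of-the-envelope sketch of how little room a one-second handoff leaves a driver. The speed and reaction-time values below are assumptions for illustration, not numbers from the NHTSA summary:

```python
# Back-of-the-envelope illustration only: the speed and reaction time
# below are assumed values, not figures from NHTSA's EA22-002 summary.
speed_mph = 65                  # assumed highway speed
reaction_time_s = 1.5           # commonly cited driver perception-reaction time
handoff_window_s = 1.0          # "less than one second" per the NHTSA summary

speed_m_per_s = speed_mph * 1609.344 / 3600     # roughly 29 m/s

print(f"Distance covered during the handoff window: "
      f"{speed_m_per_s * handoff_window_s:.0f} m")
print(f"Distance covered before a driver even starts braking: "
      f"{speed_m_per_s * reaction_time_s:.0f} m")
```

With these assumed numbers, the car travels around 29 meters during the handoff and roughly 44 meters before an average driver can even begin to brake – in other words, the warning arrives after the outcome is already decided.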

Tesla Autopilot and FSD
Photo: Tesla
If Tesla configured Autopilot to work like this, it might be comparable to the cheating software Volkswagen used in its EA 189 turbodiesel engines to make them seem less polluting than they really were – you now know this scandal as Dieselgate.

Should this be confirmed, all claims that the system makes cars safer immediately lose credibility – and that credibility was already shaky. Tesla’s safety reports have been under scrutiny for a while: Cade Metz wrote for The New York Times (NYT) on June 8 that the numbers they present are misleading.

He quoted a Virginia Transportation Research Council study that examines this in more detail. According to the researcher Noah Goodall, “much of the crash reduction seen by vehicles using Autopilot appears to be explained by lower crash rates experienced on freeways.” When he corrected for age demographics, the estimated crash rate actually increased by 10% when Autopilot was used. Since some features can only be used in clear weather, that also makes them look safer than they really are. And all of this was before NHTSA realized that Autopilot disengaged in a way that did not give drivers enough time to react.
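A minimal sketch of the confounding effect Goodall describes – the crash rates and mileage splits below are made-up, purely illustrative numbers, not figures from his study. When Autopilot miles are concentrated on freeways, where everyone crashes less, a raw comparison can flatter Autopilot even if it is no safer on any individual road type:

```python
# Hypothetical crash rates (crashes per million miles) and mileage splits,
# chosen only to illustrate the confounding effect; none of these numbers
# come from Goodall's study or from Tesla's safety reports.
rates = {
    "freeway": {"autopilot": 0.5, "manual": 0.5},  # identical rate on freeways
    "surface": {"autopilot": 2.0, "manual": 2.0},  # identical rate on surface streets
}
miles = {  # millions of miles driven; Autopilot miles skew toward freeways
    "freeway": {"autopilot": 9.0, "manual": 4.0},
    "surface": {"autopilot": 1.0, "manual": 6.0},
}

def overall_rate(mode: str) -> float:
    crashes = sum(rates[road][mode] * miles[road][mode] for road in rates)
    return crashes / sum(miles[road][mode] for road in rates)

print(f"Autopilot overall: {overall_rate('autopilot'):.2f} crashes per million miles")
print(f"Manual overall:    {overall_rate('manual'):.2f} crashes per million miles")
# Autopilot looks about twice as safe overall even though, by construction,
# it is no safer than manual driving on any individual road type.
```

With these invented inputs, the raw comparison flatters Autopilot by roughly a factor of two purely because of where it is driven – exactly the kind of distortion Goodall’s correction is meant to remove.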

Tesla Autopilot and FSD
Photo: Tesla
Before NHTSA revealed this, Metz wrote that older technologies such as AEB provide drivers with a safety net, while newer ones use drivers as their “moral crumple zones,” as Philip Koopman describes the situation, borrowing an expression coined by Madeleine Clare Elish. It would have been premonitory if it were not already expected. The shocking bit is that Tesla may have gone beyond the disclaimer saying the driver is in charge at all times.

If Autopilot is programmed to disengage “less than one second” before crashes, it is actively dodging responsibility. Suppose Tesla must report whether the system was active when any given impact happened. Autopilot conveniently shutting down right before allows Tesla to say it wasn’t. Case closed: blame the driver. That worked until very recently, even if previous warnings showed that only the whole picture could tell the truth.

In October 2021, the Netherlands Forensic Institute (NFI) revealed at the European Association for Accident Research that it had decrypted Tesla’s data storage system. That allowed its investigators to retrieve more information about crashes than Tesla was willing to share.

Tesla Crashes Against Emergency Vehicle in Laguna Beach, California
Photo: Laguna Beach Police Department
Curiously, one example it presented of how crucial that was involved a Tesla on Autopilot rear-ending another car. NFI said the driver was attentive and regained control of the vehicle immediately when Autopilot warned them to do so. However, Tesla did not disclose that Autopilot was tailgating the car ahead, something the investigators discovered by decrypting the data. If the system follows a vehicle so closely and disengages when there is no time to do anything but brace for impact, who’s to blame?

NHTSA’s move shows it is no longer willing to let Tesla choose what to say and what to do. The preliminary evaluation also turned up 191 crashes not connected with emergency vehicles. All of them had an Autopilot feature activated, a category in which the safety regulator includes Full Self-Driving. Of these 191 cases, 85 were excluded from the analysis “because of external factors, such as actions of other vehicles, or the available information did not support a definitive assessment.”

In about half of the remaining 106 cases, the driver was not sufficiently responsive. That suggests Tesla does not monitor driver behavior closely enough, something Consumer Reports and other specialists have been warning about for a long time. In a quarter of these events, Autopilot was activated in situations where the company warns it may have limitations, such as roadways other than limited-access highways or low-traction environments. In other words, Tesla would have a strong line of defense in these cases, right? This is what NHTSA has to say about that:

“A driver’s use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect. This is particularly the case if the driver behavior in question is foreseeable in light of the system’s design or operation.”

Tesla Model 3 on Autopilot Crashes Against FHP Patrol Car
Photo: FHP
The irony here is that this is the case only because Autopilot and FSD are labeled as Level 2 driver assistance systems, which demand driver supervision. For NHTSA, “ensuring the system facilitates the driver’s effective performance of this supervisory driving task presents an important safety consideration.” Tesla used this classification to escape the obligations that autonomous driving testing imposes, such as using only trained test drivers and filing regular reports with authorities.

Interestingly, it seems it would not have made a difference if all drivers had followed what Tesla told them to do with Autopilot. Still in the EA22-002 detailed summary, NHTSA stated that only 2 of the 16 drivers involved in crashes with emergency vehicles received driver engagement strategy warnings in the five minutes before impact. For the agency, that suggests all the others were playing by Tesla’s book – which did not prevent them from crashing.

NHTSA concluded that it had enough evidence to turn the PE into an EA. There’s a strong possibility that a correction will require more than a simple software update. If one could fix everything, the PE would have died, and 765,000 Model S, Model X, Model 3, and Model Y units would have been considered perfectly fine. Instead, the safety regulator upgraded it and included 65,000 more EVs in the investigation, for a total of 830,000 vehicles.

That would be Tesla’s worst nightmare. Possible new components and extra work for the already crowded Service Centers would be a blow, but it may be nothing compared to the reputation damage that could follow. Just remember how much Volkswagen had to pay (and is still paying) for Dieselgate.

 Download: EA22-002 Document (PDF)

About the author: Gustavo Henrique Ruffo

Motoring writer since 1998, Gustavo wants to write relevant stories about cars and their shift to a sustainable future.