"Misleading and Irresponsible": That’s How NTSB Chair Describes Tesla’s FSD

After Jennifer Homendy became the chair of the NTSB (National Transportation Safety Board), we knew Tesla would come under even more scrutiny. The agency has long urged the company to take a safer approach in its quest for autonomous driving. However, Tesla has mostly ignored it, because the NTSB has no enforcement authority. In an interview with the WSJ (Wall Street Journal), Homendy spared no words about how Tesla names its Level 2 beta software.
Photo: Tesla Model S on Autopilot crashes into patrol car (Laguna Beach Police Department)
The NTSB chair classified “Full Self-Driving” as “misleading and irresponsible.” Although the reasons should be obvious, she meant that the feature is neither full nor self-driving. No current technology is, and some specialists dispute whether there will ever be a Level 5 autonomous vehicle, which would require no intervention and would work anywhere. To make matters worse, Musk keeps telling fans that Tesla vehicles are autonomous, as he did in a recent tweet. To authorities, however, Tesla claims they are only Level 2, which exempts the company from needing test authorizations.

Homendy also said that Tesla “has clearly misled numerous people to misuse and abuse technology.” That’s an evident reference to the multiple accidents involving Autopilot. The NTSB investigated at least one that involved a death. On March 23, 2018, Walter Huang died when his car crashed into a concrete barrier in California. The NTSB’s investigation determined that the driver overrelied on Autopilot: he was playing a game on his smartphone at the time.

That was not the first fatal crash in which Autopilot was involved. On May 7, 2016, Joshua Brown’s Model S drove under a tractor-trailer in Florida. More recently, on May 5, 2021, Steven Michael Hendrickson’s Model 3 crashed into an overturned truck in California.

Apart from these collisions, Tesla’s Autopilot is now being investigated by the NHTSA (National Highway Traffic Safety Administration) over 11 crashes involving emergency vehicles. Soon after the agency announced its probe, another Tesla slammed into a Florida Highway Patrol car in Orlando.

The WSJ interviewed Homendy soon after Elon Musk disclosed that the company would expand FSD to more customers by the end of September. He also said that only customers with a good driving record would get the ADAS (advanced driver-assistance system). However, it is not clear how Tesla will make that determination.

Homendy said that Tesla must address “basic safety issues” before “expanding it to other city streets and other areas.” These are probably the same issues the NTSB said Tesla should take care of back in 2017: adding more safeguards to make it harder for drivers to misuse the ADAS. Tesla was the only company that did not respond to the NTSB about those recommendations.

About the author: Gustavo Henrique Ruffo

A motoring writer since 1998, Gustavo wants to write relevant stories about cars and their shift to a sustainable future.

