
After FSD Almost Causes Head-On Collision, AV Specialist Asks NHTSA to Stop Tesla

FSD Video Shows Near-Head-On Collision Avoided by Human Driver (Photo: YouTube/Philip Koopman)
Tesla was reported to have asked FSD Beta testers to sign NDAs (non-disclosure agreements) committing them to post only videos that were not detrimental to the company. When the story emerged, Tesla said it would stop the practice. The videos that do get posted keep presenting situations that alarm traffic safety experts, such as a near-head-on collision that made Philip Koopman ask NHTSA to act. Again.
The autonomous vehicle safety expert and associate professor at Carnegie Mellon University published a thread on Twitter about it. Koopman started the thread by sharing a clip from a YouTube video. It shows a driver on FSD Beta waiting to turn left into the parking lot of a shopping mall. FSD detects an oncoming second-generation Nissan Rogue and tries to turn left in front of it anyway.

Right after the man avoids the head-on collision, exclaiming “Jesus Christ” and swearing a bit, a woman in the car can be heard laughing nervously. She then says that “it almost killed us.” The man adds that it was “FSD, trying to murder us,” to which she replies: “Oh, my God! That was scary!”

Koopman argues that the excuse that “there have been no FSD crashes” is only valid until a crash happens. That was the same pretext Uber presented before a collision on March 18, 2018, killed Elaine Herzberg, in what is known as the first death involving autonomous driving tech.

According to the AV safety expert, the fact that FSD would turn left despite detecting the oncoming vehicle shows “deep problems with the safety architecture (if there even is one) in these vehicles.”

Koopman’s analysis was that the situation clearly shows the beta software is far from “almost ready,” as Elon Musk and Tesla have been claiming for the past five years. According to the professor, “this is automation that needs to get fixed before it gets let loose on public roads. (It's not like unprotected lefts are a surprise edge case!).”

This is not the first time Koopman has asked NHTSA to take measures to prevent Tesla from testing its beta software with untrained customers on public roads. On September 27, 2021, he co-wrote an article with William H. Widen for “Jurist,” urging the U.S. Department of Transportation to classify FSD Beta as a Level 4 program because of its “actual design intent.”

With that classification, FSD would have to follow the same procedures other autonomous vehicle tech companies must obey. Ironically, Tesla and its fans repeatedly claim that Autopilot and FSD are safer than human drivers. If the people in the clip had trusted that, the footage would show a rather nasty accident instead of divine interjections and the sudden conclusion that “it almost killed us.”



About the author: Gustavo Henrique Ruffo
Motoring writer since 1998, Gustavo wants to write relevant stories about cars and their shift to a sustainable future.

 
