Right after the man avoids the head-on collision, exclaiming “Jesus Christ” and swearing, the clip reveals the voice of a woman in the car, laughing nervously. She then says that “it almost killed us.” The man adds that it was “FSD, trying to murder us,” to which she replies: “Oh, my God! That was scary!”
Koopman argues that the excuse that “there have been no FSD crashes” is only valid until a crash happens. Uber relied on the same pretext before a collision on March 18, 2018, killed Elaine Herzberg. That episode is regarded as the first pedestrian death involving autonomous driving technology.
According to the AV safety expert, the fact that FSD turned left despite detecting the vehicle coming in the opposite direction reveals “deep problems with the safety architecture (if there even is one) in these vehicles.”
Koopman’s analysis is that the situation clearly shows the beta software is far from being “almost ready,” as Elon Musk and Tesla have been claiming for the past five years. In the professor’s words, “this is automation that needs to get fixed before it gets let loose on public roads. (It's not like unprotected lefts are a surprise edge case!).”
This is not the first time Koopman has asked NHTSA to take measures to prevent Tesla from testing its beta software with untrained customers on public roads. On September 27, 2021, he co-wrote an article with William H. Widen for Jurist urging the U.S. Department of Transportation to classify FSD Beta as a Level 4 program because of its “actual design intent.”
With that classification, FSD would have to follow the same procedures other autonomous vehicle companies must obey. Ironically, Tesla and its fans repeatedly claim that Autopilot and FSD are safer than human drivers. If the people in the clip had trusted that claim, the footage would show a rather nasty accident instead of divine interjections and the shaken conclusion that “it almost killed us.”
"There have been no FSD crashes" will be true ... right up until the fatality when it is no longer true.
— Philip Koopman (@PhilKoopman) October 31, 2021
Does anyone not remember Uber ATG test operations making the same argument right up until they killed a pedestrian? @NHTSAgov
h/t @samabuelsamid https://t.co/BCfwNaV4gE
Any responsible test organization would ground the fleet based on this single frame capture: Tesla FSD 10.3.1 automation intentionally turning in front of detected, close opposing traffic. (Note the steering wheel is actually turned left -- it is making the turn.) @NHTSAgov pic.twitter.com/tOhm6R9e3e
— Philip Koopman (@PhilKoopman) November 1, 2021
Tesla Beta Tester and YouTube influencer shares a strategy for pedestrians to opt out of being involuntary human test subjects. pic.twitter.com/YQT4xw6F8f
— Philip Koopman (@PhilKoopman) November 1, 2021