Apart from wanting to use what they paid up to $12,000 for, Tesla owners defending FSD (Full Self-Driving) also claim that it will make driving safer when fully developed. In a video from Galileo Russell with another Tesla influencer, Omar Qazi, that claim is repeated and immediately refuted when the EV on FSD heads toward a cyclist.
The footage was published on YouTube as “Self-Driving Tesla Podcast w/ Whole Mars,” as Qazi is better known. You can see the incident at around the 25-minute mark, but we’ll spare you the rest of the Tesla promotion by presenting the part of the video that really matters, shared in a tweet by the Twitter account @omedyentral.
In that short passage, Qazi was talking about the second FSD recall, issued because the software performed illegal rolling stops. The Tesla influencer insists that rolling stops are perfectly fine and that people do them all the time, which, in his opinion, makes it perfectly acceptable for FSD to do the same.
Soon after that, he tried to put the FSD recall in a positive light by saying that NHTSA (National Highway Traffic Safety Administration) cannot stop people from doing rolling stops, but Tesla allegedly can: instead of promoting rolling stops, it could automatically prevent them. All that was necessary was a “software update overnight” to “make thousands of people drive safer.”
The Gods of Irony did not resist the opportunity: immediately after Qazi said that, Russell’s Model 3 on FSD headed toward a cyclist. The YouTuber managed to prevent the crash at the last moment. Qazi laughed. Russell immediately asked: “Are we going to have to cut that?” The YouTuber then said something very concerning: “It wouldn’t hit him. It definitely wouldn’t have hit him.” This part is not in the video passage below.
Although Russell said he was confident the car would not have struck the cyclist, he intervened. Tesla influencers and investors may eventually decide to see what FSD does to test their faith in the system. The problem is that any failure will seriously hurt another human being, especially in a system that “may do the wrong thing at the worst time.”
Qazi authorized him to publish the scene, stating: “No, you don’t have to cut it.” That reminds us that Tesla tried to control what FSD users could publish on social media or YouTube about the beta software. It also makes us wonder what was cut to protect the system and the investments these guys confess to having in Tesla stock.
In September, Conor Dalton raised the hypothesis that Tesla is teasing regulators so that they shut down its FSD testing. That would give the company the excuse it needs to blame someone else for never delivering the self-driving cars it has promised to sell since 2015. With these videos and increasingly serious demonstrations of how dangerous the system can be, that hypothesis makes more sense than ever.
@WholeMarsBlog “… and now with a software update, you can actually make thousands of people drive safer”
@Gfilche fuuuuck! are we gonna have to cut that?#FSDBeta $TSLA $TSLAQ pic.twitter.com/J8PMAud6aG
— Jeff Airplane (@omedyentral) February 8, 2022