Tesla Vision is not yet trained to recognize when automatic gates are closing or not fully open. Several Tesla owners discovered this the hard way, with at least one having their car struck by a closing gate as they drove through. It’s a costly mistake that should remind everyone that FSD Beta requires human driver supervision at all times.
Tesla promised that Full Self-Driving software would be able to drive autonomously “from parking lot to parking lot.” Although the system is not there yet, many Tesla owners took that promise literally. Videos of people sleeping behind the wheel of their Teslas were not uncommon, and, sure enough, some bad accidents happened.
Tesla, for its part, makes it clear in its owner’s manual that the FSD software still requires human supervision. Activating the FSD Beta means agreeing to several disclaimers and warnings. But that doesn’t mean humans always behave. The Full Self-Driving software has gotten genuinely good at navigating city streets, making many people let their guard down. However rare, there are situations when the FSD struggles; unfortunately, when this happens, the driver is caught off guard.
Perhaps the best example is the car pileup on the San Francisco Bay Bridge in November 2022, caused by a Tesla Model S that experienced a “phantom braking” event. The car slowed suddenly while driving in the left lane and was hit by the vehicle behind it. The investigation revealed that the car had FSD activated, and the crash happened the same day Tesla opened the FSD Beta program to more vehicles in the U.S. We imagine the driver was unfamiliar with the FSD software and was surprised when it braked; by the time they realized what was happening, it was too late.
Edward Porter (@Future_is_noww) was careless enough to let his Tesla drive on FSD through the automatic gates of a property. A car entering ahead of him had opened the gates, but they started to close while his car was driving through. In this case, we’re not sure a human driver could have done much, considering the gates were installed the wrong way: they open against the flow of traffic, and there’s no blinking light to warn the driver that the gate is closing. Properly installed gates should also have a sensor that prevents them from closing while a car is passing through.
Nevertheless, this is not an isolated case. Whole Mars Catalog (@WholeMarsBlog) ran into a similar issue while driving through an automatic gate. In his case, the gates had just started closing, but the car charged ahead anyway. Luckily, he was quick enough to brake and avoid a collision. It’s good to see responsible drivers staying in control of their cars, even after prolonged periods of flawless FSD driving.
“Really frustrating to record a perfect FSD Beta drive and then right at the end it tries to drive through the gate without realizing its closing,” wrote @WholeMarsBlog.
So be aware that Tesla Full Self-Driving is not always on top of the situation, even when it appears to be. It can drive perfectly 99% of the time and then fail monumentally at the worst possible moment. Happily, this time it was just a fender bender; for others, the outcome has been much worse.
Who is at fault here @WholeMarsBlog ? https://t.co/Wsb9SB1t4B pic.twitter.com/q74SlCo1Ui
— Edward Porter (@Future_is_noww) February 5, 2023
Almost a perfect 0 input drive on Tesla Full Self-Driving Beta 10.69.25.2, but then it tried to drive through an open gate not realizing it was closing right at the end ????$TSLA @elonmusk https://t.co/WeO80cxdw1 pic.twitter.com/0d1HCDRfks
— Whole Mars Catalog (@WholeMarsBlog) February 5, 2023