Tesla's FSD Beta V11.3.1 Is Not That Safe, Will Try To Run Through Red Lights

Tesla's FSD Beta V11.3.1 will run through red lights (Photo: @WholeMarsBlog via Twitter | Edited)
Tesla has released the FSD Beta V11.3.1 to a wider group of beta testers, claiming that it solves the issues that prompted the voluntary "OTA recall." The first videos show that the software has become more human-like, which in certain situations includes picking up humans' bad driving habits.
Tesla had stepped back from its bullish self-driving claims when it admitted that the FSD software is just a Level 2 driver assist system. This upset many Tesla fans, although it was probably the wisest thing to do. Tesla also paused software deployment and worked to address the issues NHTSA raised in an ongoing investigation. The EV maker still called it a voluntary recall, not fully agreeing with the agency, and promised to fix the problems via an OTA update.

The update arrived for a select group of FSD Beta testers sooner than expected, prompting people to question whether Tesla has been serious about fixing the software's problems. NHTSA wanted the FSD Beta to stop mimicking humans' disregard for traffic rules and instead follow the law to the letter. That means coming to a full stop at stop signs instead of just slowing down, not going straight from a left-turn-only lane, and not traveling through an intersection on a stale yellow light.

As many of those testing the new version uploaded videos on social media, it became clear that the FSD Beta has become better and smoother. But it has also become more assertive: a video by Chuck Cook showed the FSD Beta cutting in line at a highway exit, with no remorse, just like a real human. Other videos show the software becoming sloppier, calling the OTA recall fix into question.

In a video uploaded by Whole Mars Blog, the software registers a distant green light and disregards the controlling red light, proceeding with a left turn when it shouldn't. Fortunately, no car was crossing through the intersection at the time. Otherwise, it could've ended badly. Even more puzzling is that the Tesla driver didn't notice the FSD's mistake. This shows that the FSD Beta can trick even people accustomed to using it into thinking they are safe when they are not.

This was not the only example of Tesla FSD misbehaving. In another video by Whole Mars Blog, the car repeats the mistake. Other videos show similar errors, and in some cases, the vehicle would not slow for a yellow light, although this is exactly what the software update was supposed to fix. Granted, this software version will probably not make it to the rest of the FSD subscribers and paying customers. Elon Musk admitted as much twice, saying it still needs polishing and another version bump before a wide release.

Nevertheless, we're still not sure all these errors will be ironed out or that no new ones will appear. Tesla had a strong reason for saying the FSD Beta is a Level 2 driver assist: with Level 2, the driver is always responsible if something bad happens. No car can drive autonomously today, and Tesla is making sure people understand that. Or, if they don't, it's entirely their fault.

About the author: Cristian Agatie
After his childhood dream of becoming a "tractor operator" didn't pan out, Cristian turned to journalism, first in print and later moving to online media. His top interests are electric vehicles and new energy solutions.