autoevolution
 

Here's Proof Tesla's FSD Beta 9.0 Street Testing Is a Calamity Waiting to Happen

Tesla Autopilot FSD trip from San Francisco to Los Angeles and back
We can make all sorts of jokes involving Skynet and how Tesla's AI is trying to get rid of its human owner so it can take over the world, but at the end of the day, this is a very serious matter. Literally life and death, actually.
Climbing over a curb in Summon mode or scraping against a bush in FSD (Full Self-Driving) mode can be fun for everyone but the car's owner, but no one will be laughing when the system's erratic behavior causes a high-speed crash with another vehicle. A quick look at just a few of the videos out there will have you wondering by what miracle it hasn't already happened.

However, pointing the finger solely at Tesla and blaming everything on the carmaker isn't exactly fair. At the end of the day, it's the owners who make a conscious decision to activate the system, even though it's clearly not yet ready to operate on busy public roads. So, why are they doing it?

As far as we can tell, for two reasons, maybe three. First, most of them have paid $10,000 for access to FSD, so they naturally want to get their money's worth out of it. That also explains why some of the people we see in these clips are so deluded about the system's performance - they're subconsciously warding off buyer's remorse, refusing to face the fact that FSD doesn't really offer Level 5 autonomy, not by a long shot.

Second, they genuinely love Tesla and technological progress and want to feel part of it. By testing a product that will presumably end up being what it was promised to be, they'll get a sense of achievement if and when that happens.

Finally, it makes them feel like part of something special, an exclusive group of people collectively working toward moving mankind forward. Most of us will never experience that through our work or through anything else we do, so it's a unique opportunity to give your life some purpose in the grand scheme of things.

Add all these together - as well as the promise of playing video games on your commute someday - and what you get is people acting against everything common sense tells them while endangering everyone else around them on the road.

The first clip below shows a Tesla Model 3 attempting an unprotected left turn across a divided road with three lanes of traffic in each direction. By all accounts, that's a difficult maneuver, especially for someone lacking the experience to judge the speed of an approaching vehicle. Until now, that someone would have been a learner driver, but if you think about it, that's precisely what the AI in Tesla's FSD is: a 16-year-old getting to grips with the way everything on the street works.

The tricky part about FSD, though, is that it speaks a totally different language from ours, so not only does it need to learn about stop signs and road markings and whatnot, but we also need to figure out how to communicate with each other. It's not an easy task, and after years of bullish promises, it seems even Elon Musk has come around to admitting that.

So, the Model 3 needs to cross three lanes of traffic before merging into the left-most lane on the other side, all in one (preferably smooth) move. How does it do? Well, the first three times, it just says "nope" and turns right instead, going around the block for another go. The fourth time's the charm, apparently - though not for the driver of that silver SUV, who had to slow down and move to the right side of their lane to avoid T-boning the Tesla as it took its sweet time crossing those three lanes.

OK, now that it's done it, it has learned something and will only get smoother with each attempt. That's what you'd think, but on the fifth attempt, the car is so anxious to go left that it veers into the oncoming lane even before reaching the main road. The driver doesn't intervene, so the Tesla gets halfway across the three lanes and, despite the road being clear, once again says "nope" and chooses the safety of a right turn instead.

We can't know for sure why the vehicle repeated the same behavior from its fifth attempt on the sixth and seventh - maybe because the driver didn't intervene to correct it? - but it is scary to watch. Also, as the driver rightly points out, it gives the car's cameras the worst possible angle on traffic approaching on the main road, so it makes absolutely no sense.

The second video shows another Tesla running FSD Beta 9 as it tries to merge onto a reasonably busy highway - a maneuver a human driver would have no problem pulling off. The system seems to ignore the first gap, between a Jeep Wrangler and an Acura TLX, and aims for the admittedly larger one in front of the Japanese sedan.

However, the cameras fail to pick up on the Chevrolet Equinox that was claiming that spot by switching over from the middle lane, so now the car is left in an awkward position: the merging area is about to end, and it's going way too fast because it was trying to get in front of the TLX. So, what does the system do?

Well, it slows down considerably, blocking half of the on-ramp as cars are forced to undertake on the right to avoid rear-ending the EV. Then, FSD does what it always does when it can't follow the intended route: it reroutes to allow itself another go. At the end of the day, it's not the AI that might be in a hurry, so if the driver doesn't like it, they can grab the wheel at any time and give the learner driver a moment to smoke a cigarette and calm its nerves.

EV drivers in general, and Tesla drivers in particular, aren't exactly the most popular people out on the road, and even though that's for all the wrong reasons, the kind of behavior shown by these FSD Beta 9 testing volunteers isn't going to help.

On top of that, at some point, something bad is going to happen, and even though Tesla won't be legally liable for it, we will all know it bears the moral responsibility for letting regular Joes beta-test a safety-critical system on public roads. Will it hurt Tesla's public image and sales? Probably not, but it's pretty damning for us as a species if that's all we care about anymore.

About the author: Vlad Mitrache

"Boy meets car, boy loves car, boy gets journalism degree and starts job writing and editing at a car magazine" - 5/5. (Vlad Mitrache if he was a movie)