The Tesla Autopilot Syndrome: "Why Didn't the Car Avoid the Crash for Me?"
With Tesla's FSD Beta out and the first impressions in, people are getting a little overly excited. Sure, the new system appears to be a massive improvement over the regular Autopilot, but does that mean we'll soon be riding lying down on the back seat?

Tesla Model S on Autopilot vs moose
By the looks of it, no. The officials aren't keen on a massive roll-out of the technology on the street, and who can blame them? The decades when intelligent cars will share the road with the dumb ones we currently have are going to be a nightmare, so waiting until the former make up the majority before making Level 5 autonomy legal might actually be the smarter choice.

Plus, the technology needs to be ready, and we mean bulletproof-ready. That's because the moment the authorities give the green light, you won't see any more hands on the steering wheel or drivers paying attention. Hell, you won't even see drivers.

Predictably enough, some people are already doing it to some extent, despite everyone telling them not to. Take Jon Arne Pettersen's case, for example. Jon is a Model S owner from Norway who has the full driver's aid suite from Tesla installed on his vehicle. That includes the best hardware (HW3, as Tesla calls it) and the latest version of Autopilot with the Full Self-Driving package (not to be confused with FSD Beta, which is another thing).

While using the Autopilot on a two-lane road with visible markings, Jon was unlucky enough to have a moose jump in front of his car while doing 80 km/h (roughly 50 mph). The car only clipped the animal's left rear leg, so there was no excessive damage (see picture). The moose, however, had to be put down because its leg was too severely injured.

There are two things wrong with this incident. The first is that Jon doesn't appear to react at all. He says there was not enough time, but we don't buy it. Even if he only saw the wild animal when it got onto the pavement, there was still time to at least hit the brakes or swerve slightly to the right. However, the clip appears to show no reaction until after the crash. What does that tell you? It seems the driver wasn't paying full attention to the road, which is what happens to most people using the Autopilot, probably because they begin to feel a bit redundant.

The second worrisome bit - something we'll call the "Autopilot Syndrome" from now on - is Jon kind of blaming the car for not having done something about it. In the video description, he says: "No help with FSD, AP, and HW3 this time... Strange that my Model S didn't do anything to avoid this situation". The Autopilot is there to assist, not to drive the car, so, if you think about it, the AI would be more entitled to say that about the driver, not the other way around.

On the other hand, that was a moose right in front of the vehicle, so, yeah, you would hope the Autopilot picked up on something that big and moving. The system has a knack for lulling people into a false sense of security before messing up and throwing its hands in the air, saying, "don't look at me, I never said I was perfect!" It's like that kid who brags about how great they are but refuses to take responsibility for any of the bad stuff they do. Nobody likes that kid.

It's not all bad, though. Anyone who has had a similar encounter with a moose will tell you things could have ended a lot worse for Jon, but luckily, he got away just fine. He also turned out to be a great guy and was kind enough to upload the video to YouTube for everyone to see (it had previously been in a private Facebook group).
