Autopilot Is a Well Designed System with No Possible Exploits. In Bizarro World

To be completely fair to Tesla and Elon Musk, it's probably impossible to develop a software and hardware system that is 100 percent foolproof against all exploits.
Photo: Tesla Model 3 on Autopilot with "driver" in the back seat (Ingineerix/YouTube screenshot)
If it were possible, things like antivirus software and the cybersecurity industry wouldn't need to exist, because our computers would come impenetrable from the factory. Yet not only do they exist, they also require periodic updates just to remain marginally relevant - that is, strong enough to keep the rookie hackers at bay.

It's the same with Autopilot. Tesla does employ a few safeguards that somebody within the company must have deemed sufficient at some point, but time and time again, clips of people abusing the driver assistance suite surface on the web - that's if they don't make the news outright. And when that happens, you immediately think of all the instances that go unnoticed, because not everybody is dumb enough to film themselves committing a traffic violation and post it online.

Well, this particular person didn't do that, but he still ended up on YouTube thanks to another driver who passed by and decided to make the Autopilot abuser famous. Not that he wasn't already, having been featured on Reddit only days before while lying in the back seat and steering with his feet. This second time, he just sat and stared at the person filming him, presumably quite proud of his accomplishment.

The uploader goes out of their way to state that "there is nothing wrong with Tesla AP" and that it's the driver assistance system that's being abused. We agree completely with the second part of that disclaimer, and while it's not Tesla's fault that people go out of their way to find new ways of tricking Autopilot so they can slip out of the driver's seat, it should still be the company's job to make life harder for them.

Some voices say it's no different from putting the car in "D" and placing a rock on the throttle pedal - and are other manufacturers doing anything to prevent owners from doing that? They aren't, but that's because it's not the same thing at all. Everyone in their right mind knows exactly what would happen if they did that, and how quickly the aftermath would follow.

With Autopilot (and more specifically the $10,000 Full Self-Driving suite), owners are told the car can drive on its own and that only regulations are preventing the technology from being fully unlocked. What's more, a driver can activate the system and see for themselves that it works, gaining first-hand confidence in it. The more they do it, the less weight Tesla's message urging them to always remain alert and ready to resume control carries.

All things considered, we'd like to know this: why don't we see clips of people doing something similar in cars from other brands? Is their technology really that far behind? We sincerely doubt that, especially out on highways, where traffic is a lot more predictable. So maybe there are other reasons.

Maybe these companies aren't so desperate to be seen as the industry's technology leaders, placing more importance on keeping an unblemished image. Well, maybe "unblemished" is too much to ask of most traditional carmakers (they've all done something regrettable in the past), but at least they're striving not to add any more stains. Instead, they're waiting for legislation to fall into place before presenting their systems (some of which are already pretty advanced) as anything more than advanced driver assistance suites, or talking about full autonomy in any tense other than the future. They'd rather seem behind the curve than let their customers take advantage of features like these.

Whoever says Tesla can't do anything to make stunts like this harder is probably not aware of how contradictory the whole situation is. Are we really to believe that a company employing the same minds who have allegedly cracked the self-driving problem can't come up with a few stronger safeguards to keep its users from abusing the system? Or is this just another case of "all publicity is good publicity"? We'd hate to think Tesla could be so cynical, especially since people can lose their lives as a result - and not just the ones asking for it.

About the author: Vlad Mitrache

"Boy meets car, boy loves car, boy gets journalism degree and starts job writing and editing at a car magazine" - 5/5. (Vlad Mitrache if he was a movie)