Researchers Hack Autopilot Software Revealing Massive Potential Backdoor

Oh my god, Elon Musk must be furious right now. After all the unwanted attention his semi-autonomous driving system has had lately after the fatal crash on May 7, this was the last thing he needed.
He must be standing in his office wondering why the hell the researchers from the University of South Carolina and Zhejiang University in China picked a Tesla as their lab rat. Well, he probably knows the answer very well, and that's the flip side of the coin when you're the first to offer a certain type of technology.

Well, there's no point in lingering over the motives these scientists had in choosing a Tesla; it's much more interesting to note what they found out. During this year's DEF CON hacking conference in Las Vegas, they showed how, using a bag of tricks, it is possible to fool the Autopilot system into believing objects that don't really exist are there, as well as into ignoring ones that sit right in front of it.

Needless to say, this can pose a serious safety threat to anyone using the feature. The Autopilot relies on three types of sensors: radar, video cameras, and ultrasonic sensors. While the researchers attacked all three, the only real threat to the safety of those inside comes from the radar attack, since the radar is what the car uses for high-speed cruising. An extensive report published by Wired said that using just two pieces of equipment - one of which cost $90,000 - the hackers were able to make a cart placed in front of the Tesla "disappear" as far as the car's sensors were concerned.
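For readers curious about why jamming works at all, here's a minimal toy sketch in Python. All the names and numbers are made up for illustration - they have nothing to do with the researchers' actual equipment or Tesla's hardware. The point it makes is simple: a jammer doesn't have to erase the radar echo, it just has to raise the noise floor until the real return no longer stands out.

```python
# Toy model of radar jamming. Illustrative values only -
# not drawn from the research or from any real radar.

NOISE_FLOOR = 1.0      # baseline receiver noise (arbitrary units)
DETECT_RATIO = 3.0     # echo must exceed the noise by this factor to register

def object_detected(echo_amplitude: float, jammer_power: float = 0.0) -> bool:
    """Return True if this toy radar would report an obstacle.

    A jammer adds broadband power on the radar's frequency,
    effectively raising the noise floor and masking real echoes.
    """
    effective_noise = NOISE_FLOOR + jammer_power
    return echo_amplitude >= DETECT_RATIO * effective_noise

# A cart in front of the car produces a clear echo...
print(object_detected(echo_amplitude=5.0))                    # True
# ...until a jammer floods the band and the cart "disappears".
print(object_detected(echo_amplitude=5.0, jammer_power=4.0))  # False
```

Crude as it is, this is the same disappearing act the researchers demonstrated: the obstacle is still physically there, but the sensor can no longer distinguish its return from the noise.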

The good news? Well, there are two pieces of it: first, fully autonomous cars aren't ready to roll out yet, so whatever the hackers did, the driver would still be able to control the car manually - which they're supposed to be prepared to do at any given moment. And second, the jamming equipment is quite expensive, so somebody would have to want to cause a crash more than they want a new car, meaning this danger would probably only apply to important, influential targets. Which, no matter how high our megalomania can get sometimes, we're definitely not.

The fact that Tesla pays hackers to find flaws in its systems is no secret, but this endeavor wasn't backed by the Californian EV maker. However, Tesla did comment on the findings: “We appreciate the work Wenyuan and team put into researching potential attacks on sensors used in the Autopilot system. We have reviewed these results with Wenyuan’s team and have thus far not been able to reproduce any real-world cases that pose risk to Tesla drivers.”
About the author: Vlad Mitrache

"Boy meets car, boy loves car, boy gets journalism degree and starts job writing and editing at a car magazine" - 5/5. (Vlad Mitrache if he was a movie)

