Researchers Trick Tesla’s Autopilot to Change Lanes, Steer Into Traffic

Tesla Model S tricked into driving into traffic by "interference stickers"
Photo: YouTube / Tencent's Keen Security Lab
Tesla recommends that drivers keep their hands on the wheel at all times, even when the vehicle is in Autopilot mode. Despite its name, this mode does not make the car self-driving, the automaker says.
If it did, it would be in a world of trouble, because three small stickers placed directly on the tarmac are enough to steer a Tesla into the opposite lane, directly into oncoming traffic. This “fake-lane” proof-of-concept attack was performed by researchers at Tencent's Keen Security Lab, with the goal of exposing some of the weaker spots of assisted driving and, ultimately, of autonomous cars.

Their findings were published at the end of last month and also show how a Tesla (a Model S, in this particular case) can be tricked into turning on the windshield wipers when it's not raining, and how the car can be steered remotely with a game controller.

For the fake-lane attack, the researchers placed three small white “interference stickers” on the road and showed that they are enough to trick Autopilot's vision module into “seeing” them as legitimate lane markings and, consequently, steering the car into the other lane.

“This kind of attack is simple to deploy, and the materials are easy to obtain,” the researchers write. “Tesla uses a pure computer vision solution for lane recognition, and we found in this attack experiment that the vehicle driving decision is only based on computer vision lane recognition results. Our experiments proved that this architecture has security risks and reverse lane recognition is one of the necessary functions for autonomous driving in non-closed roads.”
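
To make the researchers' point about a “pure computer vision” lane pipeline more concrete, here is a minimal toy sketch in Python with NumPy. It is purely illustrative and has nothing to do with Tesla's actual software: it fits a lane line to whatever bright markings it finds in a synthetic road image, and a few small bright patches are enough to pull the estimated lane sideways.

import numpy as np

def make_road(stickers=()):
    # Synthetic top-down road patch: dark asphalt with one bright lane marking at x = 70.
    img = np.zeros((100, 100))
    img[:, 70] = 1.0
    for y, x in stickers:
        img[y:y + 3, x:x + 3] = 1.0   # a small bright "sticker" on the tarmac
    return img

def estimate_lane(img, thresh=0.5):
    # Fit x = a*y + b to every bright pixel -- the lane decision uses vision alone.
    ys, xs = np.nonzero(img > thresh)
    a, b = np.polyfit(ys, xs, 1)
    return a, b

clean = make_road()
spoofed = make_road(stickers=[(10, 30), (50, 35), (90, 40)])   # three small marks

for name, img in (("clean road", clean), ("with stickers", spoofed)):
    a, b = estimate_lane(img)
    print(f"{name}: estimated lane position at y=50 is x = {a * 50 + b:.1f}")

The point of the sketch is only that a decision based solely on what the camera sees on the road surface can be nudged by small, cheap markings, which is exactly the architectural risk the researchers describe.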

At the time the paper was published, Tesla said that this was “not a real-world concern.” In a statement to BleepingComputer, the automaker stressed that “the physical environment around the vehicle [had been] artificially altered,” which made Autopilot behave abnormally. “There is no reason for a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so.”

About the author: Elena Gorgan

Elena has been writing for a living since 2006 and, as a journalist, she has put her double major in English and Spanish to good use. She covers automotive and mobility topics like cars and bicycles, and she always knows the shows worth watching on Netflix and friends.

 
