The Tesla Model S That Caused a Pileup in San Francisco Had Full Self-Driving Engaged

A Tesla Model S that braked sharply on I-80 in San Francisco last November, triggering an eight-vehicle pileup, had the EV maker’s FSD Beta engaged seconds before the crash, according to data the federal government released Tuesday.
Photo: California Highway Patrol via The Intercept
Caption: Tesla Model S on Full Self-Driving presented an avoidable phantom braking episode that caused an eight-car pileup in San Francisco
Tesla fans swear the Autopilot and Full Self-Driving automated driving systems are better than human drivers. The FSD Beta software is indeed adept at navigating a variety of traffic situations, and it can sometimes behave like an expert driver for hours at a time. This leads people to believe the system is good enough to rely on for everyday “driving.” Then drivers get careless, attention slips, and, at the worst possible moment, they get a brutal reminder that FSD Beta is called “beta” for a reason.

When this happens, the results can be tragic because there often isn’t enough time for the human driver to react to a dangerous situation the FSD Beta software cannot handle. That appears to have been the case in the November pileup on I-80 east of the Bay Bridge in San Francisco. If you recall, a Tesla Model S driving on the interstate moved into the far-left lane and then braked abruptly, leaving drivers coming up from behind little time to react. Eight cars were involved in the crash, and nine people were injured, including a two-year-old child.

Surveillance camera footage from the scene shows there was nothing in front of the Tesla to justify braking. It was a clear case of what people have come to call “phantom braking,” a phenomenon many Tesla drivers have complained about: the Autopilot/FSD system triggers hard braking for no apparent reason, with dire consequences, as happened on Thanksgiving Day. After the crash, the driver said the car had FSD Beta engaged. Ironically, this was just hours after Elon Musk announced FSD Beta would be available to anyone who had paid for the feature.

Although drivers tend to blame FSD for their own mistakes when a crash happens, this time the driver was right, as data released by the federal government on Tuesday confirms. According to the investigation report cited by CNN, the controversial driver-assist software was activated roughly 30 seconds before the crash. The data also shows that the car abruptly slowed to 7 mph (11 kph), a dangerous move in fast-moving traffic.

It’s not certain what causes phantom braking, and Tesla has yet to figure it out and fix it. Tesla has stripped its cars of every sensor except video cameras, which might be the main cause. After all, even humans occasionally fall for optical illusions. Certain conditions, such as a shadow moving quickly across a camera, could trick the system into thinking there is an object in front of the car and trigger hard braking.

The NHTSA is already investigating hundreds of complaints from Tesla drivers, some describing near crashes and fears for their safety. Nevertheless, the agency has not yet taken action against Tesla, and the investigation drags on. Analysts nonetheless expect the findings in the San Francisco crash to prompt the NHTSA to demand a fix. A recall of Tesla’s driver-assist features could be in store, although that is not guaranteed.

About the author: Cristian Agatie

After his childhood dream of becoming a "tractor operator" didn't pan out, Cristian turned to journalism, first in print and later moving to online media. His top interests are electric vehicles and new energy solutions.