Sometimes, this leads to mass hysteria, like the episode in China a year ago. Fearing that their Teslas might accelerate on their own and cause accidents, people started installing cameras in the footwells of their cars to prove they were actually braking rather than accelerating. Things calmed down after Tesla threatened to sue for defamation, and no sudden unintended acceleration (SUA) event has been reported in China since.
Others have invested far more in trying to prove that Tesla’s automated-driving systems are dangerous and kill people. It’s enough to recall Dan O’Dowd’s campaigns, not to mention the constant stream of negative coverage from people who stand to lose if Tesla gets its Full Self-Driving software right. Tesla, on the other hand, is adamant that Autopilot saves lives and keeps score of how much safer its automated-driving systems are than human drivers.
Nevertheless, Tesla crashes do happen every now and then, as they do every single day with other car brands. A car crash is no longer news unless it involves mass casualties or a Tesla. That explains why emotions ran sky-high in the aftermath of the 2021 crash that killed two people in Spring, Texas. The investigation showed that the Tesla accelerated to 67 mph (108 kph) in a residential area with a 30-mph (48-kph) speed limit. The Model S traveled only about 550 feet (167 meters) before careening off the road at a curve, jumping a curb, and hitting a tree. A fire broke out, leaving the occupants no chance.
The police investigation found nobody in the driver’s seat; the driver was discovered in the back seat. This fueled speculation that the two people were testing Autopilot at the time of the crash. Elon Musk intervened back then and explained that Autopilot could not engage on the type of road where the crash happened. Moreover, the car’s telematics showed that Autopilot wasn’t operating. Media reports nevertheless insisted that Tesla’s automated driving systems were to blame for the crash and the subsequent loss of life.
The National Transportation Safety Board (NTSB) began investigating the crash, and its preliminary findings contradicted the police reports. The Board tried to replicate the conditions before the crash using another Tesla Model S and concluded that Autopilot could not engage on that section of the road, just as Musk had said. The investigation continued, and on February 8, almost two years after the crash, the NTSB ruled out automated driving systems as the cause.
According to the NTSB investigation, the available data showed “no use of the Autopilot system at any time during this ownership period of the vehicle, including the time frame up to the last transmitted timestamp on April 17, 2021.” Another advanced driver assistance system (which could only have been the FSD Beta) was also not in use, primarily because the car’s owner hadn’t purchased it.
Instead, the NTSB cited driver impairment from alcohol intoxication, combined with the effects of two sedating antihistamines, as the cause of the crash. The investigation also showed that “the driver was seated in the driver’s seat at the time of the crash and moved into the rear seat,” although “it was not possible to determine whether the doors were manually operational following the power loss.”