Tesla fans and advocates were still complaining about the Full Self-Driving (FSD) recall when a Model S struck the back of a fire truck on I-680 at 3:54 AM on February 18. The driver died at the scene, and a passenger had to be extricated and taken to the hospital with severe injuries. Four firefighters also needed medical care. Why is this news and not just an ordinary crash? Because this is yet another collision with an emergency vehicle.
Tesla has been under investigation since August 13, 2021, for 11 such crashes that occurred while Navigate on Autopilot was in use. These collisions injured 17 people and killed one. When the preliminary evaluation PE21-020 became the engineering analysis EA22-002 in June 2022, the National Highway Traffic Safety Administration (NHTSA) had already added five more crashes to the case. The FSD recall that NHTSA required Tesla to perform is the first concrete result of that engineering analysis.
Considering the crash happened on a weekend and that the passenger who survived may not yet have spoken to the police, there is no information about whether Autopilot was in use. Although that is not unlikely, we can't rule out other factors that may have played an essential role in the fatal collision.
Have a look at where the collision happened on the map below. You'll see it was impossible not to see the fire truck, parked diagonally across the two fast lanes of I-680, for at least a quarter mile, if not more. A driver in perfect condition would not have crashed into the fire truck, with its emergency lights on and probably other warning measures in place to try to prevent exactly what happened.
Being so early in the morning, the driver could have fallen asleep at the wheel. They could also have been heading home from a party, so the police have to make sure the driver was not under the influence of any substance. More than anything, the investigators must determine whether Autopilot was active before or while the Model S hit the fire truck.
If it was, it would not be the first time a driver trusted the system more than they should. The first three fatalities involving the beta software all involved overreliance: the drivers may have slept or abused alcohol or other substances while believing Autopilot would drive them home safely. No car currently on sale has a true autopilot or is fully self-driving. In other words, you cannot buy an autonomous vehicle nowadays. Elon Musk said people would be able to do so by 2020, but he missed the deadline – not for the first time.
The California Highway Patrol and NHTSA must have already contacted Tesla. The company can verify what happened to the car before the crash. The wreckage has probably also been taken for forensic analysis, and the investigators must have photographed the scene to confirm whether the EV braked before hitting the truck.
The images show damage to the right side of the fire truck, but that must have been a secondary collision. Considering how badly damaged the Model S looks in the pictures taken by the Contra Costa County Fire Protection District, and that the fire truck had to be towed from the road, it is safe to say the crash was the only thing that stopped the electric sedan.
Supposing Tesla manages to prove Autopilot had nothing to do with this, the crash will just be an unfortunate example of whatever else caused it. There are so many warnings about not driving while tired or under the influence that these crashes should not happen anymore. Sadly, warnings are not as effective as they should be.
If Tesla's advanced driver assistance system (ADAS) was on when the crash occurred, this would be the second confirmed death involving a collision with an emergency vehicle – and not much more than that, at least in the short term. Tesla allegedly fixed the issue with the over-the-air (OTA) update 2021.24.12. According to its release notes, the Model 3 and Model Y would automatically reduce their driving speed in such situations and warn the driver about the reduction. It is not clear whether the fix was restricted to those two models. Again, the car that crashed was a Model S.
NHTSA is still investigating the company, and it will not rush its conclusions and actions because of the new case. Several safety specialists have already urged the agency to remove FSD and Autopilot from Tesla's vehicles or to impose the same requirements autonomous driving companies have to meet. So far, NHTSA has not complied with any of those requests. Missy Cummings, who had joined the agency as a senior advisor, left NHTSA in December.
What Tesla should fear most at this point is the reputational damage these cases inflict on its technology. Apart from its investors and die-hard fans, few people still believe the company will ever deliver an autonomous vehicle or that these systems make Tesla's cars safer than its competitors'. The company's Safety Report has already been debunked. Still, new crashes involving the ADAS – especially when people die in them – will probably force the EV maker to spend big money trying to reverse the image erosion these crashes represent. Let's wait for what the investigators have to say.