Jennifer Homendy said in September 2021 that Tesla “has clearly misled numerous people to misuse and abuse technology.” The National Transportation Safety Board (NTSB) chair was referring to how the EV maker handles its advanced driver-assistance systems (ADAS): Autopilot and FSD. Both are under investigation by the National Highway Traffic Safety Administration (NHTSA) due to crashes. A new one happened on July 24 in Draper, Utah. One person died.
According to the Utah Highway Patrol (UHP), the crash happened at about 1:10 AM on I-15. A motorcyclist was hit from behind by an unidentified Tesla model. The impact threw the rider from the motorbike, and they were pronounced dead at the scene. The UHP did not release the victim's identity but said there was nothing the rider could have done to prevent the crash. Gephardt Daily took pictures at the collision site, but we could not determine which Tesla hit the rider. The headlights suggest it was a Model 3 or a Model Y.
The Tesla driver, on the other hand, said they did not see the motorcyclist and that they were using Autopilot. UHP Corporal Michael Gordon told KSL that the system is no excuse for the driver not to pay attention to traffic because it still requires human supervision and intervention. Although Tesla now repeats that warning in the owner’s manual and in formal communication, Elon Musk and the company have previously said their cars “pretty much drive themselves.” This practice is called autonowashing: leading people to believe that a vehicle without autonomous driving capabilities can drive itself.
Homendy has also said that Tesla’s strategy of naming its software Autopilot and Full Self-Driving is “misleading and irresponsible.” She believes it generates overreliance on the systems, which has been pointed out as the cause of several crashes involving them, including the first fatal one, on May 7, 2016. The fact that the driver who hit this rider in Utah admits they did not see the victim and that Autopilot was turned on suggests this may also be the case here. A recent tweet from ChinaRider shows how serious this is.
Expect NHTSA to include this fatal crash among the Autopilot incidents it is investigating. There are more than 16 of them at this point involving emergency vehicles alone: the safety regulator upgraded its preliminary evaluation to an engineering analysis, the step that precedes a recall. NHTSA seems to be paying careful attention to automatic emergency braking (AEB), which intervened in only half of the 16 collisions. The safety regulator will certainly want to know whether it worked in this Utah crash.
I’m speechless. The biggest abuse of the Autopilot system I’ve seen so far. @WholeMarsBlog @DirtyTesLa @DrKnowItAll16 @ray4tesla
— ChinaRider (@TeslaChinaRider) July 24, 2022
( https://t.co/pRShAx1RKa ) pic.twitter.com/teNgqmBtK6