On the other hand, the Tesla driver said they did not see the motorcyclist and that they were using Autopilot. UHP corporal Michael Gordon told KSL that the system is no excuse for a driver not to pay attention to traffic because it still requires human supervision and intervention. Although Tesla now repeats that in the owner's manual and in formal communication, Elon Musk and the company have previously said their cars "pretty much drive themselves." This practice is called autonowashing: conveying the impression that a vehicle without autonomous driving capabilities can drive itself.
Homendy has also said that Tesla's naming strategy of calling its pieces of software Autopilot and Full Self-Driving is "misleading and irresponsible." She believes it generates overreliance on the systems, which has been pointed out as the cause of several crashes involving them, including the first fatal one, on May 7, 2016. The fact that the driver who hit this rider in Utah admits that they did not see the victim and that Autopilot was turned on suggests this may also be the case here. A recent tweet from ChinaRider shows how serious this is.
Expect NHTSA to include this fatal crash among the Autopilot incidents it is investigating. There are more than 16 of them at this point involving emergency vehicles alone: the safety regulator has upgraded its preliminary evaluation to an engineering analysis, the step that precedes a recall. NHTSA seems to be paying careful attention to automatic emergency braking (AEB), which intervened in only half of the 16 collisions. The safety regulator will certainly want to know whether it worked in this crash in Utah.