NHTSA (National Highway Traffic Safety Administration) allows any vehicle owner to file a complaint about defects in their cars. Possibly for the first time, one of them went to the safety agency's website to report a crash allegedly caused by FSD (Full Self-Driving) Beta.
The unidentified customer appears to live in Brea, California, and owns a Tesla Model Y RWD, though it is unclear whether it is a Standard Range or a Long Range. According to the complaint, they were driving with the Level 2 ADAS (advanced driver assistance system) engaged when, during a left turn, FSD allegedly steered the car into the adjacent lane.
According to the report published on the NHTSA website, the car sounded a warning, but only halfway through the turn. The driver tried to steer the vehicle back into the correct lane, but the car allegedly took control and “forced itself into the incorrect lane(,) creating an unsafe maneuver (and) putting everyone involved at risk.”
The car in the adjacent lane did not have time to avoid the collision, and the Model Y is now reportedly “severely damaged on the driver side.” Without a name or any other reference, such as the location of the crash, we have no way to verify whether the story is true. Even so, it is relevant enough to deserve an article, which may help clarify what happened.
On top of that, NHTSA has said it may remove complaints it is unable to verify. For now, the complaint against FSD Beta remains online. If you know what happened or were directly involved in the crash, please get in touch with us and tell your story with all the necessary details.
This is not the first reported dangerous situation caused by FSD during a left turn. Several videos published on YouTube show cars that allegedly “tried to kill” their drivers and passengers by turning in front of oncoming vehicles despite detecting them. Luckily, none of those incidents resulted in a crash. Based on one of those videos, Philip Koopman asked NHTSA to intervene and prevent Tesla from testing beta software with untrained customers.
If the NHTSA complaint is genuine, it may describe the first crash involving FSD Beta. Tesla fans will argue that anyone using the beta software accepts a disclaimer stating that the driver is responsible at all times, which is true. They may also say that the EV will not fight the driver for control, which is debatable: after all, FSD can steer, brake, and accelerate the car without driver input.
The point here is that this driver considers FSD to be dangerous. If NHTSA confirms the story and agrees with the complaint, it may grant what Koopman and other traffic safety experts have been requesting for ages: that Tesla follow the same rules with FSD that all other Level 4 developers must obey, regardless of how Tesla describes the system to regulators.