
Autonomous Driving Ph.D. Shows Why Tesla FSD Is Not Safe

Logic is a fantastic tool as long as you get your premises right. If you start your reasoning from something completely false, your conclusions will be wrong as well. An excellent example is the Tesla investors who insist that having FSD on public roads is perfectly fine because few crashes have happened so far. Mahmood Hikmet started a video series that shows why that premise is wrong – and why it matters.
Mahmood Hikmet explains why FSD is not safe for public road testing
Photo: Mahmood Hikmet
One of Tesla investors' main arguments that Full Self-Driving is safe is that it has not had any crashes. That was debunked a long time ago: we have counted at least three wrecks involving the beta software so far. The new argument is that nobody has died while it was active. People brandished the same argument about Autopilot until Joshua Brown died using it in 2016. Tesla investors then accused him of being reckless.

Hikmet uses these examples to explain the concept of moral crumple zones. He even created a handy graphic that illustrates the strategy: take credit when something works and blame the customers when the outcome is tragic. However, the most important idea in the video relates to why Tesla should not be testing FSD on public roads with untrained drivers.

Using a graphic created by Philip Koopman, Hikmet shows that the lack of crashes is not an intrinsic quality of FSD. Quite the opposite: drivers have prevented multiple accidents precisely because the software basically sucks. Even die-hard Tesla fans are amazed at how bad it is and simply refrain from using it. Those who insist on trying it often do so alone: their partners refuse to ride with them when the software is active. It is just too dangerous.

As is often said, correlation is not causation. That extra care makes the system seem safe when it is actually the attentive human drivers who make it look good and reliable: if drivers catch nearly every mistake the software makes, the crash statistics measure their vigilance, not the software's competence. The risk is that the software may improve slightly, giving FSD users false hopes about a product for which they paid up to $12,000 with no guarantee they would ever get to use it.

When (and if) FSD gets better, people may actually believe it can make cars drive autonomously. That's the most treacherous situation of all: if people start to trust FSD, they will let it make decisions even when it is clearly not ready to do so. That's what Joshua Brown did with Autopilot back in 2016.

Summing up, the responsible approach would be to test FSD only internally until it is ready to be deployed. It would also be morally advisable to refund the people who paid for FSD. Dropping the promise that it will put 1 million robotaxis on the road would be a natural consequence, but that's another discussion.

The subject is so serious that software engineer Dan O'Dowd has even turned it into a political platform: he is running for the Senate on a promise to end risky software in safety-critical products and infrastructure, and FSD is his primary example. O'Dowd created The Dawn Project and paid for a full-page ad in The New York Times attacking the software before revealing his political aspirations. Mixing the safety concern with those ambitions may weaken the arguments the engineer presents.

Hikmet has no personal goal other than clarifying the situation. He even presents a very apt comparison to help people realize how serious the FSD matter is: just imagine it were a cardiac pacemaker. If people are not okay with trusting their lives to something that still lacks enough testing, there is no need to explain why FSD should not be on public roads: it should be obvious at this point.

About the author: Gustavo Henrique Ruffo

Motoring writer since 1998, Gustavo wants to write relevant stories about cars and their shift to a sustainable future.

 
