Lakafossis explained that the SUA events in Tesla vehicles do not happen due to hardware issues. The safeguards Tesla adopted – which led Jason Hughes to state it was impossible for a Tesla to present such incidents – are indeed very effective. While investigating a Tesla Model Y crash in Athens, the engineer received a revealing insight from the driver involved.
The engineer interviewed the Model Y owner while reviewing the video, trying to understand what he did in each situation the images depicted. When Lakafossis reached the part where the brake lights turned on and said the driver had braked, the Tesla owner immediately corrected him: there was "no need" for him to brake because the car always did that on its own. This driver loved that his Model Y took care of braking in his place.
Let's start with the obvious: you cannot see the pedals you step on while driving. Some may think it is enough to know the accelerator pedal is on the right and the brake pedal is on the left. That makes proprioception (or kinesthesia) a central component of automatic pedal pressing. Tactile feedback would also play a role, but most shoes do not let you feel the pedals. The problem is that there are cases in which drivers misapplied the pedals and accelerated their vehicles while they were sure they had stepped on the brakes.
On November 5, 2022, a former truck driver tried to control his Model Y for 2.6 kilometers (1.6 miles) until a crash stopped it. Two people died along the way. Identified only as Zhan, the then-55-year-old driver said he pressed the brake pedal and it did not work. Tesla said he was pressing the accelerator pedal instead.
To explain that, Lakafossis used concepts from neuroscience and Control Theory, such as closed-loop and open-loop controls. Open-loop control does not depend on feedback. When used to describe human activities, it relates to actions you cannot adjust once executed, such as throwing a ball into a basket or a dart at a board: after you release them, there is no way to correct them. Closed-loop control is the opposite: every action depends on feedback, and correction is constant.
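The distinction can be sketched in a few lines of code. This is a minimal illustration of the two control modes, not anything from Lakafossis's report; the function names and numbers are hypothetical, and the point is only the presence or absence of feedback.

```python
# Hypothetical sketch contrasting open-loop and closed-loop control.
# Speeds and forces are arbitrary units; only the structure matters.

def open_loop_stop(speed, brake_force, steps):
    """Open-loop: apply a pre-planned action and never check the result."""
    for _ in range(steps):
        speed -= brake_force          # no feedback: the force was fixed in advance
    return max(speed, 0.0)            # may overshoot or undershoot the target

def closed_loop_stop(speed, target=0.0, gain=0.5, tolerance=0.1):
    """Closed-loop: measure the error at every step and correct continuously."""
    while abs(speed - target) > tolerance:
        error = speed - target        # feedback: compare current state to the goal
        speed -= gain * error         # correction proportional to the error
    return speed                      # converges near the target

print(open_loop_stop(30.0, brake_force=2.0, steps=10))
print(closed_loop_stop(30.0))
```

Braking to park is the closed-loop case: the driver watches the gap shrink and modulates pedal pressure. A driver who expects the car to brake by itself has, in effect, dropped out of the loop.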
In other words, if you are accelerating and have to park, you will naturally move your foot from the accelerator pedal to the brake pedal. If you expect your car to do that for you, you just lift your foot from the accelerator and keep it over the floorpan. Lakafossis believes this is what Tesla vehicles induce their drivers to do. When the car fails to brake on its own, that may generate a pedal misapplication the driver will swear was correct. Worse: they will keep pushing the accelerator pedal, willing the car to stop.
However, Ashok Elluswamy once tweeted that "Autopilot prevents 40 crashes/day where human drivers mistakenly press the accelerator at 100% instead of the brakes." He even presented a video of Autopilot allegedly saving someone's legs from being crushed. Hopefully, Tesla's Autopilot software director asked the people involved for permission to share the images publicly. After all, the BEV maker has been accused of sharing private videos from its customers without their knowledge. I'd ask Tesla if that was the case with Elluswamy's tweet, but the BEV maker does not talk to the press. That is not the main topic of this story, though, only an interesting side note.
The first problem is that the SUA incidents Elluswamy said Autopilot prevents every day could happen with any car from any brand. If that were the case, the engineer argues, we should hear about SUA crashes daily – at least 40 from each brand that does not have Autopilot to avoid them. We don't.
This is not the first time Elluswamy has been candid about issues involving the BEV maker. I've already written about how he said, under oath, that he was not aware of what defines an Operational Design Domain (ODD) and was not familiar with perception-reaction time, two crucial concepts in autonomous driving research. It was in the same deposition in which the Autopilot director confirmed that a 2016 video claiming Autopilot drove a car with no disengagements was staged, but that's also another story.
Lakafossis tested a Model Y in Athens in December 2022. He confirmed the car did not behave in a linear way, with predictable responses in all situations: it would coast, slow down, or brake at random. In February 2023, he installed a VBox data logger in another Tesla and saw again that the vehicle presented different responses even when he drove it around the same block multiple times.
"The actual cause of the problem is the stochastic (nonlinear) nature of automatic braking, or else, the way that a Tesla car decides to brake by itself in response to the road conditions in a non-standard timing and manner. This is what causes confusion, and this is what is very different in Tesla cars in comparison to standard ICE cars or one-pedal regenerative braking in a BEV."
"Allowing reverse to be selected while moving forward is not the actual cause of the problem, but it is a quick and easy way to fix it – by forcing the driver to move their foot toward the brake pedal."
If Tesla adopts a brake-transmission shift interlock (BTSI) – voluntarily or with an NHTSA push – it creates a step in which putting your foot on the brakes is mandatory. That keeps the feedback loop active and reduces the risk of pedal misapplication without forcing Tesla to review the entire Autopilot software to prevent its erratic behavior. Lakafossis thinks a creep mode would also help a lot but said that "this option is grey on the menu of recent cars, which means you cannot select it anymore."
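The interlock logic described above is simple enough to sketch. The following is a hypothetical illustration of the BTSI rule, not Tesla's or any automaker's actual implementation; all names are invented. The idea is that any gear change that could move the car is refused unless the brake pedal is pressed, which forces the driver's foot back onto the brake and restores the feedback loop before the car can move.

```python
# Hypothetical sketch of a brake-transmission shift interlock (BTSI).
# Invented names and logic, for illustration only.

class ShiftInterlockError(Exception):
    """Raised when a shift is requested without the brake pedal pressed."""

def request_shift(requested_gear, brake_pressed):
    """Allow the shift only if the brake pedal confirms the driver's foot position."""
    needs_brake = requested_gear in {"D", "R"}  # gears that can move the car
    if needs_brake and not brake_pressed:
        raise ShiftInterlockError(
            "Press the brake pedal to shift into " + requested_gear
        )
    return requested_gear

# Shifting into Drive with the brake pressed succeeds...
print(request_shift("D", brake_pressed=True))
# ...but selecting Reverse without the brake is refused.
try:
    request_shift("R", brake_pressed=False)
except ShiftInterlockError as error:
    print(error)
```

This is why the engineer calls BTSI "a quick and easy way" to mitigate the problem: it requires no change to how the car decides to brake, only a gate on gear selection.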
"The problem with unexplained SUA accidents is the absolute belief of a young, alert, and competent driver that he was pressing the brake pedal all along. That's what I realized in the minor accident I was asked to investigate, and this is what led me to seek answers in neuroscience and not in engineering."
Tesla products can be the way they currently are. The point is that they did not need to be like this.
"What I don't understand is why Tesla made its cars this way. There is no gain in avoiding BTSI or adopting this random braking feature. It is a decision that puts marketing before safety and which has already cost four lives – at least that I am aware of. One article said I was a Greek engineer trying to make Tesla go bankrupt, but I have no interest in any of that or in the stock market. A simple over-the-air update and two lines of code can solve this once and for all! Would that bankrupt Tesla?"
"If I talked to the Greek transportation department, they would probably just check if everything with homologation was correct and deny a correction request. Discussing this with the European Union would probably lead to similar results. I tried to talk to the lawyer who is dealing with the Paris crash, but she said she had her own investigators and did not want to hear from me. It was then that I learned that anyone could write a petition to the NHTSA to ask for safety corrections in cars, so that's what I did."
The engineer said he has also submitted his work to the Society of Automotive Engineers (SAE) for publication. He believes it can help develop safer cars by putting the behavioral factors involved in SUA under the spotlight.
"The active cruise control recall does not explain the SUA accidents in question. It may prevent a totally different type of accident, but not these particular crashes. If you engage the cruise control by mistake, you will realize that something is not normal, and you will step on the brake pedal. If you have more than two seconds before you hit something, you will probably manage to brake in time."
According to Lakafossis, unless the feedback loop sequence is broken – as it is when TACC is accidentally activated – there is no reason for a capable driver to misapply the pedals. With his theory, the engineer may have solved this mystery for good, with evidence that Tesla itself gave him. Even if the theory is not accepted, the measures he asked NHTSA to impose will not hurt anyone. On the contrary: saving lives is their main goal – and that deserves all credit.
"These predictions are already used to prevent a lot of collisions. E.g., Autopilot prevents ~40 crashes/day where human drivers mistakenly press the accelerator at 100% instead of the brakes. In the video, Autopilot automatically brakes, saving this person's legs (7/12)." pic.twitter.com/XtMssPT9cM – Ashok Elluswamy (@aelluswamy), August 21, 2022