autoevolution
 

Engineer Who Asked NHTSA to Recall All Teslas Explains the Issue Everybody Missed

A Model S crashed against a house in Memphis, Tennessee, in October 2019
Photo: Visanji T. Gala
Costas Lakafossis is a respected accident investigator in his home country, Greece. He has appeared on Greek TV shows to discuss the Tempi train crash that killed 57 people on February 28, 2023. When he petitioned the National Highway Traffic Safety Administration (NHTSA) to recall all Tesla cars ever made, some tried to discredit him, but the safety regulator took him seriously. He may have explained why Teslas have so many Sudden Unintended Acceleration (SUA) cases.
Ironically, most headlines missed Lakafossis' true discovery. They focused instead on the Brake Transmission Shift Interlocks (BTSIs) the engineer suggested to fix the problem. In an exclusive interview with autoevolution, Lakafossis compared the suggestion to "the proverbial egg of Columbus." According to the engineer, BTSI is "another way to solve the problem without forcing people to acknowledge it."

Lakafossis explained that the SUA events in Tesla vehicles do not happen due to hardware issues. The safeguards Tesla adopted – which made Jason Hughes state it was impossible for a Tesla to present such incidents – are indeed very effective. While investigating a Tesla Model Y crash in Athens, the engineer received an amazing insight from the person involved.

Tesla Model 3 smashes through the doors of the Columbus Convention Center
Photo: Columbus Dispatch via Youtube
This Tesla customer had owned his Model Y for around six months. Thanks to surveillance cameras, Lakafossis was able to review footage of what happened. The video showed the driver preparing to park the electric SUV. It moved slowly until it got close to its parking spot, and you could see the brake lights flash when it stopped. All of a sudden, the car just accelerated.

The engineer interviewed the Model Y owner with the video in hand, trying to understand everything he did at each moment the images depicted. When Lakafossis reached the part where the brake lights turned on and said the driver had braked, the Tesla owner immediately corrected him: there was "no need" for him to brake because the car always did that on its own. This driver loved that his Model Y took care of braking in his place.

Tesla Model Y crashed in Chaozhou and killed two people: what caused this?
Photo: Janchubi/Weibo
After looking into similar cases, Lakafossis realized they all presented the same pattern: the Tesla BEVs were all about to park when they suddenly accelerated. That pattern is evident in the videos related to such crashes – both in the cases I covered and in the embedded footage you can watch at the end of this story. These were crucial clues for the engineer to understand what was going on with SUA incidents involving Tesla.

Let's start with the obvious: you cannot see the pedals you press while driving. Some may think it is enough to know the accelerator pedal is on the right and the brake pedal is on the left. That makes proprioception (or kinesthesia) a central component of automatic pedal operation. Tactile feedback would also play a role, but most shoes do not let you feel the pedals. The problem is that there are cases where drivers misapplied the pedals and accelerated their vehicles while they were sure they had stepped on the brakes.

Tesla Model 3 Crashes in Paris Allegedly Due to Sudden Unintended Acceleration
Photo: Paris Police
On December 11, 2021, a taxi driver returning from a restaurant in Paris with his family hit 21 people when his Model 3 suddenly accelerated. One person died. His lawyer, Sarah Saldmann, included other people in the lawsuit against the carmaker, claiming that the problem is more frequent than it seems. The 59-year-old cab driver had more than 30 years of experience. Tesla said the car had no technical issues, implying the Model 3 driver was to blame.

On November 5, 2022, a former truck driver tried to control his Model Y for 2.6 kilometers (1.6 miles) until a crash stopped it. Two people died on the way. Identified solely as Zhan, the then 55-year-old driver said he pressed the brake pedal, and it did not work. Tesla said he was pushing the accelerator pedal instead.

Tesla owner films frozen Model 3 ICE screen and jammed TACC he could not disengage
Photo: Ton Aarts on Twitter
Either Tesla concealed that these experienced drivers stepped on the brakes, or they thought they did but didn't. Several drivers in China said they would install cameras to film the pedals because they did not trust Tesla to tell the truth. Thankfully, no SUA event has happened so far with anyone who took such precautions – we would have heard about it if it had. What if the drivers really did step on the accelerator pedal instead of the brake, as Tesla claims?

To explain that, Lakafossis used concepts from neuroscience and Control Theory, such as closed-loop and open-loop controls. The latter does not depend on feedback. When it describes human activities, it relates to actions you cannot adjust, such as throwing a ball into a basket or a dart at a board: once you execute them, there is no way to change them. The former is the opposite: every action depends on feedback, and correction is constant.
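The two control concepts can be sketched in a few lines of Python. This is an illustrative toy model only – not anyone's vehicle code – contrasting a pre-planned braking input with one that corrects itself from feedback:

```python
# Toy illustration of open-loop vs. closed-loop control, using the speed
# of a car approaching a parking spot. Hypothetical numbers throughout.

def open_loop_stop(speed, brake_force, steps):
    """Apply a pre-planned braking input with no feedback.
    Like a thrown dart: once started, the action cannot be corrected."""
    for _ in range(steps):
        speed = max(0.0, speed - brake_force)
    return speed

def closed_loop_stop(speed, target=0.0, gain=0.5, tolerance=0.01):
    """Continuously measure the error and correct it.
    Every step depends on feedback from the previous one."""
    while abs(speed - target) > tolerance:
        error = speed - target   # feedback: how far from the target?
        speed -= gain * error    # correction proportional to the error
    return speed

print(open_loop_stop(10.0, 2.0, 3))  # the fixed plan undershoots: 4.0
print(closed_loop_stop(10.0))        # feedback converges near the target
```

The open-loop version finishes its plan whether or not the car has actually stopped; the closed-loop version keeps correcting until the goal is met – the same distinction Lakafossis applies to a driver's foot on the pedals.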

A closed-loop system depends on the feedback loop sequence to keep going. In terms of pedal application, that means you need to keep moving your foot between the accelerator and the brake pedal, registering their respective responses of speeding up and slowing the car. If that loop is interrupted for any reason, there is an increased chance not only that you will misapply the pedals but also that you will swear you stepped on the right one.

In other words, if you are accelerating and have to park, you will naturally move your foot from the accelerator pedal to the brake. If you expect your car to do that for you, you just lift your foot off the accelerator pedal and keep it over the floorpan. Lakafossis believes this is what Tesla vehicles induce their drivers to do. When the car fails to brake on its own, that may generate a pedal misapplication the driver will swear was correct. Worse: they will keep pushing the accelerator pedal, willing the car to stop.

Tesla Model 3 crashes against a wall in a parking garage due to sudden unintended acceleration
Photo: Ton Aarts on Twitter
The engineer stressed that this automatic braking process in Tesla vehicles has nothing to do with Automatic Emergency Braking (AEB). One-pedal driving also has no role in this. It is something related to Tesla programming that makes its BEVs behave like allegedly intelligent cars, anticipating what the driver would normally do. The engineer said Autopilot was responsible for that behavior, but he recognized it does not have to be activated for the vehicle to act like that. Tesla advocates will certainly use that to disqualify the engineer.

However, Ashok Elluswamy once tweeted that "Autopilot prevents 40 crashes/day where human drivers mistakenly press the accelerator at 100% instead of the brakes." He even presented a video of Autopilot allegedly saving someone's legs from being crushed. Hopefully, Tesla's Autopilot software director asked the people involved for permission to share the images publicly. After all, the BEV maker stands accused of sharing private videos from its customers without their knowledge. I'd ask Tesla if that was the case with Elluswamy's tweet, but the BEV maker does not talk to the press. This is not related to the main topic of this story – just an interesting side note.

Ashok Elluswamy said Autopilot avoids around 40 SUA events every day, but does it?
Photo: Tesla
Back to what matters here: the Autopilot director's tweet raised a warning sign for Lakafossis. What could sound like very positive news for Autopilot may actually be pretty detrimental. Just think about it: if Autopilot "prevents 40 crashes/day where human drivers mistakenly press the accelerator at 100% instead of the brakes," only two hypotheses fit.

The first is that the SUA incidents Elluswamy said Autopilot prevents every day could happen with any car from any brand. If that is the case, the engineer argues we should hear about SUA crashes daily – at least 40 from each brand that does not have Autopilot to avoid them. We don't.

The second is that these pedal misapplication cases involving Tesla vehicles are specific to the brand. The crashes we hear about are the fraction that Autopilot's behavior may have provoked but was unable to avoid. If only one among the 40 daily saves failed, we would have a Tesla SUA crash every single day. Lakafossis classifies that as an "obvious statistical anomaly." Tesla should explain it.
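The arithmetic behind the second hypothesis is easy to sketch. The 40 saves/day figure is Elluswamy's; the miss rate below is a hypothetical assumption chosen purely for illustration, not a number from Lakafossis, Tesla, or NHTSA:

```python
# Back-of-the-envelope illustration of the "statistical anomaly" argument.
# Only the 40 saves/day figure comes from Elluswamy's tweet; the miss
# rate is a hypothetical assumption for the sake of the arithmetic.

daily_saves = 40           # misapplication events Autopilot reportedly catches
assumed_miss_rate = 1 / 41  # hypothetical: one event in 41 slips through

# If 40/41 of all events are caught, the implied total per day is:
daily_attempts = daily_saves / (1 - assumed_miss_rate)
daily_crashes = daily_attempts * assumed_miss_rate

print(f"{daily_attempts:.0f} events/day implied, "
      f"{daily_crashes:.1f} crash/day slipping through")
```

Even a failure rate of one in 41 would imply a Tesla SUA crash every single day – which is the engineer's point: either the underlying events are far rarer than the tweet suggests, or something brand-specific is producing them.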

This is not the first time Elluswamy has been candid about issues involving the BEV maker. I've already written about how he said, under oath, that he was not aware of what defines an Operational Design Domain (ODD) and was also not familiar with perception-reaction time, two crucial concepts of autonomous driving research. It was in the same deposition that the Autopilot director confirmed that a 2016 video claiming Autopilot drove a car with no disengagements was staged, but that's also another story.

Mahmood Hikmet Explains Autonomous Driving Basic Concepts in YouTube Series
Photo: Mahmood Hikmet
Regardless of what Tesla calls the braking programming that makes its cars sometimes automatically stop when parking, Lakafossis thinks it is the root cause of the Tesla SUA incidents. The engineer rented a Model Y in Athens to perform tests in December 2022. He confirmed the car did not behave in a linear way, with predictable responses in all situations. It would coast, slow down, or brake randomly. In February 2023, he installed a VBox in another Tesla and again saw the vehicle respond differently even when he drove it around the same block multiple times.

"The actual cause of the problem is the stochastic (nonlinear) nature of automatic braking, or else, the way that a Tesla car decides to brake by itself in response to the road conditions in a non-standard timing and manner. This is what causes confusion, and this is what is very different in Tesla cars in comparison to standard ICE cars or one-pedal regenerative braking in a BEV."

A Model S crashed against a house in Memphis, Tennessee, in October 2019
Photo: Visanji T. Gala
After understanding the problem, Lakafossis had to solve two pressing questions: how to help fix that and who to ask for measures. BTSI was the best answer for the first one.

"Allowing reverse to be selected while moving forward is not the actual cause of the problem, but it is a quick and easy way to fix it – by forcing the driver to move their foot toward the brake pedal."

If Tesla adopts BTSI – voluntarily or with an NHTSA push – it creates a step in which putting your foot on the brakes is mandatory. That keeps the feedback loop sequence active and reduces the risk of pedal misapplication without forcing Tesla to review the entire Autopilot software to prevent its erratic behavior. Lakafossis thinks a creep mode would also help a lot but said that "this option is grey on the menu of recent cars, which means you cannot select it anymore."
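The gate that BTSI enforces can be sketched as a simple check. The code below is a hypothetical illustration of the interlock concept – not Tesla's implementation, and the set of gears requiring the brake is an assumption for the example:

```python
# Minimal sketch of a brake-transmission shift interlock (BTSI).
# Hypothetical illustrative code; the set of interlocked gears is an
# assumption. The point: selecting a drive gear forces the driver's
# foot onto the brake, keeping the pedal feedback loop alive.

def request_gear_change(current_gear, requested_gear, brake_pressed):
    """Return the gear actually engaged after the interlock check."""
    interlocked = {"R", "D"}  # assumed: gears that demand a pressed brake
    if requested_gear in interlocked and not brake_pressed:
        return current_gear   # interlock blocks the shift
    return requested_gear

# Shifting into Reverse without the brake is refused...
assert request_gear_change("D", "R", brake_pressed=False) == "D"
# ...and allowed once the driver's foot is on the brake pedal.
assert request_gear_change("D", "R", brake_pressed=True) == "R"
```

A gate like this is exactly the "step in which putting your foot on the brakes is mandatory" described above: the shift itself re-establishes the broken feedback sequence.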

Tesla Model Y clearly shows brake lights turned on without any obstacles ahead: the driver stepped on the brakes
Photo: Janchubi/Weibo
Suppose Tesla vehicles moved just like an automatic ICE vehicle. You'd have to step on the brakes to start the car. Releasing the pedal would make the BEV start moving slowly, so drivers would always keep their feet there, controlling the speed. BTSI and creep mode as standard would probably fix that for good.

"The problem with unexplained SUA accidents is the absolute belief of a young, alert, and competent driver that he was pressing the brake pedal all along. That's what I realized in the minor accident I was asked to investigate, and this is what led me to seek answers in neuroscience and not in engineering."

Lakafossis recognizes that "this is a very difficult concept to explain, research and prove and also it is very difficult to quantify when asking for improvements." Hence his request to adopt BTSI, which is a standard feature in automatic vehicles. The engineer told me that not having it is not illegal, so Tesla's products can legally be the way they currently are. The point is that they did not need to be.

"What I don't understand is why Tesla made its cars this way. There is no gain in avoiding BTSI or adopting this random braking feature. It is a decision that puts marketing before safety and which has already cost four lives – at least that I am aware of. One article said I was a Greek engineer trying to make Tesla go bankrupt, but I have no interest in any of that or in the stock market. A simple over-the-air update and two lines of code can solve this once and for all! Would that bankrupt Tesla?"

To solve the second question his discovery brought up, Lakafossis said talking to the NHTSA seemed to be the most effective option he had.

"If I talked to the Greek transportation department, they would probably just check if everything with homologation was correct and deny a correction request. Discussing this with the European Union would probably lead to similar results. I tried to talk to the lawyer who is dealing with the Paris crash, but she said she had her own investigators and did not want to hear from me. It was then that I learned that anyone could write a petition to the NHTSA to ask for safety corrections in cars, so that's what I did."

When I asked NHTSA about Lakafossis' petition, it replied that it "will carefully review the petition and relevant data. The agency's final decision will be posted in a closing resume, which will state if the agency is either opening an investigation or denying the petition. The final decision will be available on NHTSA's website and, if denied, also published in the Federal Register."

The engineer said he has also submitted his work to the Society of Automotive Engineers (SAE) for publication. He believes that it can help develop safer cars in the future by putting the behavioral matters involved with SUA under the spotlight.

This is how the Tesla gear selection lever is operated
Photo: Tesla/edited by autoevolution
I also seized the opportunity to ask Lakafossis about the most accepted hypothesis so far for these unintended acceleration cases: accidentally activating Traffic-Aware Cruise Control (TACC).

"The active cruise control recall does not explain the SUA accidents in question. It may prevent a totally different type of accident, but not these particular crashes. If you engage the cruise control by mistake, you will realize that something is not normal, and you will step on the brake pedal. If you have more than two seconds before you hit something, you will probably manage to brake in time."

According to Lakafossis, unless accidentally activating TACC breaks a feedback loop sequence, there is no reason for a capable driver to misapply the pedals. With his theory, the engineer may have solved this mystery for good, using evidence that Tesla itself gave him. Even if the theory is not accepted, the measures he asked NHTSA to impose will not hurt anyone. On the contrary: saving lives is their main goal – and that deserves all credit.


About the author: Gustavo Henrique Ruffo
Motoring writer since 1998, Gustavo wants to write relevant stories about cars and their shift to a sustainable future.