The last time Elon Musk tweeted about issues with FSD, it started with Tesla pulling back FSD 10.3 and ended with the first official recall involving the beta software. Now Musk has tweeted that FSD 10.4 presents “late-breaking issues” that will postpone its release. Even so, it may be on the streets as soon as November 7.
This raises once again the question of how extensively Tesla tests its beta software before deploying it to regular customers (apparently very little), and that is only part of the problem. These episodes also make people question whether Tesla’s practice of releasing a new version of FSD every Friday is sound.
The other part has to do with adding new FSD “testers” to a group that already includes over 11,000 cars. Some of these cars have attempted left turns in front of oncoming traffic, which may eventually lead to serious head-on collisions. YouTube videos have revealed more than one situation in which that has happened so far, which prompted AV safety specialist Philip Koopman to ask NHTSA to prevent Tesla from testing on public roads.
It is very likely that FSD 10.4 will add new users to that group. We have lost count of which drivers should be included by now, but it is probably those with a 98 rating on Safety Score, another beta software Tesla uses to grant access to FSD.
The main difference with FSD 10.4 is that Tesla does not seem to have released it to regular customers this time. That may spare the company a new recall should late-breaking issues emerge only after the software is installed in customers’ cars. If that happens, Tesla will have to pull back this new update, as it did with 10.3. In that earlier episode, Tesla turned off FCW (forward collision warning) and AEB (automatic emergency braking) on affected vehicles without warning the owners about it.
If FSD 10.4 already presented “late-breaking issues,” shouldn’t Tesla simply perform more extensive testing to ensure this software poses no safety risk to Tesla customers and those around them? Why rush to release something that presented problems before even hitting the roads? How many more cars will get this “late-breaking issues” beta software? We wish all the people involved luck: they’ll need it.
Some late-breaking issues with 10.4. We’re deploying a patch to internal beta vehicles around 3am tomorrow.
— Elon Musk (@elonmusk) November 6, 2021
If that goes well, we may be able to release 10.4 to external beta vehicles on Sunday. Many good improvements.
"There have been no FSD crashes" will be true ... right up until the fatality when it is no longer true.
— Philip Koopman (@PhilKoopman) October 31, 2021
Does anyone not remember Uber ATG test operations making the same argument right up until they killed a pedestrian? @NHTSAgov
h/t @samabuelsamid https://t.co/BCfwNaV4gE