It was a wild first half of August for Tesla, its customers, investors, and pedestrians. The saga has now concluded with the removal of a controversial video purporting to show that the carmaker’s Full Self-Driving (FSD) Beta advanced driver-assistance system (ADAS) works in several scenarios. Here’s everything you need to know.
Almost a fortnight ago, reports spread that Tesla’s FSD doesn’t bring the car to an immediate stop when confronted with a dummy in the shape of a child. That, of course, attracted a lot of attention, and investors, customers, and fans of the American brand were angry and confused. After all, the news came from someone who wants FSD banned and has tried to build a political career out of the cause. Even Elon Musk voiced his dissatisfaction on Twitter.
The internet knows how to bank on such occasions, so, naturally, someone decided to upload different footage of the same thing on Twitter. The only problem is that the recording dated from January 2022. More specifically, it came from a Luminar-organized event at CES. Luminar makes LiDAR-based products and software, and it wanted to prove that Tesla was wrong to give up on radar.
Both videos, however, suggested that FSD Beta relying on Tesla Vision – the radar-less ADAS – doesn’t perform as well as a LiDAR-equipped vehicle. Even though the data was inconclusive and nobody could certify that Tesla Model 3 and Model Y vehicles would fail to stop for children every single time, people on various platforms grew angry at or disappointed with the EV manufacturer.
After many heated replies on Twitter and other forums, a couple of Tesla fans and shareholders decided to run tests of their own. The first tests used dummies shaped like dogs and kids, and the results were hit-and-miss. To put it mildly, none of the footage supported a firm conclusion. But a certain investor known for his fiery approach to Tesla-related issues asked someone to volunteer their kid for a similar test. That’s when the issue started to gain traction and spread like wildfire.
News gets around pretty fast these days
Eventually, someone said their kid would do it. The test happened in a random U.S. neighborhood, and it was a success – FSD does see small children crossing the road when traveling speeds are low.

However, what it did not prove is whether a Tesla Vision-equipped Model 3 or Model Y can stop immediately when an unexpected obstacle (be it a child or an animal) appears out of nowhere. In the Luminar-sponsored test, the LiDAR-equipped Lexus braked hard every time. This tells us one important thing – the issue at hand is not only the preparedness of the ADAS but also the instantaneous activation of the emergency braking system. That problem was almost universally ignored, because people were focused on FSD Beta, which relies on Tesla Vision.
This whole thing even reached the NHTSA, which had to remind people – admittedly too late – that they shouldn’t put anyone’s life in danger to verify the performance of vehicle technology.
Now, after public backlash, confusion, and heated debates, the matter has also been judged by YouTube. The trial just ended, and the sentence was passed – the Tesla investor known as Omar Qazi is in the wrong, and his video was deleted.
The blogger appealed and is awaiting a final response from the video-sharing platform. He also uploaded the edited recording on his website and said, “here’s the video they don’t want you to see.”
Apparently, YouTube didn’t randomly decide to delete his video. It was reportedly notified by Lora Kolodny – a CNBC reporter who covers Tesla and climate tech. She confirmed on Twitter that YouTube responded to an inquiry related to the FSD testing done by the blogger and the volunteer parent. The company told her it “doesn’t allow content showing a minor participating in dangerous activities or encouraging minors to do dangerous activities.”
People lost sight of what really matters
The bottom line is that Tesla’s Enhanced Autopilot and Full Self-Driving Beta remain unfinished technologies that are constantly being updated. Nobody has promised that, once the $12,000 FSD is finished, it will let vehicles navigate autonomously everywhere or work on any car. There’s still a long way to go.
Elon Musk’s company has the upper hand thanks to all the data it collects from users, but the manufacturer is years away from releasing something that even resembles the Drive Pilot announced by Mercedes-Benz. Even reaching Waymo’s current Level 4 autonomous taxi operations looks unlikely in the medium term.
Level 5 – where the car has full control and no human intervention is ever required – might be decades away from materializing. Manufacturers will have to convince insurers to provide coverage, since the carmaker could be liable when its vehicle is at fault in a crash, and they must also lobby politicians to change laws and update a raft of regulations.
The real question fans, customers, investors, and employees should ask is this: can a radar-less Tesla Model 3 or Model Y make proper use of the emergency braking system? That's what needs a clear and honest answer now.
Here’s the video they don’t want you to see https://t.co/KdzYlsKPGh
— Whole Mars Catalog (@WholeMarsBlog) August 20, 2022
“YouTube doesn’t allow content showing a minor participating in dangerous activities or encouraging minors to do dangerous activities. Upon review, we determined that the videos...violate our harmful and dangerous policies, and as a result we removed the content," the co. said.
— Lora Kolodny (@lorakolodny) August 20, 2022