We all understand that FSD (Full Self-Driving) is only available in beta, so the driver must remain in control of the vehicle at all times. However, a recent video uploaded to Reddit raised serious concerns after a Tesla owner tested his car with a child dummy while FSD Beta was engaged.
"So, I am here with my daughter, and we are going to put it to the test to see if the car really can avoid a collision. And as you can see here, we've got a dummy child," the owner said.
Unlike most tests involving children (which typically place the dummy at the side of the road), this test had a real-life aspect: the dummy was suspended on a string and slowly pulled across the street from one side to the other to imitate a crossing child.
The test had shocking results. The car failed to spot the dummy: Tesla's visualization never registered a human crossing the road, and the vehicle drove right through it.
In a separate video posted on Twitter, this time using a cardboard dummy, the Tesla detected the fake child and slowed down, turning slightly to avoid a collision.
The sheer number of complaints cropping up isn't making things any easier for the EV pioneer. Two complaints filed by the California DMV accuse the automaker of misleading consumers about its autonomous features.
Redditors were equally shocked by the footage, with most of them turning the grim discovery into a light moment.
"I mean, Tesla is all about saving the environment, and humans are the biggest threat to the environment. Hmmm," one commenter said. "To be fair, if I see a ghost child floating through the air, I ain't stopping either," another commenter joked.
"Destroyed @RealDanODowd PART 2: 1) Many asked the 'child' be to the right more 2) In motion 3) Running into the street. I can't get my @tesla to kill kids, that's a good thing, but many seem disappointed?????????? @WholeMarsBlog @elonmusk @JohnnaCrider1 @28delayslater @chazman #FSDBeta https://t.co/5ufiPqcrgE pic.twitter.com/E2vmejiRe7"

— tesladriverperson (@tesladriver2022) August 10, 2022