
Tesla's Full Self-Driving v10.69 Update Explained

Hot off the press via Twitter user @ACPixel, we finally get to see the early-access release notes for the Tesla FSD (Full Self-Driving) v10.69 update. Or, as the tablet's "Software update" screen shows, version 2022.16.3.10. Let's see what they actually mean.
Photo: Screenshot from YouTube Channel Whole Mars Catalog
Keep in mind that it's a closed beta, meaning it's off-limits to the general public. It's out only for a select few "guinea pigs" who get to see what's what and iron out the kinks before it goes fully live.

Among the features, we find that Tesla has added a "deep lane guidance" system that merges what the car "sees" through its cameras and sensors with coarse map data. They claim this will lower the lane topology error rate by 44%. The car will allegedly perceive the geometry of the lanes better, improving the self-driving experience.
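To get a feel for what blending a camera's lane estimate with a coarse map prior might involve, here is a deliberately simplified sketch. The confidence-weighted blend and every name in it are my own illustration, not Tesla's actual "deep lane guidance" module:

```python
def fuse_lane_heading(cam_heading, cam_conf, map_heading, map_conf):
    # Confidence-weighted blend of two lane-direction estimates (radians).
    # When the camera is unsure (faded paint, glare, snow), the map prior
    # dominates, and vice versa.
    total = cam_conf + map_conf
    return (cam_heading * cam_conf + map_heading * map_conf) / total

# Faded markings: camera only 20% confident, map prior 80% confident.
print(fuse_lane_heading(0.10, 0.2, 0.02, 0.8))  # 0.036 rad, close to the map's value
```

The real system fuses far richer signals than a single heading angle, but the principle of letting the more confident source dominate is the same.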

Of course, among the many notes on this papyrus-length scroll, there are some vague ones, like "improved overall driving smoothness, without sacrificing latency, through better modeling of system and actuation latency in trajectory planning." Usually, for marketing purposes, such non-descriptive wording makes a feature sound more impactful than the end result probably is.

Making unprotected left turns across high-speed cross traffic has been improved by optimizing the car's initial jerk to mimic the sudden pedal press of a human driver approaching or exiting a median crossover region. All the while, it better detects and predicts other high-speed vehicles.
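To illustrate what "optimizing the initial jerk" means in practice, here is a toy jerk-limited launch profile. The function and its numbers are hypothetical, my own sketch rather than anything out of Tesla's planner:

```python
def launch_profile(a_max=3.0, jerk=6.0, dt=0.05, t_end=2.0):
    # Acceleration ramps up at a fixed jerk (m/s^3) until it reaches
    # a_max (m/s^2), then holds. A higher initial jerk mimics a human
    # stabbing the accelerator to dart across a gap in traffic.
    t, v, a, profile = 0.0, 0.0, 0.0, []
    while t < t_end:
        a = min(a + jerk * dt, a_max)
        v += a * dt
        profile.append((round(t, 2), round(v, 2), round(a, 2)))
        t += dt
    return profile

# A gentle launch (jerk=2) versus an assertive one (jerk=8): the
# assertive profile reaches full acceleration four times sooner.
print(launch_profile(jerk=2.0)[:5])
print(launch_profile(jerk=8.0)[:5])
```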

Now, whether or not you should let the car decide to do so in an intersection with other vehicles going 60 mph (97 kph) is a whole other topic.

Also updated is the recognition of non-specific three-dimensional (real-life) objects that don't represent humans, cars, or other recognizable things the software can detect. "UFOs," they call 'em.

Basically, it better picks up things like mailboxes, floating balloons, and other objects traveling at low speeds. And it sees objects in the distance, like crossing vehicles, with 26% fewer missed detections.

It can better tell the difference between objects on the ground and the ground itself by using video-based scanning instead of regular image-based scanning. This way it lowers the rate at which it confuses regular stuff on the road with the road itself.

Next, it can predict the trajectory of moving objects more precisely by calculating their yaw rate together with their lateral movement. Simply put, this covers cases where a car or a person turns left or right in front of the Tesla at an intersection, or when another vehicle cuts into your lane. Or rather, your Tesla's lane; maybe the Tesla will now get 14% more upset if you take credit for its driving.
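For a rough sense of what folding the yaw rate into the prediction buys, here is a minimal constant-turn-rate extrapolation. It's a generic textbook motion model and my own illustration, not Tesla's code:

```python
import math

def predict_path(x, y, heading, speed, yaw_rate, dt=0.1, steps=30):
    # Extrapolate an object's future positions assuming it keeps its
    # current speed and yaw (turn) rate. With yaw_rate = 0 this
    # degenerates to a straight-line prediction.
    path = []
    for _ in range(steps):
        heading += yaw_rate * dt  # rotate by the turn rate each step
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        path.append((round(x, 1), round(y, 1)))
    return path

# A car ahead doing 10 m/s while turning at 0.3 rad/s: the curved
# prediction drifts well away from the straight-line one after 3 s,
# which is exactly the difference that flags a cut-in earlier.
print(predict_path(0.0, 3.5, 0.0, 10.0, 0.3)[-1])
print(predict_path(0.0, 3.5, 0.0, 10.0, 0.0)[-1])
```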

Distant crossing-vehicle speed estimates have been improved by 20% while using just one-tenth of the computing power previously required. This drop in power consumption for something that happens almost constantly (scanning for faraway crossing vehicles) is genuinely praiseworthy. It saves battery power and generates less heat, so less cooling is required.

This cutback in computing power might not sound like a big deal, but when you think of every little thing that drains the battery on a long road trip, like multiple phones charging at the same time, the air conditioning turned up to 11, and the old "stereo" blasting Journey's greatest hits, it all adds up.

Moving on, protected right turn smoothness has increased thanks to better association of traffic lights and yield signs with slip lanes. This should reduce needless slowdowns when no yield sign or traffic light is present; conversely, the car will slow down when it does detect these traffic elements.

Somewhat related to that is the improved creeping visibility in the presence or absence of traffic control elements at any intersection where objects might cross the car's current lane path. It will also have more "guts" and merge onto highways faster than before by better reading the speed changes on the map.

This next one is a bit of a doozy. The update reduced false slowdowns near crosswalks by better “understanding” pedestrian and bicyclist intent based on their motion.

In my honest opinion, this might work if road traffic were completely autonomous and we lived in a Wall-E type of future. But human psychology is unpredictable, and not even humans can predict pedestrian intent correctly every single time.

Especially since a pedestrian's neurons could fire the signal to cross the street at the very last second, in a very dangerous manner. Humans can't fully predict other humans, let alone an extremely young technology that hasn't even become a globally accepted standard yet.

I'm not trying to knock the feature or anything; I'm sure it can account for more parameters than I can count. I'm just offering my two cents on "predicting people's crossing intent" with math equations, even if it can now read the speed of pedestrians and bicyclists with 17% more accuracy.

They also reduced the geometry error of the "ego-relevant lanes," or simply put, the lane you're driving in, by 34%. Basically, it accounts for poorly visible lane markings and keeps the course smooth and steady. Crossing lanes are also read 21% better.

In well-developed cities like L.A., Las Vegas, and Austin, where autonomous vehicles are tested, the roads should theoretically be well marked, and the software shouldn't have such a hard time telling apart what's what. The improvements can truly be observed only where the infrastructure isn't as sound.

The creep feature hits the brakes much more smoothly now when it senses objects in its path. Animals are recognized 34% better, stopping position accuracy is higher in dangerous scenarios with crossing objects, and forking lanes are recognized 36% better.

When starting from a full stop, it will also take into account the jerk of the car in front of you; that's jerk in the physics sense, the rate at which the lead car's acceleration changes. The phrasing doesn't mean to assume the guy in front is automatically a jerk. Innocent until proven guilty, I say.

The last item on the list concerns other drivers' behavior at traffic lights. It can now tell sooner whether the car in front of you can brake in time before the light turns red, or whether it will floor it and run the light. This part could also be related to my earlier joke.
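The underlying physics check is simple enough to sketch: a car can stop before the line only if its stopping distance at maximum braking, v²/(2a), fits within the distance remaining. A minimal illustration, with a braking limit I've assumed rather than taken from Tesla:

```python
def lead_car_can_stop(speed_mps, dist_to_line_m, max_decel=4.0):
    # Stopping distance under constant deceleration: v^2 / (2 * a).
    # If that exceeds the distance to the stop line, expect the lead
    # car to run the light instead of braking.
    stopping_dist = speed_mps ** 2 / (2 * max_decel)
    return stopping_dist <= dist_to_line_m

# 20 m/s (~45 mph) with 40 m to the line needs 50 m to stop at 4 m/s^2,
# so the system should predict a runner, not a braker.
print(lead_car_can_stop(20.0, 40.0))  # False
```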

With the beta update barely out, Elon Musk has already tweeted that version 10.69.1 could arrive by the end of next week, followed further down the line by the 10.69.2 update for all current beta participants.

To the best of my knowledge, I covered everything. I won't presume to speak for anyone else, but I know I wouldn't trust infancy-stage software over my own eyes and ears when it boils down to my safety and other people's. Drive safe, y'all!

About the author: Codrin Spiridon

Codrin just loves American classics, from the 1940s and ‘50s, all the way to the muscle cars of the '60s and '70s. In his perfect world, we'll still see Hudsons and Road Runners roaming the streets for years to come (even in EV form, if that's what it takes to keep the aesthetic alive).