"Is Elon Musk Killing People?" - You Won't Like the Short Answer

Elon Musk is one of the moment's most controversial public personas, and considering we still have Trump and Kanye West around, that really says something.
Tesla Autopilot FSD trip from San Francisco to Los Angeles and back
It's only natural for the many to look with incredulity at those sitting at the top of the world's richest people list, and even if Elon's position there has been fluctuating lately, he has only really alternated between first and second, with a fortune worth over $150 billion after his lightning-fast ascent last year. No need to start a GoFundMe campaign for him, then.

However, his wealth is not what makes Elon Musk so controversial, and neither is how he got it. What gets people talking is the way he runs his most prominent business - Tesla Inc. A brand-new carmaker set on changing the status quo of a century-old industry was always going to raise discussions, but for the past five years or so, the focus of these debates has shifted from the vehicles' powertrains to their so-called self-driving abilities.

Ever since Autopilot came out (2014, if anyone cares), it's been the primary source of contention, and that's because it could potentially have repercussions that go beyond financial or material loss: people could lose their lives.

It took two short years from the introduction of Autopilot for Elon Musk to come out and claim the self-driving issue wasn't even a puzzle anymore. "I would consider autonomous driving to be basically a solved problem," he said during an interview in 2016, which happens to be the same year Joshua Brown lost his life in a crash while having Autopilot engaged.

What Elon Musk and Tesla were doing - and still are - is now called "Autonowashing." That's a portmanteau coined by Liza Dixon in a paper published under the name "Autonowashing: The Greenwashing of Vehicle Automation." It refers to something everyone with eyes and a functioning brain could have seen for years, which is the way Elon Musk constantly builds up the capabilities of Tesla's Autopilot (and later, the Full Self-Driving suite) by every means available - from his discourse in interviews and on Twitter to the actual naming of the two features.

"What's the problem with Autonowashing?" you might ask, and the best way to answer that can also be found in Liza Dixon's paper. She shows how Autonowashing leads to overtrust from the user, which leads to misuse (or abuse, if you want), which leads to accidents, which leads to negative media coverage, which leads to public distrust, which can ultimately have a negative impact on the whole technology, not just Tesla.

But Tesla doesn't limit itself to the voice of its CEO, Elon Musk. The company is manipulating public opinion through data as well. One very good example is how the safety of Autopilot is presented. Tesla says that, in the US, an accident occurs every 484,000 miles driven, while a Tesla with Autopilot engaged covers 4,190,000 miles between accidents - over eight times better than the national average.
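The "over eight times" claim is simple arithmetic on the two figures quoted above; a minimal sketch, using only the numbers from this article, shows where the ratio comes from (and where the comparison breaks down):

```python
# Figures as quoted in the article (miles driven per accident).
us_miles_per_accident = 484_000           # US average, all roads and drivers
autopilot_miles_per_accident = 4_190_000  # Tesla, with Autopilot engaged

ratio = autopilot_miles_per_accident / us_miles_per_accident
print(f"Autopilot appears {ratio:.1f}x safer")  # ~8.7x

# The catch: Autopilot is mostly used on freeways, which already see far
# fewer accidents per mile than surface streets and parking lots, so the
# two rates are drawn from very different driving conditions and the raw
# ratio overstates Autopilot's contribution.
```

The point is not that the division is wrong, but that the denominator and numerator describe different kinds of driving, which is exactly the objection raised below.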

However, what Tesla fails to state when presenting these numbers is that Autopilot only works on certain roads - mainly freeways - which are statistically the safest roads out there. You can't use Autopilot in a parking lot, a place where bumping into another car is much more likely. You can, however, use the Smart Summon feature there, though you're better off not doing it.

One of the rules of humans interacting with automated systems is that people tend to get more complacent the better the system gets - it even has a name: automation complacency. In other words, if Autopilot is just good enough to give the user the sensation it can be used as a Level 5 system - something also supported by Elon Musk's rhetoric, even if not in a completely open manner - the human brain is wired to treat it as such.

How do you make sure that doesn't happen? Written warnings won't do and, as has been proven time and time again, neither will torque-based sensors in the steering wheel. Whether it was with toilet paper rolls or an orange, it didn't take people long to figure out how to trick those.

The obvious solution is a driver monitoring system (DMS), but adopting one is akin to admitting you only have a Level 2 autonomous system, at best. Elon Musk repeatedly said Level 5 was just around the corner (as early as 2016, as we mentioned earlier), so fitting one to Teslas didn't exactly support his claims.

In 2019, Musk said monitoring the driver was soon going to be a moot point since "having a human intervene will decrease the safety." He saw DMS as equivalent to admitting Tesla was no (or not much) further ahead than everyone else. He saw it as admitting defeat. That's because, by that point, he had dug himself in a deep hole he thought he'd be out of very quickly, but instead, found himself slipping back inside over and over again because, as he recently found out, "generalized self-driving is a hard problem."

Tesla does have a camera-based DMS right now, but it's far from being in line with the hi-tech nature of the vehicle. The camera can easily be tricked into thinking the driver is looking at the road by sticking a photo in front of it, and since it lacks infrared illumination, it's of little use at night. The Plaid does get infrared, but the Plaid is a very expensive vehicle that doesn't make up the bulk of Tesla's sales.

Besides, Tesla only recently started rolling out this DMS, despite the NTSB warning everyone that torque-based sensors in the steering wheel weren't a strict enough measure to ensure the driver was paying attention to the road. The e-mail went to several carmakers, and Tesla was the only one that didn't reply.

The video below is a long but very captivating watch that sets out to answer the question posed in its (and this article's) title: is Elon Musk effectively killing people? Its author, Mahmood Hikmet, isn't offering a yes or no answer, but as long as you can read between the lines, we think you'll figure out which of the two is closer to the moral truth.

"Autopilot deaths [...] didn't need to happen. They are a result of Elon's ego and drive which got in the way" is Mahmood's sad but ultimately true, if you look at the facts, conclusion. Now, another question emerges: was it worth it? Well, assuming for a second there are things more important than human lives, the only way you could build a case for a "yes" answer would be if Tesla had cracked autonomous driving by now. Has it?
