autoevolution
 

Study: Computers Found in One Billion Self-Driving Cars Will Have a Huge Carbon Footprint

Photo: CarGurus
Carmakers, tech companies, and startups are pouring billions into self-driving technology. It’s very promising, but there are still many obstacles to overcome. One of them is the carbon footprint of autonomous cars’ computers.
Today, information and communications technology accounts for more than 2% of global energy demand. Two percent doesn’t sound like much, but MIT researchers did the math on how that demand has been growing year after year.

As a result, they predicted that, if the trajectory doesn’t change, by 2040 the information and communications sector will consume the world’s entire energy production capacity. And that’s unthinkable.

Moore’s Law is bad news for energy consumption

Computing performance improves at a fast pace, and its hunger for energy grows along with it. Hardware efficiency, however, improves much more slowly. It simply can’t keep up with the energy consumption demanded by ever more powerful algorithms and software.

Think of it this way: that 2% of global energy demand produces emissions on par with those of all the airplanes in the world. Now it really looks like a huge problem, doesn’t it? And it must be addressed accordingly, MIT researchers suggest.

In 1965, Gordon E. Moore, co-founder and ex-CEO of Intel, stated that the number of transistors on a microchip doubles about every two years. In other words, the growth of microprocessor performance is exponential.

He based these statements on his observations at chip maker Fairchild Semiconductor. His insight later hardened into a prediction, and Moore’s Law became a golden rule of the semiconductor industry.

Photo: Togg
But now, almost six decades later, experts agree that chips will hit the physical limits of Moore’s Law at some point in the 2020s. The chip crisis is just a sign of how close that limit is. In the meantime, quantum computer development is still lagging.

We’re now at the point where the demand for computing power exceeds what current computer technology can deliver. So computing will need more computers, which means more energy consumption. How much more? Too much to cover in this article, so we’ll stick to the subject of self-driving cars.

A global autonomous fleet and its dilemmas

All the digital information in the world requires data centers. And today, the global network of data centers has a large carbon footprint. The International Energy Agency estimates it at around 0.3 percent of global GHG emissions – as much as the carbon footprint of Argentina (unrelated to its recent triumph at the FIFA World Cup Qatar 2022).

Why do they use so much energy? Facebook’s data centers worldwide, for instance, perform a few trillion inference operations each day. Just for fun, try counting to one trillion (which is one thousand billion, by the way) at one number per second. You’ll need roughly 31,700 years.

Now that we have put things into perspective – sort of... – let’s look at the problem with autonomous vehicles that MIT researchers are so concerned about.

Their premise was that a self-driving car needs ten deep neural networks processing images from ten cameras. Just think about the upgrade your brain would need if you had ten pairs of eyes. Elon Musk’s idea of implanting chips in human brains makes sense if you look at it that way, doesn’t it?

Back to our self-driving cars. With one hour of driving per day, those ten deep neural networks make about 21.6 million inferences daily. Not exactly easy-peasy for today’s computers, but manageable. Or is it?
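For the curious, the 21.6 million figure is easy to reproduce with back-of-the-envelope math. The camera frame rate and the assumption that every network processes every camera feed are ours, not the study’s, but they are the combination that lands exactly on the article’s number:

```python
# Rough sketch of the per-car inference count (assumptions: 60 fps per
# camera, and each of the 10 networks processes all 10 camera feeds).
CAMERAS = 10
NETWORKS = 10
FPS = 60                 # assumed frame rate per camera
DRIVING_SECONDS = 3600   # one hour of driving per day

inferences_per_day = NETWORKS * CAMERAS * FPS * DRIVING_SECONDS
print(f"{inferences_per_day:,}")  # 21,600,000 -- the 21.6 million above
```

Change any of the assumed inputs (more cameras, higher frame rates) and the figure climbs quickly.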

Photo: MIT
A fleet of one billion autonomous vehicles would make 21.6 quadrillion inferences each day. Again, just for fun, try counting to one quadrillion (which, by the way, is one thousand trillion). You’ll need about 31.7 million years. Dinosaurs went extinct about 65 million years ago – just to put things into perspective.
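Both the fleet-scale figure and the counting-time aside check out, assuming (our assumption) one number counted per second:

```python
# Scale the per-car figure to a one-billion-car fleet, and verify the
# "count to a quadrillion" aside at one number per second.
PER_CAR_PER_DAY = 21_600_000
FLEET_SIZE = 1_000_000_000

fleet_inferences = PER_CAR_PER_DAY * FLEET_SIZE   # 2.16e16 = 21.6 quadrillion
SECONDS_PER_YEAR = 3600 * 24 * 365.25
years_to_count = 1e15 / SECONDS_PER_YEAR          # ~31.7 million years
print(f"{fleet_inferences:.2e} inferences/day, "
      f"{years_to_count / 1e6:.1f} million years to count")
```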

In short, MIT researchers determined that an autonomous vehicle’s onboard computer must use less than 1.2 kW of power. Put differently, by 2050, hardware efficiency would need to double every 1.1 years for the emissions of a one-billion-car self-driving fleet to stay below the current emissions of data centers.

That’s roughly twice the pace Moore’s Law showed is possible. At this point, business-as-usual trends in decarbonization and hardware efficiency improvements sound like “Houston, we’ve got an enormous problem” in 2050.

How to avoid using tons of computing power

The simple answer is to design autonomous vehicles to compute more efficiently from the start. Easier said than done.

One solution is to design more specialized hardware that runs specific driving algorithms for specific tasks. This is economically challenging because, instead of a few multitasking platforms, it requires many different pieces of customized hardware.

There’s also an opportunity for more efficient algorithms that use less computing power. The challenge here is to avoid trading accuracy for efficiency, which could negatively affect vehicle safety – still the hottest topic among self-driving car issues. Unless you’re Tesla and settle it with “you’ve got to crack a few eggs to make an omelet.”

The MIT study is one possible explanation for the recent Argo AI fiasco. After investing $2.7 billion in the robotaxi startup it jointly backed with Volkswagen, Ford’s CEO said in a statement that “profitable, fully autonomous vehicles at scale are a long way off.”

Other self-driving projects are dragging, and even Tesla’s Autopilot improvements and updates still seem far from Level 4 autonomy. Of course, that doesn’t necessarily mean it’s impossible. But for now, it doesn’t look profitable in the long run – as far as Ford is concerned.

Photo: WAE Technologies
MIT researchers don’t stop at computational power issues. They also note that quantifying autonomous vehicle emissions must take into account the sensors’ electricity consumption, which, by initial estimates, is almost on the same level as the computing’s emissions.

Then there is the debate over how self-driving cars will shape the transportation sector. Whether they are used for moving goods or people, there is a growing consensus that autonomous vehicles will travel more than today’s cars.

So it’s possible that the computing power and the related carbon footprint will be greater than what the model simulations in the MIT study showed. All the more reason for the problem to be high on everyone’s radar.
About the author: Oraan Marc

After graduating college with an automotive degree, Oraan went for a journalism career. 15 years went by and another switch turned him from a petrolhead into an electrohead, so watch his profile for insight into green tech, EVs of all kinds and alternative propulsion systems.