AI already uses as much energy as a small country. It’s only the beginning.

This article was written by Brian Calvert for Vox.

In January, the International Energy Agency (IEA) issued its forecast for global energy use over the next two years. Included for the first time were projections for electricity consumption associated with data centers, cryptocurrency, and artificial intelligence.

The IEA estimates that, added together, this usage represented almost 2 percent of global energy demand in 2022 — and that demand for these uses could double by 2026, which would make it roughly equal to the amount of electricity used by the entire country of Japan.

We live in the digital age, where many of the processes that guide our lives are hidden from us inside computer code. We are watched by machines behind the scenes that bill us when we cross toll bridges, guide us across the internet, and deliver us music we didn’t even know we wanted. All of this takes material to build and run — plastics, metals, wiring, water — and all of that comes with costs. Those costs require trade-offs. 

None of these trade-offs matters more than energy. As the world heats up toward increasingly dangerous temperatures, we need to conserve as much energy as we can to lower the amount of climate-heating gases we put into the air. 

That’s why the IEA’s numbers are so important, and why we need to demand more transparency and greener AI going forward. And it’s why right now we need to be conscientious consumers of new technologies, understanding that every bit of data we use, save, or generate has a real-world cost. 

One of the areas with the fastest-growing demand for energy is the form of machine learning called generative AI, which requires a lot of energy for training and a lot of energy for producing answers to queries. Training a large language model like OpenAI’s GPT-3, for example, uses nearly 1,300 megawatt-hours (MWh) of electricity, the annual consumption of about 130 US homes. According to the IEA, a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request takes 2.9 watt-hours. (For comparison, a 60-watt incandescent light bulb uses 60 watt-hours in an hour.) If ChatGPT were integrated into the 9 billion searches done each day, the IEA says, the electricity demand would increase by 10 terawatt-hours a year — the amount consumed by about 1.5 million European Union residents. 
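The IEA's headline number can be reproduced with back-of-the-envelope arithmetic from the per-query figures above. This short Python sketch (the constants come straight from the article; the 365-day scaling is an assumption about how the IEA annualized the figure) shows how 2.9 watt-hours per request, times 9 billion searches a day, lands near 10 terawatt-hours a year:

```python
# Back-of-the-envelope check of the IEA's search-energy estimate.
# Figures from the article: 0.3 Wh per Google search, 2.9 Wh per
# ChatGPT request, about 9 billion searches per day.
WH_PER_SEARCH = 0.3
WH_PER_CHATGPT = 2.9
SEARCHES_PER_DAY = 9e9
DAYS_PER_YEAR = 365

def annual_twh(wh_per_query: float) -> float:
    """Annual electricity use, in terawatt-hours, for one query type."""
    wh_per_year = wh_per_query * SEARCHES_PER_DAY * DAYS_PER_YEAR
    return wh_per_year / 1e12  # 1 TWh = 1e12 Wh

print(f"Conventional search: {annual_twh(WH_PER_SEARCH):.1f} TWh/yr")
print(f"ChatGPT-style search: {annual_twh(WH_PER_CHATGPT):.1f} TWh/yr")
```

Running it gives roughly 1.0 TWh a year for conventional search and 9.5 TWh for ChatGPT-style queries, in line with the IEA's "about 10 terawatt-hours" figure.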

I recently spoke with Sasha Luccioni, lead climate researcher at an AI company called Hugging Face, which provides an open-source online platform for the machine learning community that supports the collaborative, ethical use of artificial intelligence. Luccioni has researched AI for more than a decade, and she understands how data storage and machine learning contribute to climate change and energy consumption — and are set to contribute even more in the future. 

I asked her what any of us can do to be better consumers of this ravenous technology. This conversation has been edited for length and clarity.


