AI Needs to Be More Energy-Efficient

NEWS | 27 March 2025

Artificial intelligence uses too much energy. Developers need to find better ways to power it or risk adding to the climate crisis.
Artificial intelligence is everywhere: it’s designing new proteins, answering Internet search questions, even running barbecues. Investors are captivated by it—and so is the U.S. president. Just after taking office, President Donald Trump announced his support for Stargate, a company worth up to $500 billion, bankrolled by some of the biggest players in this space, to facilitate AI development in the U.S.
But the data centers and other infrastructure needed to develop and run the technology are incredible electricity hogs. And with Trump’s declaration of a “national energy emergency”—an undisguised ploy to increase fossil-fuel production—AI’s energy needs are poised to make climate change even worse. The technology is already responsible for massive greenhouse gas emissions that cause climate change. If Stargate and the many other companies developing AI platforms do not insist on cleaner and more efficient energy, they will only aid in the destruction of our planet.
This technology’s many flavors include the buzzy generative AI, the basis of ChatGPT and Google’s year-old search-answer system. During its operation, generative AI guzzles electricity in two stages, requiring warehouse-size data centers to house the necessary computing.
Developers must first train the AI model on vast stores of data, which takes countless hours and requires enormous computing capabilities. Training one ChatGPT precursor consumed enough electricity to power 120 average U.S. homes for a year. Every time a model is upgraded, it must be retrained. The sudden release of the DeepSeek chatbot out of China—reportedly trained for a fraction of the price of ChatGPT and similar U.S. systems—may lead to less energy-intensive processes, but it’s too soon to know for sure.
And the demand doesn’t stop once a model is trained. Each query prompts the model to apply everything it learned during training to synthesize an answer, a process called inference that also consumes energy. Compared with search engines, text-generating systems can easily use 10 times as much energy to address a query, and sometimes dozens of times more. Image generation demands even more: as much as half the energy needed to fully charge a smartphone per image, one study found.
Many analyses interpret this energy use for the training and large-scale operation of AI as an increased cost to the system’s owner. For example, one estimate suggests that if Google uses generative AI to produce 50 words of text per answer in response to just half of the queries it receives, it will cost the company some $6 billion.
But the truth is, we all will have to pay when this exorbitant energy use inflates the price of each kilowatt-hour, regardless of how much we personally use the technology. The scale of consumption is simply too large, and as AI sneaks into ever more aspects of daily life, its energy use is projected to skyrocket. At the industry scale, it’s difficult to isolate AI from other computing demands, but data centers serve as a convenient proxy, given that the rise of the technology has led to their boom.
The numbers are staggering: in the mid-2010s U.S. data centers used about 60 terawatt-hours per year. (One terawatt-hour is the equivalent of one billion kilowatt-hours, the unit used to measure electricity consumption in most U.S. homes.) By 2023, a recent report from Lawrence Berkeley National Laboratory found, that figure had nearly tripled to 176 terawatt-hours, and demand is expected to reach between 325 and 580 terawatt-hours by 2028. At that level, data centers would account for roughly 6 to 12 percent of total U.S. electricity consumption, up from about 4 percent in 2023.
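To make that scale concrete, here is a rough back-of-the-envelope conversion using the article's 2023 figure. This is only a sketch: the average-household consumption of about 10,800 kilowatt-hours per year is an outside assumption, not a number from the article.

```python
# Convert the cited 2023 data-center consumption into household equivalents.
KWH_PER_TWH = 1_000_000_000  # 1 terawatt-hour = one billion kilowatt-hours

use_2023_twh = 176                          # data-center use in 2023 (from the article)
use_2023_kwh = use_2023_twh * KWH_PER_TWH   # 176 billion kWh

avg_home_kwh_per_year = 10_800  # ASSUMED typical U.S. household annual usage
homes_equivalent = use_2023_kwh / avg_home_kwh_per_year

print(f"Roughly {homes_equivalent / 1e6:.0f} million homes' worth of electricity")
```

Under that assumed household figure, 176 terawatt-hours works out to the annual electricity use of on the order of 16 million homes.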
Even as commercial energy demand continues to grow, people are already seeing higher residential energy prices in some regions where thirsty technologies such as AI are taxing the grid.
Amid this skyrocketing energy demand, work to decarbonize energy production is progressing too slowly both in the U.S. and globally. Climate change is already unfolding around us, worsening disasters ranging from the Los Angeles fires to Hurricane Helene to extreme heat and causing surprising and long-lasting consequences. Reducing the harm of climate change requires ending fossil-fuel use as quickly as possible. Sudden, huge demand from any industry makes that more difficult.
Sure, large technology companies could offer valuable resources to support the energy transition. The Stargate investment is expected to rely in part on solar power. Before leaving office, President Joe Biden opened public lands to data centers running on clean energy as a way to encourage its use for computing.
But because solar, wind and hydropower output can vary with weather and other factors, nuclear energy is particularly appealing to ever-thirsty AI companies, a prospect that raises fears of nuclear waste contamination. Most notably, Microsoft has a deal to restart the infamous Three Mile Island fission facility, the site of the worst nuclear accident in U.S. history. Meanwhile OpenAI CEO Sam Altman is throwing his support behind, among other things, nuclear fusion, a technology that looks unlikely to provide energy at any significant scale until 2050 at the earliest.
Even if AI companies lean heavily on clean power and don’t worsen the climate crisis, the technology’s seemingly insatiable appetite for energy remains concerning. And efficiency improvements, though vital, may not be enough. The so-called Jevons paradox, which posits that making a resource cheaper or more efficient can increase its total use rather than shrink its footprint, may come into play: wider highways invite more cars, and an ever-faster Internet turned doomscrolling into a time-consuming habit that uses still more energy.
While technology companies push AI, we need to push them for not just small innovations in efficiency but big ones that keep the energy footprint of the U.S. reined in. The alternative may be an AI-enabled barbecue that chars the world.

Author: The Editors