Are forecasts of AI's energy needs all wrong?
Here’s a look at the most interesting AI+Energy content from the past week (or so...)
This week we’ll take a deep dive into a fantastic article on AI’s energy consumption, followed by a couple of buzz-worthy posts that were heavily circulated last week, and a podcast from the electric co-op perspective.
-JM
Rethinking Concerns About AI’s Energy Use
Daniel Castro provides a comprehensive look at the concerns around AI’s energy consumption. He suggests:
“Just as early predictions about the energy footprints of e-commerce and video streaming ultimately proved to be exaggerated, so too will the dire estimates about AI likely be wrong”.
and:
“There are measured steps policymakers can take to ensure AI is part of the solution, not part of the problem, when it comes to energy use and the environment.”
Published: Jan 29, 2024 Source: Center for Data Innovation, PDF here
Jake’s take: This is a thorough and well-researched article targeting a policy-making audience, and should be considered a “must-read” for anyone tracking this space.
Let’s take a deeper look at some of the author’s points, starting with his four key recommendations:
Develop energy transparency standards for AI models.
Seek voluntary commitments on energy transparency for foundation models.
Consider the unintended consequences of AI regulations on energy use.
Use AI to decarbonize government operations.
The first two points are critical, in my view. A full stack of software tools, standardized metrics, and reporting mechanisms is required here. We need to build the systems that will allow AI companies and data centers to evangelize their energy-saving efforts and to compete on carbon-reduction goals. I believe that a large segment of the AI market would choose the “greener” option, given similar cost and performance.
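To make the idea of a standardized reporting mechanism concrete, here is a minimal sketch of what a transparency record might contain. The field names and the `EnergyReport` structure are entirely hypothetical, not an existing standard:

```python
# Hypothetical sketch of a standardized energy-transparency record.
# Field names are illustrative assumptions, not part of any real standard.
from dataclasses import dataclass

@dataclass
class EnergyReport:
    model_name: str
    phase: str                 # "training" or "inference"
    energy_kwh: float          # metered or modeled electricity use
    carbon_kg_co2e: float      # emissions given the actual grid mix
    measurement_method: str    # e.g. "metered" or "modeled-from-FLOPs"

# Example record for a fictional model
report = EnergyReport("example-llm", "training", 1.2e6, 4.8e5, "modeled-from-FLOPs")
```

Something this simple, reported consistently for both training and inference, would already let buyers compare models on carbon as well as cost.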
AI’s energy use is hard to measure
“Creating accurate estimates of the energy use and carbon emissions of AI systems over their lifetimes is challenging because these calculations depend on many complex factors, including details about the chips, cooling systems, data center design, software, workload, and energy sources used for electricity generation”
It is a very difficult challenge, and one that needs much more development. Simply measuring the electricity consumption at the server (or even GPU) level doesn’t tell the whole story. I do believe we can get “close enough” by using models to infer energy consumption based on compute time, total FLOPS, and similar proxies. But even then, we need to understand the dynamic carbon mix of the energy supplied throughout the compute job to get a real understanding of the true carbon costs.
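The kind of inference-based estimate I have in mind can be sketched in a few lines. This is a rough illustration only; the efficiency figure, PUE, FLOP count, and hourly carbon intensities below are all assumed values, not measurements from the article:

```python
# Rough sketch: estimate a training job's energy from total compute,
# then weight it by an hourly grid carbon-intensity series.
# All numeric inputs here are illustrative assumptions.

def training_energy_kwh(total_flops: float,
                        flops_per_watt: float,
                        pue: float = 1.2) -> float:
    """Energy = compute / efficiency, scaled by data-center overhead (PUE)."""
    joules = total_flops / flops_per_watt   # 1 FLOPS/W means 1 FLOP per joule
    return joules / 3.6e6 * pue             # joules -> kWh, plus overhead

def carbon_kg(energy_kwh: float, hourly_intensity_g_per_kwh: list[float]) -> float:
    """Spread the job's energy evenly across hours and apply each hour's mix."""
    per_hour = energy_kwh / len(hourly_intensity_g_per_kwh)
    return sum(per_hour * g for g in hourly_intensity_g_per_kwh) / 1000.0

# Example: 1e23 FLOPs at an assumed 5e10 FLOPS/W effective efficiency
energy = training_energy_kwh(1e23, 5e10)
# Grid mix varies over the job: cleaner overnight, dirtier at peak (g CO2e/kWh)
emissions = carbon_kg(energy, [300, 250, 200, 400])
```

The second function is the point: two identical jobs can have very different carbon footprints depending on when and where they run, which is exactly why the dynamic mix matters.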
I see a push to compare a sort of model-size/energy consumption ratio: “Despite GLaM being nearly 7 times larger than GPT-3 and outperforming the other AI model, GLaM required 2.8 times less energy to train”. I think we should resist this as an oversimplification; not all models are trying to achieve the same outcomes, and there may be a multitude of factors other than the size of the model which determine its ultimate energy consumption.
I found the following table to be very informative, and I have not seen these data assembled so clearly anywhere else:
The major trends that I notice are the increase in parameters (>10x from GPT-3 to 4) and number of chips (2.5x from GPT-3 to 4).
“AI’s Energy Footprint Ignores Substitution Effects”
I love this point! While we’re often focused on the energy consumption of AI, hopefully balanced by the direct energy and carbon savings provided by the innovations it unlocks, we often overlook the additional energy/carbon savings of using AI for non-energy-related everyday tasks. Mr. Castro quotes research that found: “AI writing a page of text emits 130 to 1,500 times less CO2e than a human doing so” and “AI creating an image emits 310 to 2,900 times less.” It’s easy for me to imagine a significant impact here, though quantifying that impact will be a challenge.
“Develop Energy Transparency Standards for AI Models”
Mr. Castro is focused on a policy-making audience here, though I think everyone in the industry would also benefit from his recommendation to “support the development of energy transparency standards for AI models, both for training and inference”. I appreciate that he specifies training AND inference, as I believe we tend to undervalue the workloads associated with inference after the training is complete.
In summary, I agree with Mr. Castro’s major points, and we should be skeptical of the headline-grabbing projections (see below). That said, Mr. Castro’s analysis does not seem to account for the massive expected growth in GPU cluster size; Meta has announced plans to build a cluster of 600k GPUs (up from 16k in 2022), while OpenAI is rumored to be building up to 1M GPUs (up from the 25k it used for GPT-4). The other major AI players will likely need to keep up; thus, while some of the more fantastical estimates on energy consumption we’ve seen are likely overestimating the energy per chip, they’re probably also underestimating the number of chips to be deployed.
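A quick back-of-envelope run shows why the chip-count side can dominate. The chip counts come from the announcements above; the 2x per-chip overestimate is an assumption picked purely for illustration:

```python
# If per-chip energy projections were 2x too high but chip counts grow ~37x
# (Meta: 16k -> 600k GPUs), fleet-level energy still rises sharply.
chips_2022, chips_planned = 16_000, 600_000
assumed_overestimate_per_chip = 2.0   # suppose projections doubled per-chip draw

chip_growth = chips_planned / chips_2022              # 37.5x more chips
net_scaling = chip_growth / assumed_overestimate_per_chip
print(f"Chip count grows {chip_growth:.1f}x; even halving per-chip "
      f"estimates leaves a {net_scaling:.1f}x increase in fleet energy.")
```

In other words, an overestimate in one factor can easily be swamped by an underestimate in the other.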
AI Needs So Much Power That Old Coal Plants Are Sticking Around
Power companies are scrambling to satisfy the needs of data centers and new factories in a country where the grid is already strained.
“To cope with the surge, some power companies are reconsidering plans to mothball plants that burn fossil fuels”
“Soaring electricity demand is slowing the closure of coal plants elsewhere. Almost two dozen facilities from Kentucky to North Dakota that were set to retire between 2022 and 2028 have been delayed, according to America’s Power, a coal-power trade group”
Published: Jan 25, 2024 Source: Bloomberg (Paywall warning!)
Jake’s take: Apologies for the paywalled source here, but this one was all over my feeds last week so I felt it deserved attention.
Though the estimates of AI’s future energy demands may often be inflated, as argued in the previous article, this one suggests that very real energy constraints are affecting the AI industry today. There’s some conflation of issues here (is the problem AI, EVs, or green manufacturing? All of the above?), but I certainly agree that we’ll likely see a future where “we might have to delay the timing at which new large loads are added (to the grid)”.
If you’re ok with Bloomberg’s paywall, also check out:
NextEra Pushes Renewable Plants to Fuel Explosive Demand From AI
Blackstone Is Building a $25 Billion Empire of Power-Hungry Data Centers
What the AI Revolution Means for Electric Co-ops
How are co-ops already using AI, and how will this evolving technology shape and change the way we approach operations, member services and our workplace culture in the years to come?
“there's going to be a kind of a merger between these smart energy networks…because there's no point having a very sophisticated data center running these AI models if your energy networks are not stable and able to support it” (07:47)
“The next set of villains are going to be big tech because people are starting to realize that the energy use and the carbon impact of people being on ChatGPT 24/7 is going to be significant.” (10:16)
Published: Jan 16, 2024 Source: NRECA
Jake’s take: Guest speaker Mike Walsh gives some good background on AI but offers only very generalized suggestions for applications at electric co-ops. I’m not sure there’s anything in here at all that’s specific to co-op utilities. He does pose the question, “what does an AI-powered utility of the 21st century look like if you are planning it from scratch?”, but doesn’t offer any specific answers.
All in all, not a bad generalist view of the transformative nature of AI, but very light on utility-industry specifics.
In case you missed it: here are your favorite links from previous editions of EnergyNews.ai:
Why does AI need so much energy?
Armand Ruiz, Director of AI at IBM, dives into the drivers of AI’s energy consumption.
Source: newsletter.nocode.ai
Why AI and energy are the new power couple
AI is increasingly vital in managing complex, data-rich power systems, especially with the rise of renewable energy sources. This IEA.org article highlights AI's role in improving the predictability and efficiency of power supply and demand, particularly in renewable energy. It also delves into AI's contribution to predictive maintenance, ensuring more robust and reliable energy infrastructures.
Source: iea.org