AI Burns All the Energy

Will AI's growing power demands drain the grid?


The world's growing AI infrastructure requires huge amounts of electricity, possibly more than power providers can generate responsibly. Could AI models suck energy resources dry?

The fear: Demand for AI is skyrocketing, and with it the demand for energy to fuel training and inference. Power-hungry systems will overwhelm our current power sources. If unchecked, they could lead to energy shortages and runaway carbon emissions.

Horror stories: AI companies don't disclose how much energy their models consume, but top companies, led by OpenAI, have pitched the U.S. government on building out new energy sources and infrastructure. The trend is clear: Escalating demand risks tapping out existing power plants, pushing carbon emissions higher, and delaying moves to more sustainable energy sources.

  • A Goldman Sachs analysis predicts that data centers' electricity needs will increase by 160 percent from 2023 to 2030. AI represents about one-fifth of this growth, or roughly 200 terawatt-hours each year. Wells Fargo forecasts even greater consumption: 300 terawatt-hours in the U.S. alone by 2030. This could boost energy demand in the U.S. by as much as 20 percent, leading electricity providers to increase their reliance on natural gas and other fossil fuels.
  • Demand for AI is reviving coal-fired plants that had been retired and reversing plans to decommission others. In Virginia and elsewhere, utility companies have delayed planned transitions to green energy to keep up with the AI boom.
  • Each Nvidia GPU based on the next-generation Blackwell architecture consumes nearly twice as much power as a current top-of-the-line Nvidia H200. Nvidia is on track to manufacture 1.5 million of these units by 2027. According to one estimate, Nvidia servers alone could consume 85 to 134 terawatt-hours of electricity by 2027.
  • Tech giants that have pledged to reach net-zero carbon emissions are falling behind their goals. Earlier this year, Google reported that its greenhouse gas emissions rose 48 percent in 2023 compared to 2019. Microsoft and Meta face similar challenges. All are using more low-carbon energy, but increases in overall energy consumption are pushing up their use of fossil fuels, too.
  • Amazon, Google, and Microsoft are investing in nuclear energy alongside solar and wind. The new nuclear plants are not expected to begin generating power until the 2030s.
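To see where terawatt-hour estimates like those above can come from, here is a rough back-of-envelope calculation. The 1.5 million server count comes from the figures cited above; the per-server power draw and the assumption that consumption is measured as an annual total are illustrative guesses for this sketch, not numbers from the estimate itself.

```python
# Back-of-envelope estimate of annual electricity use by AI servers.
# The power draw per server is an illustrative assumption, not a measured figure.

servers = 1_500_000      # projected number of servers (from the figures above)
power_kw = 6.5           # assumed average draw per server, in kilowatts
hours_per_year = 8760    # hours in a year (24 * 365)

# kilowatt-hours -> terawatt-hours (1 TWh = 1e9 kWh)
annual_twh = servers * power_kw * hours_per_year / 1e9
print(f"{annual_twh:.1f} TWh per year")  # about 85 TWh, near the estimate's low end
```

Varying the assumed power draw (or utilization) up or down moves the result across a range like the 85-to-134 TWh spread quoted above, which is why such forecasts are typically given as intervals.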

How scared should you be: The rapid growth of AI poses a sharp dilemma: How can we meet demand without releasing greater and greater amounts of heat-trapping greenhouse gasses into the atmosphere? AI companies’ two-pronged strategy of lobbying governments and investing in carbon-free energy resources suggests the problem requires both short- and long-term approaches. 

Facing the fear: While AI poses a difficult problem for the world’s energy consumption, it’s also an important part of the solution. Learning algorithms are reducing energy consumption and managing distribution. They can help capture and store carbon dioxide from energy plants and manufacturers before it reaches the atmosphere. AI is also helping to monitor the atmosphere, oceans, and forests so we can understand the impacts of climate change and make policy accordingly. And processing in centralized data centers — as power-hungry as they are — is far more energy-efficient than using local servers or edge devices. Ongoing AI development will make such efforts more effective and help us build a more sustainable future.
