Artificial intelligence is being used to safeguard utility infrastructure, advance the cutting edge of renewable energy research and help permit clean energy projects — but simultaneously, the technology is threatening to strain the grid with massive load growth.
The AI-driven data center boom could account for 44% of U.S. electricity load growth from 2023 to 2028, Bain & Company said in an October report, and “meeting demand would require utilities to boost annual generation by up to 26% by 2028.”
“We have had a period of relatively flat demand for electricity going on almost two decades now,” said Neil Chatterjee, a former chairman of the Federal Energy Regulatory Commission who last week joined the advisory board for AiDash, a satellite analytics company that uses AI to remotely monitor utility infrastructure. “I don't know that policy makers, regulators, politicians or industry are prepared for the surge in demand that is coming.”
Chatterjee said that one of the reasons he’s proud to associate himself with AiDash is that the company provides an example of how AI “can actually advance the clean energy transition and mitigate climate change risk” by better enabling utilities to address risks and challenges like vegetation management, which costs them $6 billion to $8 billion annually.
“I really think the climate case for AI needs to be made,” he said.
The U.S. has around 7 million miles of power lines, “[more than] 200 million poles and billions of trees,” said AiDash co-founder and CEO Abhishek Singh. “With an increasingly changing climate, all these assets are exposed with greater risks than they were five years back. And more importantly, the labor conditions today, in terms of how many people we have to work — there's not enough manpower to inspect these assets more and more frequently at scale.”
Recent breakthroughs in generative AI are what allow AiDash to work, Singh said. Before 2019, this type of AI use was “not possible.”
At the same time, the AI-driven surge in electricity demand is raising “a lot of complex questions,” Chatterjee said, like the issue of data center co-location.
Last month, FERC rejected an amended interconnection service agreement that would “essentially allow for powering a data center behind the meter in the footprint of an existing nuclear plant,” Chatterjee said.
While the deal would have provided the data center with power from a carbon-free baseload source, “by taking it behind the meter, you are now potentially increasing costs to ratepayers at a time when capacity is constrained and therefore are spreading out higher costs among fewer ratepayers,” he said. “And then if that deal — precedent-setting as it would be — would lead to a flood of other co-location deals coming forward, then you might potentially have a resource adequacy challenge.”
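The cost-shifting concern Chatterjee describes comes down to straightforward arithmetic. The sketch below uses entirely hypothetical numbers (the costs and loads are assumptions, not figures from the article or from FERC) simply to show how recovering the same fixed grid costs from a smaller billed load raises the per-unit rate for everyone who remains.

```python
# Illustrative only: hypothetical numbers showing how moving a large load
# behind the meter can shift fixed grid costs onto fewer ratepayers.

fixed_costs = 1_000_000_000      # annual fixed transmission/capacity costs ($), assumed
total_load_mwh = 50_000_000      # annual energy billed to all customers (MWh), assumed
data_center_mwh = 5_000_000      # large load that moves behind the meter (MWh), assumed

rate_before = fixed_costs / total_load_mwh
rate_after = fixed_costs / (total_load_mwh - data_center_mwh)

print(f"Fixed-cost recovery before: ${rate_before:.2f}/MWh")
print(f"Fixed-cost recovery after:  ${rate_after:.2f}/MWh")
# Before: $20.00/MWh; after: $22.22/MWh -- the same fixed costs spread
# across roughly 10% less billed load raise the per-MWh charge by about 11%.
```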
“I think we need leadership at the highest level on what our approach to AI adaptation will be,” Chatterjee said. “I don't think it's fair, quite frankly, that my former colleagues at FERC are caught in the middle of this dilemma over data center co-location agreements, because it's kind of a lose-lose.”
However, he said, he’s optimistic because the dilemma ultimately “comes down to a math equation” of meeting the surge in demand and maintaining reliability and affordability without backsliding on the decarbonization goals the U.S. has established.
“It’s achievable,” Chatterjee said.
Help with permitting, siting and risk assessment
When the first ChatGPT model came out in November 2022, software platform company Paces was around six months old and was working to build a permitting insights tool using conventional AI, said co-founder and CEO James McWalter.
“We actually were able to build an initial dataset, but it took a lot of money, a lot of time, and immediately it was out of date,” McWalter said. “We could not figure out a way to scale it appropriately, to service customers at a reasonable cost. So we actually paused that project completely, and we hadn't really planned to reapproach it.”
Paces came back to the project after ChatGPT’s first release, McWalter said, and is now using AI tools to help clean energy developers accelerate their projects by providing assistance with permitting, siting, interconnection and environmental risk assessment.
The company uses AI for large-scale data collection and classification tasks, as well as for producing and submitting reports.
“Our view is that within two years, pretty much all desktop analysis should be able to be automated to a very high fidelity,” McWalter said, so project developers can focus on tasks like “[attending] a town hall meeting, building a relationship with a utility — all those really, really important things that often they don't honestly have enough time to do.”
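As a rough illustration of the kind of desktop analysis and document classification McWalter describes, the sketch below asks a general-purpose language model to pull structured permitting details out of a zoning excerpt. This is a hypothetical example, not Paces’ actual pipeline; it assumes the openai Python client, an API key in the environment, and an illustrative model name.

```python
# Hypothetical sketch (not Paces' actual pipeline): using a general-purpose
# LLM to classify the kind of permitting text a developer would otherwise
# review by hand. Assumes the `openai` client and OPENAI_API_KEY are set;
# the model name is an illustrative choice.
from openai import OpenAI

client = OpenAI()

ordinance_excerpt = (
    "Ground-mounted solar energy systems over 10 acres require a conditional "
    "use permit and a 150-foot setback from residential property lines."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any capable chat model would do
    messages=[
        {
            "role": "user",
            "content": (
                "Classify this zoning ordinance excerpt for a utility-scale solar "
                "developer. Return JSON with fields: permit_type, setback_ft, "
                "acreage_threshold.\n\n" + ordinance_excerpt
            ),
        }
    ],
)

print(response.choices[0].message.content)
```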
In September, McWalter attended a White House roundtable that brought together leaders from AI companies, hyperscalers and utilities to discuss U.S. leadership in AI infrastructure and “[consider] strategies to meet clean energy, permitting, and workforce requirements” for the technology’s growing energy needs.
“There are definitely those kinds of conversations happening,” he said. “We'll see with the new administration, if that does continue, but there are definitely things being taken into account by folks at the [Department of Energy] and other government agencies.”
Paces’ mission “is to build as much clean generation [as possible] and move the entire economy off fossil fuels,” McWalter said. “So one of the things we are currently working on is enabling large load centers, like AI data centers, to have additional optionality with things like large behind-the-meter generation, co-located generation and so on.”
“Our view is that a lot of the kind of large scale load of data centers, et cetera, are basically just going to use a lot of natural gas, unless we figure out a way to add renewables into the mix,” he said.
Assisting the development of nuclear fusion
AI is also one of the new tools helping to speed research into the commercialization of nuclear fusion as an energy source, according to a November report from the Clean Air Task Force.
“I'm optimistic because there are a number of enabling technologies that are making fusion possible now — before it was much more difficult,” said Sehila Gonzales de Vicente, CATF’s global fusion director and the paper’s lead author. “One of them is AI.”
There are a number of aspects in fusion “where artificial intelligence can play a really important role and change the game in the sense that before, we were not able to address these problems,” Gonzales de Vicente said. “There was not an easy way to solve them.”
One such problem is maintaining the stability of the plasma in which fusion reactions take place. A large instability, or disruption, in the plasma will end any fusion reaction underway.
“To understand the nature of the disruptions, to understand why they are produced and to prevent them was really difficult,” Gonzales de Vicente said. “The approach was to mitigate them instead of prevent them.”
One core aspect of a “multi-institutional collaboration to develop a Fusion Data Platform for Machine Learning applications using Magnetic Fusion Energy” led by the Massachusetts Institute of Technology is DisruptionPy, “an open-source Python library designed to retrieve, process, and analyze data related to plasma disruptions,” according to the paper. “A key strength of DisruptionPy is its ability to streamline data retrieval and generate large, validated datasets for [AI] applications.”
“To be able to predict where these disruptions will take place, that is an inhuman task,” Gonzales de Vicente said. “A human being is not able to answer and to solve that.”
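To make that machine-learning task concrete, here is a toy sketch of a disruption predictor trained on synthetic stand-ins for plasma diagnostics. It is not DisruptionPy or the MIT-led platform’s code; the features, labels and model choice are all assumptions made purely for illustration.

```python
# Illustrative sketch only: a toy disruption predictor trained on synthetic
# "diagnostic" features (e.g., plasma current error, density, radiated power
# fraction). Real efforts build validated datasets from actual tokamak shots;
# this just shows the shape of the prediction problem described above.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_shots = 2000

# Synthetic stand-ins for per-shot diagnostic features.
X = rng.normal(size=(n_shots, 3))   # [current_error, density, rad_fraction]
# Toy rule: shots with large current error and high radiated fraction "disrupt".
y = ((X[:, 0] + X[:, 2]) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)

print(classification_report(y_test, model.predict(X_test)))
```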
With advancements in artificial intelligence and high-performance computing, far more optimized modeling can be done inside a computer, or in silico, she said. “You still need to build machines, but you need to build less machines, and you can optimize your parameters in a much better way.”
“Artificial intelligence is a tool which is quite transversal to basically everything. Everything related to production design, optimization of the designs, testing designs is in silico,” Gonzales de Vicente said. “Now you have the possibility of not having to build whatever piece of equipment or a whole plant. You can test it in silico. Imagine this, the savings in terms of time and money and the level of optimization that you end up with when you build that equipment. Before, you couldn't have it.”
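A minimal sketch of the in silico design loop Gonzales de Vicente describes might look like the following, under made-up assumptions (a stand-in “expensive simulation” and invented design parameters): fit a cheap surrogate model to a small number of simulation runs, then search the surrogate for promising parameters before committing to building hardware.

```python
# Minimal sketch of an "in silico" design loop: fit a cheap surrogate model
# to a handful of expensive simulation runs, then search the surrogate for
# promising design parameters before building anything. The objective and
# parameter ranges below are made-up stand-ins, not real fusion physics.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_simulation(coil_current, field_strength):
    # Stand-in for a costly physics simulation of some performance metric.
    return -((coil_current - 3.0) ** 2 + (field_strength - 1.5) ** 2)

rng = np.random.default_rng(1)
samples = rng.uniform([0, 0], [5, 3], size=(20, 2))   # 20 "simulated" designs
scores = np.array([expensive_simulation(*s) for s in samples])

surrogate = GaussianProcessRegressor().fit(samples, scores)

# Evaluate thousands of candidate designs against the surrogate instead of
# rerunning the simulation, then pick the most promising one to verify.
candidates = rng.uniform([0, 0], [5, 3], size=(10_000, 2))
best = candidates[np.argmax(surrogate.predict(candidates))]
print(f"Most promising design (coil_current, field_strength): {best}")
```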
Gonzales de Vicente said she thinks artificial intelligence is useful for all industrial processes and settings where high-tech equipment is necessary, largely thanks to its ability to optimize operations.
“In the next five years, we will have better tools. And these tools will be better developed, and then we’ll have new tools,” she said. “And that was the problem of fusion. Fusion is a very complex technology, using very difficult to design and build machines. And now is the first time that we have all these tools in our hands to make it real.”