AI models, and the data centers filled with chips that power them, require an enormous amount of electricity. And the industry is scrambling to keep pace.
After a decade of largely flat electricity needs, somewhere around 50 gigawatts of new power capacity — or enough to run roughly 40 million homes — will be required in the US to sustain the AI boom, according to Goldman Sachs.
But should some of the marquee deals Big Tech giants are signing not materialize, utility companies and their customers could be stuck footing the bill.
“Many of the [grid connection requests] appear to be from a developer that is proposing data centers in multiple utility service territories looking for, ‘Where can I connect the fastest? Where can I get a deal?’” said Brendan Pierpont, director of electricity modeling at the research firm Energy Innovation.
But the speed-at-all-costs approach has its risks. “What are the long-term business models? How much compute will those services actually require? [There’s] just huge amounts of uncertainty about that whole space,” Pierpont said.
The process for turning power demand into power generation takes years.
When a utility receives a power load request — such as from a tech company looking for, say, 2 gigawatts for a new data center — the utility spends millions buying equipment and materials and hiring the personnel to make it happen.
Should demand ultimately fall short of estimates, utilities can be stuck with stranded assets generating no revenue. Their options then are to find a way to pass that cost — which averages around $102 per kilowatt, or $102 million for a 1-gigawatt load — on to ratepayers or write down the loss themselves.
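As a back-of-the-envelope check on that figure, here is a minimal sketch. Only the $102-per-kilowatt average comes from the reporting above; the `stranded_cost` helper and the example load sizes are illustrative assumptions.

```python
# Hypothetical check of the stranded-cost arithmetic cited above.
# The $102/kW average is from the article; load sizes are illustrative.
COST_PER_KW = 102  # average stranded cost, in dollars per kilowatt

def stranded_cost(load_gw: float) -> float:
    """Return the stranded cost in dollars for a load given in gigawatts."""
    kilowatts = load_gw * 1_000_000  # 1 GW = 1,000,000 kW
    return kilowatts * COST_PER_KW

print(stranded_cost(1))  # a 1-gigawatt load -> 102000000.0, i.e. $102 million
print(stranded_cost(2))  # a 2-gigawatt request -> 204000000.0
```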
AI has pushed a bevy of tech companies into deals with utilities across the country, but some cracks have started to show in these best-laid plans as the AI boom rapidly evolves.
Microsoft (MSFT), one of the largest data center developers in the country, decided in March to walk away from proposed data center projects in the US and Europe with a combined 2-gigawatt load, according to Bloomberg.
While it is unclear whether utilities had begun to spend money building out connections for those projects, the retreat illustrates the potential threat: TD Cowen analysts attributed Microsoft's pullback to an oversupply of the computers that power AI technology.
This past week, Monitoring Analytics, the independent market monitor for PJM Interconnection, filed a brief with the Federal Energy Regulatory Commission arguing that the federal regulator should reject a recently signed transmission agreement between Pennsylvania utility PECO Energy and Amazon’s (AMZN) Data Services division.