The US Grid Wasn't Built For This

by Tyler Durden

Authored by Tejasri Gururaj via Interesting Engineering,

Global data center power demand is projected to hit 84 GW by 2027—a 50 percent jump from 2023 levels—with AI workloads accounting for 27 percent of that total, according to Goldman Sachs Research.

The grid is strained by increasing demand from electricity-hungry data centers and electric vehicles. Credit: Getty Images.

The grid cannot keep up with AI. For decades, electricity demand grew slowly and predictably, giving utilities comfortable margins to plan capacity years in advance. That model broke almost overnight. Between 2023 and 2024 alone, utilities’ five-year summer peak demand forecasts jumped from 38 GW to 128 GW, a more than threefold increase in a single planning cycle.
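The scale of that revision is easy to check with the figures above; a quick sketch, using only the numbers from this article:

```python
# Utilities' five-year summer peak demand forecasts (GW), from the article.
forecast_2023 = 38    # 2023 planning cycle
forecast_2024 = 128   # one planning cycle later

multiple = forecast_2024 / forecast_2023
print(f"Forecast grew {multiple:.1f}x in a single planning cycle")  # ≈ 3.4x
```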

Unlike traditional server loads, which are relatively flat and predictable, AI inference and training jobs generate sharp, near-instantaneous power spikes. Large-scale GPU clusters can produce fluctuations of hundreds of megawatts within seconds. That’s a load behavior utilities have no historical model for.

Energy companies are no longer treating hyperscale data centers as large customers to be served from the grid, but rather as anchor infrastructure to be co-built with.

What follows is a look at what that shift actually demands at the systems level — why natural gas is currently the only tool that can fill the gap at the required speed and scale, what that means for emissions commitments already being made today, and what the longer path to balancing this with storage, transmission, and cleaner alternatives realistically looks like.

Why natural gas is filling the gap today

The US currently generates around 40 percent of its electricity from natural gas, with coal and renewables making up most of the rest. Neither of those, however, can meet the needs of AI data centers, which require firm, uninterrupted, gigawatt-scale power around the clock. The present US grid is already under strain before data centers even enter the equation.

U.S. power grid voltage levels and customer classes. Credit: United States Department of Energy/Wikimedia Commons.

Renewables hit a hard wall here. Interconnection requests for new solar and wind projects face median wait times of over four years. In contrast, natural gas is cheap, abundant, and already flows through an extensive pipeline network across the country. And unlike new solar or wind projects, gas plants can be up and running in three to five years.

Even so, three to five years is not immediate. Demand is here now, and the gap between what the grid can deliver today and what data centers need is already being felt. Energy companies are trying to figure out how to keep up with this demand in different ways.

Entergy is spending $3.2 billion to build three natural gas plants totaling 2.3 GW specifically to power Meta’s new Louisiana data center, which requires 2 GW for computation alone. These plants carry a typical operational lifetime of around 30 years.

Others are betting that the infrastructure will attract the tenant. NextEra Energy, the US’s largest renewable developer, is partnering with ExxonMobil to build a 1.2 GW gas plant in the Southeast. CEO John Ketchum summed up the industry’s new posture: the AI sector is shifting toward “BYOG” — build your own generation.

Rethinking the engineering playbook

Power grids are engineered for predictability. Seasonal peaks, industrial cycles, and population growth are modeled to plan generation capacity for the future. Fitting AI into this picture requires much more than just scaling.

Training a large language model means thousands of GPUs running simultaneously, sustaining enormous power draws for days or weeks, then dropping off sharply. These swings are unpredictable and can be extreme. Dispatch curves determine which plants run when, while reserve scheduling ensures backup capacity is always available; AI workloads stress both in ways existing planning tools were never built to handle. The resulting forecasting crisis is visible in the numbers: the more-than-threefold jump in five-year peak demand forecasts between 2023 and 2024.

Developers routinely file speculative interconnection requests for projects that never get built, flooding queues with phantom demand. ERCOT, Texas’s grid operator, developed an entirely new Adjusted Large Load Forecast methodology to account for exactly this — the gap between projected data center load and what actually materializes.
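ERCOT’s actual methodology is more involved, but the core idea of discounting speculative queue entries can be sketched with stage-by-stage realization rates. All figures below are invented for illustration; they are not ERCOT’s numbers.

```python
# Toy illustration of discounting phantom demand in an interconnection queue.
# Queue volumes (GW) and realization rates are hypothetical.
queue_gw = {
    "signed_contracts": 10.0,   # load with executed agreements
    "advanced_studies": 25.0,   # projects in detailed study
    "early_inquiries": 65.0,    # speculative requests, little commitment
}
realization_rate = {
    "signed_contracts": 0.9,
    "advanced_studies": 0.5,
    "early_inquiries": 0.1,
}

adjusted = sum(gw * realization_rate[stage] for stage, gw in queue_gw.items())
print(f"Raw queue: {sum(queue_gw.values()):.0f} GW")        # 100 GW
print(f"Adjusted forecast: {adjusted:.1f} GW")              # 28.0 GW
```

The point of the exercise: a 100 GW queue can plausibly translate into a fraction of that in delivered load, and planning to the raw number would mean massive overbuild.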

At the plant level, this is forcing a redesign of how generation assets are dispatched. When an AI model responds to a user query, it triggers a sudden, large power surge known as an inference spike. Gas peakers — plants designed for short, high-output bursts — are now being co-located with data center campuses specifically to absorb these inference spikes that baseload plants can’t respond to fast enough.
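To see why peakers get this job, compare ramp times for a hypothetical 300 MW spike. The ramp rates below are rough, assumed figures, not data from any specific plant — and even peakers respond in minutes rather than seconds, which is why batteries and on-site generation also enter the picture.

```python
# Time for different plant types to ramp up a hypothetical 300 MW spike.
# Ramp rates are ballpark assumptions for illustration only.
spike_mw = 300
ramp_mw_per_min = {
    "coal baseload": 15,            # large steam units ramp slowly
    "combined-cycle gas": 50,
    "simple-cycle gas peaker": 150, # fastest conventional option
}
for plant, rate in ramp_mw_per_min.items():
    print(f"{plant}: ~{spike_mw / rate:.0f} min to cover the spike")
```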

The DOE’s National Transmission Needs Study identified transmission congestion as already acute across multiple regions before this wave of demand arrived.

Transmission and cost crunch

The physical grid is buckling under the same pressure. Transmission investment in many regions of the US declined steadily after 2015, leaving a system already running close to its limits. Now it’s being asked to absorb demand at a scale it was never designed for.

In Texas, CenterPoint Energy reported a 700% increase in large load interconnection requests between late 2023 and late 2024. In Virginia, another 50 GW of data center projects sit active in the queue. The costs reflect the strain.

Combined-cycle gas turbines (CCGTs) capture waste heat to generate additional electricity, making them efficient enough for round-the-clock demand. Installed costs for new CCGTs have nearly doubled to around $2,000/kW compared to plants built just a few years ago.

The market data tells the same story. The capacity market clearing price, the rate utilities pay to secure guaranteed power reserves for peak demand, has also surged. In PJM, the grid operator covering much of the Mid-Atlantic and Midwest, the clearing price for the 2026-27 delivery year jumped to $329/MW-day — more than ten times the $28.92/MW-day price from two years prior.
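What that clearing price means for a large load can be roughed out. The example below assumes a hypothetical 1 GW data center paying the full clearing price on its entire load, every day of the year — a simplification that ignores PJM’s actual accreditation and load-share mechanics.

```python
# Rough annual capacity cost for a hypothetical 1 GW load in PJM.
# Simplification: full clearing price applied to all 1,000 MW, 365 days.
price_2026_27 = 329.0   # $/MW-day, 2026-27 delivery year
price_prior = 28.92     # $/MW-day, two years earlier
load_mw = 1000

def annual_cost(price_per_mw_day):
    return price_per_mw_day * load_mw * 365

print(f"At 2026-27 prices: ${annual_cost(price_2026_27) / 1e6:.0f}M per year")
print(f"Two years earlier: ${annual_cost(price_prior) / 1e6:.1f}M per year")
```

Under these assumptions, the same gigawatt of capacity obligation goes from roughly $11M to roughly $120M a year.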

A map of the U.S. high-voltage transmission grid. Credit: Wikideas1/Wikimedia Commons.

The long game: emissions costs

The gas plants being built today aren’t just a bridge to the AI boom; they’re a commitment. With an average operational lifetime of 30 years, they will still be running well past every major net-zero target on the books.

A natural gas plant emits around 490g of CO2 per kilowatt-hour over its lifetime. Scale that across the gigawatts of new capacity being greenlit today, and the emissions math becomes difficult to ignore.
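Putting numbers on that: a sketch of emissions for a 2 GW gas fleet, the scale of the Entergy build above, using the 490 g/kWh figure and assuming continuous full-output operation — so treat it as an upper bound.

```python
# Upper-bound CO2 sketch for 2 GW of gas capacity running flat out.
capacity_kw = 2.0e6            # 2 GW expressed in kW
intensity_kg_per_kwh = 0.490   # lifecycle figure cited in the article
hours_per_year = 8760
lifetime_years = 30

annual_mt = capacity_kw * hours_per_year * intensity_kg_per_kwh / 1e9
print(f"~{annual_mt:.1f} Mt CO2 per year")                        # ~8.6 Mt
print(f"~{annual_mt * lifetime_years:.0f} Mt over a 30-year life")  # ~258 Mt
```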

Across the southern US, utilities are planning around 20 GW of new gas capacity over the next 15 years, with data centers accounting for 65 to 85% of projected load growth in Virginia, South Carolina, and Georgia alone. The methane problem compounds this.

Natural gas infrastructure (drilling, pipelines, compression) leaks methane continuously, both accidentally and through intentional venting. Methane traps around 80 times as much heat as CO2 over a 20-year horizon, making the emissions from a buildout of this scale difficult to quantify but impossible to ignore.
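For a rough sense of scale, the snippet below converts an assumed supply-chain leak rate into CO2-equivalent per kWh generated. Only the 80x warming figure comes from the text; the gas energy content, plant efficiency, and 2 percent leak rate are assumptions for illustration.

```python
# Back-of-envelope CO2-equivalent of methane leakage, per kWh generated.
gwp20 = 80              # CH4 vs CO2 over 20 years (from the article)
gas_mj_per_kg = 55.0    # approx. energy content of natural gas (assumed)
efficiency = 0.55       # modern combined-cycle efficiency (assumed)
leak_rate = 0.02        # assumed 2% of gas leaked across the supply chain

kg_gas_per_kwh = 3.6 / efficiency / gas_mj_per_kg   # 1 kWh = 3.6 MJ
leak_g_co2e = kg_gas_per_kwh * leak_rate * gwp20 * 1000
print(f"~{leak_g_co2e:.0f} g CO2e/kWh from leakage alone")  # ~190 g
```

Under these assumptions, leakage alone adds on the order of 190 g CO2e/kWh on top of the ~490 g/kWh combustion figure, which is why the leak rate matters so much to the overall accounting.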

This is the policy fault line now opening up between energy companies, hyperscalers with net-zero commitments, and regulators who are only beginning to grapple with what AI’s energy appetite actually means for decarbonization timelines.

Policies and incentives

Several structural mechanisms are being put in place to eventually shift the balance, though none of them work fast enough to solve the immediate problem.

On the storage side, the Inflation Reduction Act of 2022 offers a 30% tax credit for standalone energy storage systems and zero-emission generation facilities placed in service after 2024. The credit applies not just to generation technologies like solar but also to storage infrastructure itself. This gives data center operators and utilities a financial reason to invest in battery systems needed to make renewables work around the clock.

On the generation side, nuclear is emerging as a leading zero-carbon option for AI data centers, given its ability to deliver firm, always-on power. Google is already moving in this direction, striking a deal with NextEra to restart the 615 MW Duane Arnold nuclear facility for 24/7 carbon-free power.

Transmission remains the hardest problem. A study by the Department of Energy identified significant transmission capacity gaps across nearly every US region — gaps that predate the AI demand surge and will take years of coordinated investment and permitting reform to close.

The path forward

AI’s power demands are arriving faster than the infrastructure built to serve them. The gas plants, the transmission upgrades, the storage credits, the nuclear restarts: none of it is moving at the speed the technology is.

At some point, that gap has to close. The question is whether it closes through deliberate investment and policy coordination, or through something more painful: power shortages, delayed data centers, and electricity bills that reflect the true cost of building a grid that wasn’t designed for this moment. Engineers and policymakers are working on the former. The clock is running on the latter.
