AI Expansion Highlights Dangers Of America's Aging Power Grid
Authored by Autumn Spredemann via The Epoch Times (emphasis ours),
America’s artificial intelligence (AI) boom is colliding with an older, slower-moving entity: the nation’s aging electrical grid.

From Virginia’s data-center corridor to multi-state electricity markets, analysts, government agencies, and AI insiders say the scramble to handle technology’s expanding power demands will be an uphill battle.
At the same time, big tech companies and data centers are working to reduce the impact of AI’s expansion on the United States’ grid infrastructure. Some experts believe changes and significant investment are needed to reduce grid stress and possible energy shortages.
A primary driver of this concern is the explosion in data centers being built across the United States to support the rapid AI buildout. U.S. electricity consumption reached an all-time high in 2024, according to the U.S. Energy Information Administration (EIA). The agency expects consumption in 2025 and 2026 to be even higher.
Presently, energy demands from data centers account for about 4 percent of electricity use in the United States and 1.5 percent of the world’s electricity use, EIA data show.
Although AI’s current computing needs represent just a fraction of total energy consumption, the rate of growth has raised the question of whether the United States’ energy infrastructure can keep up.
A significant portion of America’s power grid network dates back to the 1960s and 1970s, according to the Department of Energy. As of 2023, the agency observed that 70 percent of transmission lines were more than 25 years old and nearing the end of their lifecycles.
“This has major consequences on our communities: power outages, susceptibility to cyberattacks, or community emergencies caused by faulty grid infrastructure,” the agency stated.
And that’s without any added energy demands.
The Energy Department’s Grid Deployment Office has awarded $14.5 billion in grants to improve electrical infrastructure, according to Bank of America research, which also indicates that an additional $36.9 billion in private-sector investment in U.S. grid upgrades has been made over the past couple of years.
The Bank of America analysis noted the United States is going through a period of power “load growth” primarily driven by building electrification, data centers, industrial demand, and the rise of electric vehicles (EVs).
“If load growth forecasts continue to rise, utilities will need to invest to meet required reserve margins and increase spending on both power generation and transmission and distribution capacity,” the July report said.
Perfect Storm
Even after two years of modernization efforts, the U.S. power grid remains in a race to keep upgrading while demand surges. Due to data center growth, researchers at S&P Global expect power grid requirements to increase 22 percent by the end of this year and to nearly triple by 2030.
“People keep saying the lack of chips is the problem, and it’s not. It’s a lack of power,” Tyler Saltsman, CEO of Seattle-based EdgeRunner AI, told The Epoch Times.
In recent months, part of the conversation about unsustainable AI growth has homed in on structural shifts in supporting sectors.

A Rand Technology analysis called graphics processing units, high-performance memory, and networking integrated circuits the “bedrock of AI infrastructure.” The demand for these components is rising faster than suppliers can deliver.
However, Saltsman believes a microchip shortage is a moot point if the power grid can’t support AI’s rapid buildout.
Working at the intersection of AI and energy, Saltsman’s company has three active research and development contracts with the U.S. military. From his perspective, alarm over AI and U.S. energy infrastructure isn’t overstated.
“If anything, it’s downplayed. Our grid is pretty fried ... nationwide, you see a lot of lazy [maintenance] practices,” Saltsman said.
While he hasn’t encountered any power-related issues while working on the front lines of AI, Saltsman said he expects to if data center growth continues at the current rate.
“We can make chips much faster than we can make power,” he added.
When asked what could be done to safeguard U.S. power grids, Saltsman said, “We need to commit to building nuclear reactors, and we need to do it now, but that isn’t a quick fix.”
On average, a nuclear power plant takes more than five years to build, according to the World Nuclear Association.
Meanwhile, some energy experts believe concerns over AI and power demands are legitimate, but aren’t being framed correctly.
“The risk isn’t that AI will ‘break’ the U.S. grid, rather the risk is that outdated planning, cost-allocation rules, and inflexible load assumptions will force inefficient solutions like emergency peakers or deferred retirements despite smarter and cleaner alternatives that exist,” Gaurav Shah, managing partner at Trident Renewables, told The Epoch Times.
Emergency peakers are quick-start power plants that supply electricity to the grid during periods of unexpectedly high demand. Extreme weather events or power failures at other generation sources are often the impetus for their use.
Although AI accounts for a relatively small share of America’s total electricity consumption, demand growth has been enough to require the use of peaker plants.
Peaker plants supply about 3 percent of the country’s electricity but have the capacity to produce 19 percent of it, according to a 2024 report by the Government Accountability Office.
“There are a ton of peaker plants that could operate more,” Energy Secretary Chris Wright told Reuters in an interview in September.
Shah has spent nearly 20 years working with U.S. energy infrastructure, including renewable energy, grid-connected assets, fuel transition projects, and, most recently, AI-linked energy strategy.
“This is a governance and market-design challenge more than a physics problem,” he said.
“The grid struggles with concentrated AI clusters in places like Northern Virginia, Texas, and parts of the Southeast, not because power doesn’t exist but because deliverability, redundancy, and timing don’t align,” Shah explained.
“Reforms like faster permitting for transmission upgrades and incentives for siting data centers near retiring industrial sites with existing grid headroom are much needed,” he said. “Without reforms, we are likely to see higher costs, delayed retirements of older plants, and localized reliability stress.”
“With the increase in EVs, it’s a perfect storm of factors,” Saltsman said.

He believes AI has the potential to be dangerous for U.S. electrical infrastructure. With the power grids already stressed and in need of upgrades, sudden surges in power loads—or even a rogue AI agent—could tip the scales for the worse.
“If you were to attack our power grid, you could potentially bring this country to its knees,” Saltsman said.
Regional Challenges
Shah said AI’s energy footprint is “hyperlocal,” and power grids will likely fail locally, not nationally.
He said a 100-megawatt data center in a congested area can cause more stress than 1 gigawatt of overall national growth in power demand.
Rather than operating as one seamless system, the U.S. power grid is divided into regional networks. Most of these subgrids are part of the Eastern Interconnection, the Electric Reliability Council of Texas (ERCOT), or the Western Interconnection.
The Pennsylvania-New Jersey-Maryland (PJM) Interconnection serves what’s known as “data center alley” in Virginia, which is currently experiencing unprecedented data center growth alongside soaring energy demands, according to PJM Inside Lines.
Officials for the PJM Interconnection warned that an energy capacity shortage could affect its systems as early as June 2026.
“The demand for electricity is growing at the fastest pace in years, primarily from the proliferation of data centers, electrification of buildings and vehicles, and manufacturing,” the grid operator stated.
“Regions like ERCOT and PJM face different challenges. Texas has generation but not transmission constraints. The Northeast has aging infrastructure and limited siting options. AI load growth is geographically concentrated, capital-intensive, and fast,” Shah explained.
“National averages hide the fact that a single county can suddenly need the equivalent of a mid-sized city’s power demand. Planning frameworks were not built for this,” he said.
Big tech companies are well aware they’re in the hot seat when it comes to data center energy consumption, which is why many are rapidly adopting more energy-efficient practices to reduce their load demands. Companies such as Amazon, Google, Meta, and Microsoft are among the top purchasers of renewable energy; their combined purchases last year amounted to nearly as much electricity as the entire state of Florida uses, according to an annual report by the American Clean Power Association.
Major players in tech are investing in multiple strategies to blunt the impact of data center-related power demand spikes, including energy-efficient hardware, advanced cooling systems, and power management systems, according to NZero and Flexential.
Saltsman said with the current rate of AI expansion, it’s “not going to be a pretty sight … unless you also plan to build a power plant in that same area.”
“We need a unified plan on modernizing the grid,” he said.
