

Some like to imagine that the digital world has freed us from the physical one, that computing lives somewhere above the friction of pipelines and permits and power lines, in a clean, abstract place we have named, without irony, the cloud. That illusion is dispelled the moment a transformer shortage delays the construction of data centers.
To give a sense of the scale of electricity required to keep building, consider the following numbers from federal laboratories and utility research organizations: In 2023, American data centers consumed approximately 176 terawatt-hours of electricity, or 4.4% of all the electricity the country used. By 2028, that figure is expected to reach somewhere between 325 and 580 terawatt-hours. By 2030, data centers could account for between 9% and 17% of national electricity consumption. Virginia, which already hosts more data-center capacity than any other state, could find itself directing between 39% and 57% of its electricity to the machines by decade’s end.
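As a back-of-envelope check, the cited figures can be run through a few lines of Python. The numbers are the ones reported above; the arithmetic merely makes the implied totals explicit and is a sketch, not an independent source.

```python
# Back-of-envelope check of the consumption figures cited above.
# All inputs are the article's numbers; the arithmetic is illustrative only.

dc_2023_twh = 176          # US data-center consumption, 2023 (TWh)
share_2023 = 0.044         # 4.4% of national electricity use

# Implied total US electricity consumption in 2023:
total_2023_twh = dc_2023_twh / share_2023   # about 4,000 TWh

# Projected 2028 data-center range (TWh):
low_2028, high_2028 = 325, 580

# If total consumption stayed flat, the 2028 range would imply shares of:
low_share = low_2028 / total_2023_twh    # about 8%
high_share = high_2028 / total_2023_twh  # about 14.5%

print(f"Implied 2023 total: {total_2023_twh:.0f} TWh")
print(f"2028 share at flat demand: {low_share:.1%} to {high_share:.1%}")
```

The flat-demand assumption is deliberately crude, but it shows why the 9%-to-17% band for 2030 is arithmetically plausible rather than alarmist.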
What Sightline Climate reported in February 2026 is clarifying in its plain arithmetic: At least 16 gigawatts of data-center capacity were supposed to come online in the United States during 2026. Of that, roughly five gigawatts were under construction. The remaining 11 gigawatts had been announced but showed no construction progress. Given this delay, between 30% and 50% of the year’s projected capacity is unlikely to exist by year’s end. The buildings are failing to rise not because of any shortage of ambition or capital, but because the electrical systems that would bring them to life are themselves a constrained resource.
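The Sightline Climate arithmetic above can be made concrete in a few lines. Again, the inputs are the reported figures; the code simply spells out what the 30%-to-50% band means in gigawatts.

```python
# Sketch of the capacity arithmetic cited above (illustrative only).

planned_gw = 16.0          # capacity slated to come online in the US in 2026
building_gw = 5.0          # actually under construction
stalled_gw = planned_gw - building_gw   # announced, no construction: 11 GW

# The reported 30%-50% "unlikely to exist" band, in gigawatts:
low_missing = 0.30 * planned_gw    # 4.8 GW
high_missing = 0.50 * planned_gw   # 8.0 GW

print(f"Stalled: {stalled_gw:.0f} GW; "
      f"projected shortfall: {low_missing:.1f} to {high_missing:.1f} GW")
```

Notice that even the high end of the band (8 gigawatts) is smaller than the 11 gigawatts with no construction progress, meaning the estimate already assumes some stalled projects will still make it by year's end.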
Tech acceleration, electricity slowdown
The technology industry tends to describe its obstacles as temporary inconveniences, friction to be optimized away. The power problem is not that kind of obstacle. A modern data center, before it is a real-estate asset or a monument to computational ambition, is an electrical system. IT equipment can represent 95% of its total demand. Cooling systems are themselves power-electronics loads. For the highest-performing AI facilities, operators have begun to omit traditional backup protection for their servers, relying instead on software checkpointing and restart logic, because the weight of the electrical infrastructure has become something to work around. “Construction complete” does not mean the facility can be turned on. Completion requires that the utility interface, the internal electrical architecture, the backup strategy, and the thermal system are synchronized and tested. The construction is the least of it.
The physical bottleneck is in a part of the supply chain that is rarely mentioned. Distribution transformers, the equipment that turns high-voltage transmission power into the voltages that buildings can actually use, are in short supply. Lead times that ran three to six months in 2019 stretched to 12 to 30 months by 2023. Large power transformers are custom-made, difficult to substitute, expensive to stockpile, and dependent on grain-oriented electrical steel, aluminum, and copper, all of which faced their own post-pandemic constraints. When a hyperscale campus needs utility service, substation capacity, and specialized transformer equipment at the same moment as the broader grid, delay is the usual result.
Why they weren't ready
A relevant historical parallel is the railroad age. There is a structural similarity beyond the lazy metaphor that “data centers are the new railroads.” The railroads began as a private development wave and became a problem of political economy. They forced changes in public regulation, organizational form, and the distribution of costs and benefits that their builders had not anticipated. The data center is following a similar path. American electricity demand is being reshaped at a pace unseen since the postwar industrial boom, but with a crucial difference: Today’s growth arrives in enormous concentrated parcels, in specific counties, on venture-capital timescales. The grid, in contrast, expands on utility and regulatory timescales. These are not the same.
The political consequences are becoming visible at an unsettling pace. The Federal Energy Regulatory Commission ordered PJM Interconnection, a regional grid operator, in December 2025 to write clearer rules for serving AI-driven data centers. The North American Electric Reliability Corporation reported in 2025 that 13 of its 23 assessment areas face resource-adequacy challenges over the next decade. The Energy Information Administration announced in March 2026 that it was launching pilot studies on data-center energy use, covering electricity consumption, cooling systems, server metrics, and site characteristics. For two decades, data centers were background infrastructure. The regulatory apparatus of the federal government now wants new instruments to see them clearly.
The grid at a crossroads
Maine's legislature, in April 2026, passed the first statewide moratorium on large-scale data centers in the United States, which would have halted approvals for facilities above 20 megawatts while a state council studied grid, air, water, and cost impacts. Only Democratic Gov. Janet Mills' veto stopped the push (for now). In Mississippi, a lawsuit accused a major AI company of operating gas turbines near Memphis without the required permits, the speed-to-power logic having collided with environmental permitting. In March 2026, the Trump administration announced a pledge under which major hyperscalers agreed to build or buy new generation and cover the cost of power-delivery upgrades rather than passing those costs to households. Whatever the durability of that commitment, the political signal is clear: Once officials begin publicly assuring households that they will not be asked to subsidize AI infrastructure, the issue has moved from sectoral regulation to the politics of fairness.
The “cloud” always involved a rhetorical stance. It described a physical system as if geography, electricity, and equipment lead times were incidental to it. The transformer shortage, the interconnection queue, and the emergency turbines pierce through that description. The internet reappears as pipes, wires, substations, permits, emissions, and cost-allocation fights. It arrives in a specific county, draws on a specific grid, and asks specific communities to absorb consequences that were designed, by the grammar of cloud computing, to belong to no one.
Delay is the form this revelation takes. It forces governments to decide what may be built, at whose cost, and on whose timeline.