Axios, Bloomberg, and the Financial Times contributed to this report.
The AI boom is setting off a quiet but high-stakes battle over electricity — and it’s moving fast.
At the center of it: how to power the massive data centers feeding artificial intelligence. Do companies plug into the public grid like everyone else, or build their own self-contained energy systems and go it alone?
That question is quickly turning into a defining fight for the future of both tech and energy.
Demand is exploding. Today’s data centers don’t just sip electricity — they gulp it, often consuming as much power as entire cities. And with billions pouring into AI infrastructure, the pressure to get these facilities up and running is intense.
That urgency is already reshaping decisions on the ground. Chevron, for example, is working on a plan to build a dedicated natural gas plant just for a Microsoft data center in Texas. It’s one of several signs that “energy islands” — self-powered facilities — are gaining traction.
Not long ago, that idea barely registered. Now, roughly 30% of planned data center power capacity is expected to come from on-site generation, according to Cleanview. Some analysts think that share could climb much higher, maybe even hitting 50% if current trends hold.
The appeal is simple: speed.
Hooking up to the grid can take years. There are queues, regulatory hurdles, and a growing backlog as utilities struggle to keep up with demand. Building power on-site cuts through that. Companies get control, faster timelines, and fewer bottlenecks.
For developers racing to deploy AI infrastructure, that’s hard to ignore.
“Speed is the competitive currency,” said Crusoe co-founder Cully Cavness recently.
In some cases, he added, going off-grid is just quicker — and that’s what matters.
There’s also a side benefit. Keeping these energy-hungry facilities off the grid could ease pressure on local electricity systems, shielding households and businesses from potential strain or price spikes. Natural gas companies, in particular, are leaning into that argument.
But not everyone’s sold.
Critics say cutting data centers off from the grid could backfire. A shared system spreads costs and offers backup when something goes wrong. Standalone setups have to overbuild — more capacity, more redundancy — just to match that reliability.
That can get expensive, fast.
“If we decouple AI from the grid, everybody loses,” said Emerald AI founder Varun Sivaram.
Higher costs for AI companies. Lost revenue opportunities for utilities. A weaker overall system.
Even some tech players are wary. Google’s energy team has warned that building isolated systems means duplicating infrastructure, which can drive up costs without delivering long-term benefits.
In reality, the future may land somewhere in between.
Some companies could start off-grid to get projects moving, then connect later once the grid catches up. Others may stay hybrid, keeping one foot in each camp.
NextEra Energy CEO John Ketchum put it bluntly: most big tech players will eventually want “an extension cord” to the grid.
Meanwhile, regulators are scrambling to keep pace. Federal officials have already pushed for new rules governing how data centers pair with power plants, and more changes are likely as the boom accelerates.
Still, policy can only do so much.
Companies with deep pockets aren’t waiting around. If they need power, they’ll build it, right next to their servers if necessary. Whether to connect to the grid is a decision that can come later.
That reality is already shifting the balance. Governments move slowly. Tech companies don’t.
And as the AI race heats up, speed is winning.
There’s another wrinkle: public pushback. A recent Harvard/MIT survey found mixed support for new data centers. About 40% of people are on board, while roughly a third oppose them.
Electricity prices are part of the concern, but not the main one. People are more worried about how these massive facilities will change their communities — noise, land use, and overall quality of life.
At the same time, states are starting to act. Some are considering charging data centers higher electricity rates to offset the strain they place on the system. Others are rethinking how costs are shared.
Behind it all is a deeper problem: the grid itself is struggling to keep up.
A shortage of critical equipment, including transformers, switchgear, and batteries, is slowing projects and driving up costs. Deliveries that once took two years can now take five. Developers are even refurbishing old equipment just to stay on schedule.
The result is a scramble. Money isn’t the issue. Supply chains are.
And looming over everything is a bigger question: are we building too much, too fast?
Some investors warn the rush to power AI could lead to overbuilding — plants that end up underused if demand doesn’t match expectations or if technology becomes more efficient.
That risk cuts both ways. Extra capacity could eventually feed back into the grid, boosting supply and lowering costs. Or it could leave someone footing the bill for infrastructure that isn’t fully needed.
Either way, the stakes are enormous.
AI’s growth isn’t slowing down. But the energy needed to sustain it — and how that energy gets delivered — is still very much up for grabs.