Drive a few miles outside Cheyenne, Wyoming, and the landscape looks familiar at first: hay fields, cattle, long views. Then the horizon breaks – concrete shells the size of shopping malls, ringed by transformers, diesel tanks, and brand-new substations. The locals still call it ranch country. On Wall Street and in Silicon Valley, they call it “AI infrastructure.”
This is the Great Build-Out: a global rush to pave over fields with data centers so that artificial intelligence, cloud computing, and our increasingly online lives have somewhere to live.
Economically, it looks like a gold rush. Politically, it’s becoming a street fight. And for states like Wyoming, it’s either a once-in-a-generation industrial investment… or a very expensive mirage.
Let’s unpack the economics behind this build-out – and what it really means for energy bills, jobs, and communities.

We have been tracking this story, documenting how “hyperscale” data centers – built for giants like Microsoft, Meta, and OpenAI – are suddenly landing in places that used to host cows, coal mines, or nothing much at all.
This reporting echoes a broader pattern:
Business Insider used environmental permits and land records to map hundreds of existing and proposed US data centers tied to the AI boom, stretching from Northern Virginia and Georgia to Texas, Ohio, and the Mountain West.
Industry registries like DataCenterMap now list multiple active and planned facilities even in historically low-tech states like Wyoming, with clusters around Cheyenne and other grid-connected towns.
A global visualization by Visual Capitalist shows dense “constellations” of facilities in the United States and Western Europe, with fast-growing hubs in Asia, Latin America, and the Middle East.
So what’s actually being built?
Economically, a modern AI-ready data center looks less like an office building and more like a power plant wearing a hoodie:
Construction itself needs billions of dollars in land, concrete, steel, cooling systems, and – most importantly – racks of GPUs that cost more per rack than many houses.
The required energy is also staggering. A single large campus can demand 200–300 megawatts or more – roughly the peak load of a small US city or even an entire rural state.
But energy is not the whole story. Market researchers at GMI forecast that the data center cooling sector – chillers, liquid cooling, heat reuse systems – will expand rapidly over the next decade as AI pushes chip densities and temperatures higher.
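To put those megawatt figures in household terms, here is a back-of-the-envelope sketch. The load factor and household consumption numbers are illustrative assumptions, not measured data:

```python
# Back-of-the-envelope: what does a 250 MW data center campus mean in
# household terms? All inputs below are illustrative assumptions.

CAMPUS_MW = 250                  # midpoint of the 200-300 MW range cited above
LOAD_FACTOR = 0.8                # assumed: AI campuses run near-continuously
HOURS_PER_YEAR = 8760
HOUSEHOLD_KWH_PER_YEAR = 10_500  # rough US average annual household use

annual_mwh = CAMPUS_MW * HOURS_PER_YEAR * LOAD_FACTOR
households = annual_mwh * 1_000 / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual consumption: {annual_mwh / 1e6:.2f} TWh")   # 1.75 TWh
print(f"Equivalent households: {households:,.0f}")         # 166,857
```

On these assumptions, one campus consumes as much electricity in a year as roughly 170,000 homes – which is why grid planners treat each one like a small city.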

It’s no surprise, then, that US construction firms are quietly reorganizing around this demand. Industry coverage notes that after a broader construction slowdown, contractors now “firmly prioritize” data centers as a leading growth segment, with specialized builders increasingly focused on hyperscale campuses and associated grid upgrades.
Brookings, in a detailed look at the “future of data centers,” describes them as capital-intensive, land-hungry, and labor-light – a new kind of infrastructure that underpins everything from banking to streaming to AI, but doesn’t look much like the smokestack industries of the past.
In places like Wyoming, that mix – big construction money, modest long-term employment, heavy utility needs – is exactly why the politics are getting complicated.
Is the build-out actually necessary? From the perspective of big tech, the answer is almost insultingly simple: yes, the build-out is necessary. Their earnings calls and public messaging basically boil down to: no data centers, no AI.
Recent reporting from CNN, CNBC, and The New York Times shows the same pattern in quarterly results: Microsoft, Amazon, Google, and Meta are pouring tens of billions of dollars per quarter into AI chips and the data centers to house them, even as some investors worry about margins.
The companies themselves are happy to spell out the scale:
Meta says it plans to commit over $600 billion in the US by 2028 for AI technology, infrastructure, and workforce expansion. The company claims its data center projects since 2010 have supported over 30,000 skilled-trade jobs and about 5,000 operational jobs, and that it has helped enable around 15 gigawatts of new energy on US grids through its investments and power contracts.
OpenAI, in a letter to the White House about its “Stargate” mega-campus, argues that a $1 trillion wave of AI infrastructure could add more than five percentage points to US GDP growth over three years. But it also warns that making this possible could require the US to add 100 gigawatts of new power capacity every year – roughly twice what the country added in 2024 – and potentially mobilize one-fifth of the nation’s skilled trades workforce.
A Fortune piece on Stargate notes that OpenAI, Nvidia, and Oracle are collaborating on the complex, raising antitrust questions about whether rivals jointly planning such massive infrastructure runs afoul of 135 years of US antitrust precedent.
At the same time, a Bloomberg feature and Reuters reporting highlight mounting concerns from grid operators and utilities: AI data centers are emerging as one of the fastest-growing sources of electricity demand worldwide, forcing planners to re-do decade-old assumptions about flat or modest demand growth.

Utilities, in turn, see both risk and opportunity. Business Insider describes an unfolding “AI power bubble”: some utilities are using very aggressive demand forecasts from data center developers to justify massive new generation and transmission projects – costs that could show up in ordinary customers’ electric bills even if not all of the proposed data centers are actually built. In one comparison, McKinsey estimates global data centers will need about 219 gigawatts by 2030, while 26 large US utilities say they are planning for data center projects totaling over 700 gigawatts – almost equivalent to current continental US summer peak demand.
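The gap between those two numbers is easy to see in a two-line calculation (figures are the ones from the reporting above, in gigawatts):

```python
# Illustrative gap between an analyst demand estimate and the sum of
# utility plans, using the figures cited in the reporting above (GW).

mckinsey_global_need_gw = 219   # McKinsey estimate of global data center need by 2030
utility_plans_gw = 700          # total planned by 26 large US utilities

overbuild_ratio = utility_plans_gw / mckinsey_global_need_gw
print(f"US utilities alone are planning {overbuild_ratio:.1f}x "
      f"the projected *global* need")
```

Even allowing for forecast uncertainty, a subset of one country's utilities planning for roughly three times the projected global requirement suggests a lot of double-counted or speculative load.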
From a strictly corporate accounting angle, there’s a debate about whether this is sustainable or a bubble:
Wired leans toward the “bubble” view, arguing that AI investing ticks all the historical boxes: huge uncertainty, powerful narratives, equity hype, and heavy leverage – more aviation-and-radio-in-the-1920s than responsible infrastructure spending.
Another Business Insider analysis pushes back on one popular worry – that GPUs will become obsolete so fast that data centers are stuck with billions of dollars of dead hardware. Interviews with chip analysts and cloud operators suggest that GPUs stay profitable for five to eight years as they migrate from cutting-edge training to less demanding inference workloads, which makes current depreciation schedules look more reasonable.
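The depreciation argument is easiest to see with a toy straight-line schedule. The fleet cost and useful lives below are hypothetical, purely to illustrate why the assumed lifespan dominates the accounting:

```python
# Sketch: straight-line book value of a GPU fleet under different assumed
# useful lives. If accelerators stay productive for 5-8 years (training
# first, then cheaper inference), a 6-year schedule is not obviously
# aggressive. Fleet cost is hypothetical.

def book_value(cost: float, age_years: int, useful_life_years: int) -> float:
    """Remaining book value under straight-line depreciation, floored at zero."""
    depreciated = cost * age_years / useful_life_years
    return max(cost - depreciated, 0.0)

fleet_cost = 1_000  # millions of dollars, hypothetical
for life in (3, 6, 8):
    value = book_value(fleet_cost, 3, life)
    print(f"{life}-year life, value at year 3: ${value:.0f}M")
```

Under a 3-year life the fleet is worthless at year three; under an 8-year life it still carries most of its value. The bubble question largely reduces to which assumption matches reality.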
In short: tech companies see data center build-out as a prerequisite for AI; skeptics worry that we’re building ahead of demand and will end up with a lot of stranded assets and higher utility bills.
Policy analysts at ITIF argue that, like it or not, data centers should be treated as critical infrastructure, comparable to ports or highways: the US needs them to stay competitive, and the real question is how to integrate their energy demand into a cleaner, more reliable grid.
That question – how, not whether – leads directly into the fight now playing out between communities, companies, and governments.
The Great Build-Out Part 2 documented just how political this has become: zoning fights, county-level elections decided by data center issues, and pressure on state officials to either “roll out the red carpet” or “hit pause” on new projects.
In Washington, the Bipartisan Policy Center has laid out a roadmap of “strategic federal actions” that link AI policy to energy infrastructure: speeding up transmission permitting, funding grid modernization, and aligning data center planning with national security and climate goals.
At the same time, members of Congress are starting to talk about guardrails. Politico reports that Rep. Adam Schiff is working on legislation to better monitor data centers’ impacts on grids, water, and local communities – potentially including transparency rules and planning requirements.

States are experimenting even faster. In Wisconsin, Democratic lawmakers have pitched statewide standards for data centers, pushing for consistent rules on energy use, water, and noise so that local governments aren’t negotiating from scratch on each project.
Michigan recently approved “landmark regulations” that shift more upfront costs – like grid connections and environmental assessments – onto data center developers instead of ratepayers, a sign that states may be growing wary of open-ended subsidies.
In the UK, a House of Commons research briefing frames data centers as critical national infrastructure, but also notes that they already consume about 2.5% of UK electricity and could see consumption quadruple by 2030. The briefing highlights how high power prices and grid constraints limit growth, and how planning reforms are giving data centers fast-track status in “AI growth zones.”
China, meanwhile, is turning the issue into a geopolitical tool, reportedly banning foreign AI chips in state-funded data centers to bolster domestic suppliers and tighten control over strategic digital infrastructure.
Closer to the ground, counties and cities are scrambling to catch up. TechPolicy.Press describes how local officials are learning, sometimes painfully, how to negotiate with sophisticated data center developers – hiring their own consultants, pooling knowledge across jurisdictions, and trying to trade tax breaks for more tangible community benefits like road improvements, STEM programs, or guaranteed local hiring.
Stateline and Inside Climate News both show how data centers are rapidly becoming a bipartisan flashpoint. Residents complain about secrecy, land use, and rising electricity bills; politicians who once saw these projects as easy “wins” now face protests and primary challenges.
One big source of distrust: non-disclosure agreements. NBC News has reported on deals where tech companies and utilities sign NDAs around large data center projects – complete with code names – leaving local residents and even some local officials in the dark about how much power and water is being committed for decades.
In Latin America, The Guardian documents how communities in Chile and Uruguay have gone to court to pry loose basic information about water use and diesel backup at Google data centers. A Mozilla fellow interviewed there describes a pattern: national governments court data center investment with tax exemptions and deregulatory moves, while local communities struggle just to learn how much water will be diverted from already drought-stressed systems.
Canadian Socialist Project goes further, framing “data center resistance” as part of a broader environmental justice struggle against energy-intensive infrastructure that primarily benefits multinational firms.
Faced with this pushback, tech companies are trying to show they can be part of the solution rather than just another big industrial polluter.
Reuters reports that Big Tech has become a dominant buyer of long-term carbon removal, locking in large volumes of credits and contributing to a supply crunch. That appetite underscores how much residual emissions they expect from their operations, even after aggressive renewable energy sourcing.
Meta emphasizes that it works closely with utilities and pays for grid upgrades associated with its data centers, and that its projects have helped bring online around 15 GW of new energy capacity while moving toward water-positive operations by 2030.

ITIF and others argue that with careful siting and grid planning, data centers can accelerate decarbonization by providing stable offtake for renewables and heat sources for district heating.
On the ground, one of the most visible trends is the rise of microgrids. A Reuters feature describes how microgrid capacity in the US is on track to more than double to about 10 GW by the end of 2025, with data centers emerging as the “customer du jour” for behind-the-meter generation and storage. Companies like PG&E and Vantage Data Centers are building microgrids that combine fuel generators, batteries, and solar to provide reliable power and reduce pressure on congested transmission lines – but often with new gas-fired capacity in the mix.
Some more speculative ideas are floating around too. Wired and MarketWatch have both explored proposals for space-based data centers – essentially putting server farms in orbit to reduce land and water impacts on Earth. At this stage, they’re more thought experiment than investment thesis, but the fact that serious engineers are talking about orbital data centers says something about how extreme the resource squeeze has become.
Meanwhile, environmental and public-health experts warn that these green strategies can’t just be marketing gloss. Harvard Business Review points out that if the extra electricity demand for AI data centers is met with fossil generation, the resulting air pollution could impose large health costs, especially on communities near new gas plants and peaker units.
Advocacy organizations like CIEL warn that pairing AI-driven demand with fossil plants plus carbon capture runs the risk of “greenwashing” fossil expansion – locking in long-lived gas infrastructure under the guise of serving the digital economy.
Research in Nature suggests one path forward: firms that participate in strategic environmental alliances – formal partnerships with civil society groups focused on environmental performance – tend to see better green innovation outcomes over time.
The question is whether data center developers will lean into that kind of partnership model… or keep fighting trench warfare project by project.
If you listen to state economic-development pitches, data centers are sold as engines of growth. And to be fair, they really do bring money:
Construction phases can run into the billions of dollars, supporting thousands of short-term jobs in steel, concrete, electrical work, and heavy equipment.
Local tax bases can get a big boost – from property taxes, utility franchise fees, and occasionally new sales-tax revenue tied to construction.
Meta, for instance, stresses that its US data center investments have supported over 30,000 skilled-trade jobs, brought more than $20 billion in business to subcontractors, and generated thousands of operational roles.
But the long-term employment picture is much murkier.
The House of Commons briefing on UK data centers notes that they are “highly automated” and that even a £10 billion campus in Northumberland is expected to create only about 400 full-time onsite jobs.

A Rest of World investigation into data centers in Chile found something similar:
Government and corporate messaging promised tens of thousands of jobs associated with new Microsoft and Google data centers.
Permitting data for 17 data centers undergoing environmental review showed no more than 1,547 full-time operations jobs across all the projects – an average of about 90 per site. Most of those roles were in security and cleaning rather than high-paid technical positions.
Even as Chile positions itself as a regional data center hub, residents interviewed in working-class neighborhoods reported few direct job postings and modest local economic spillovers, despite billions in projected foreign investment.
In the US, a local analysis commissioned by Janesville, Wisconsin – one of several Midwestern communities now wooing data center projects – frames the trade-off in similar terms: big upfront capital, limited ongoing employment, and heavy pressure on utilities and infrastructure.
Brookings sums up the picture neatly: data centers underpin a huge amount of digital economic activity, but the direct jobs are modest compared to the capital deployed, which makes it crucial to ensure broader community benefits through training pipelines, supplier programs, and infrastructure investments.
On the energy side, the stakes are enormous – and increasingly visible in rate cases.
Dr. Severin Borenstein’s piece for the Energy Institute at Haas asks the question everyone’s thinking: what will data centers do to your electric bill? The answer: it depends on the regulatory plumbing. If data center operators pay for most grid upgrades and new generation, and if they bring in additional tax revenue, they can actually help spread fixed costs and lower rates for other customers. But if utilities socialize the costs of speculative data center loads – building plants for projects that never materialize, or structuring rates so residential customers pay for most of the upgrades – then bills will go up.
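A toy rate-making calculation shows how much the allocation rule matters. Every number here is hypothetical, chosen only to make the mechanics visible:

```python
# Toy sketch of Borenstein's point: the same grid upgrade lands very
# differently on a residential bill depending on the allocation rule.
# All figures are hypothetical.

upgrade_cost = 100_000_000      # $/year revenue requirement for new capacity
residential_energy_share = 0.35 # assumed residential share of utility sales
households = 500_000            # assumed residential customers

# Rule 1 ("cost causer pays"): the data center that drove the upgrade
# funds it directly, so nothing flows to residential bills.
per_household_causer_pays = 0.0

# Rule 2 (socialized): costs are spread across all customers by energy share.
per_household_socialized = upgrade_cost * residential_energy_share / households

print(f"Causer pays: ${per_household_causer_pays:.0f}/household/year")
print(f"Socialized:  ${per_household_socialized:.0f}/household/year")
```

Same steel in the ground, same kilowatt-hours – but one tariff design adds tens of dollars a year to every household bill and the other adds nothing.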
Business Insider’s “AI power bubble” reporting shows how that risk can materialize: developers often file interconnection requests with multiple utilities for the same project, knowing only one site will ultimately be chosen. Each utility, unaware of the others’ plans, may plan new plants or transmission lines based on that load. If only one project actually gets built, the other regions could be left with under-utilized plants whose costs still end up in ratepayers’ bills.
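The double-counting mechanism is simple enough to sketch. The project names and megawatt figures below are hypothetical; the point is that summing interconnection requests without deduplicating by project inflates the forecast:

```python
# Sketch of the "phantom load" problem: one hypothetical project files
# interconnection requests with three utilities, but only one site will
# ever be built. Naively summing requests triple-counts its load.

requests = [
    {"project": "aurora-campus", "utility": "Utility A", "mw": 300},
    {"project": "aurora-campus", "utility": "Utility B", "mw": 300},
    {"project": "aurora-campus", "utility": "Utility C", "mw": 300},
    {"project": "basin-site",    "utility": "Utility B", "mw": 150},
]

# What each utility sees in isolation, added together:
naive_forecast_mw = sum(r["mw"] for r in requests)

# With cross-utility visibility: count each project once, at its largest ask.
per_project: dict[str, int] = {}
for r in requests:
    per_project[r["project"]] = max(per_project.get(r["project"], 0), r["mw"])
deduped_forecast_mw = sum(per_project.values())

print(f"Naive sum of requests: {naive_forecast_mw} MW")    # 1050 MW
print(f"Deduplicated:          {deduped_forecast_mw} MW")  # 450 MW
```

No single utility is being irrational here; the inflation comes from the lack of a shared queue – which is exactly why analysts keep calling for regional coordination on large-load interconnection.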
Columbia University’s Center on Global Energy Policy urges regulators to take a more strategic, data-driven approach – integrating DOE scenarios on data center demand into grid planning and designing tariffs that better match costs to beneficiaries, so that ordinary households don’t end up subsidizing speculative AI loads.
Inside Climate News notes that voters are already blaming data centers for rate hikes in several states, and that regulators in nearly 30 states have opened dockets or legislative processes specifically focused on managing large new loads like data centers.
Water is the other pressure point. Brookings’ deep dive on “AI, data centers, and water” points out that data centers often land in already stressed water systems, especially in the American West and Southwest, adding demand on aging infrastructure that already needs hundreds of billions of dollars in upgrades.
In Northern Virginia – the world’s largest data center cluster – Inside Climate News reports that proposals for massive new campuses with power demands in the hundreds of megawatts have triggered backlash not just over land use but over the knock-on effects on power plant siting and water withdrawals.

Internationally, the pattern repeats:
In Germany, coverage from DW describes local resistance to Google’s data center plans in water-stressed regions.
In South Korea, Chosun reports on a surge in data center construction and associated concerns around grid capacity and urban land prices.
In Southeast Asia, a Fortune piece explores how “small AI” approaches – more modestly sized data centers and constrained model sizes – are emerging partly because of power and water limits.
Across Latin America, The Guardian and Rest of World both show communities in Chile, Uruguay, and Brazil demanding transparency on water and energy use and questioning whether generous tax breaks are worth the trade-offs.
All of this feeds into the broader “AI bubble” debate. NPR, Wired, and others warn that the revenue from AI applications may not keep up with the cost of building out GPUs, networks, and data center shells – a replay of the fiber-optic glut from the dot-com era, but with much bigger energy and climate stakes.
On the other hand, Business Insider’s “this AI bubble argument is wrong” article points out that even if some valuations deflate, the physical infrastructure – GPUs and data centers – can stay productive for years, especially as older hardware shifts to less demanding workloads.
And that’s really the crux: even if the financial hype cools, the concrete, substations, and grid upgrades will still be there. The question is whether they end up being the useful backbone of a more productive economy… or just another round of stranded assets.
So what does the Great Build-Out look like if we zoom out a bit? Four expert perspectives offer a helpful way to think about the next decade.
Prof. Carl Benedikt Frey – Dieter Schwarz Associate Professor of AI & Work at the Oxford Internet Institute, director of the Future of Work programme and Oxford Martin Citi Fellow at the Oxford Martin School, University of Oxford, and author of How Progress Ends: Technology, Innovation, and the Fate of Nations – draws a direct line between today’s AI infrastructure spree and the fiber-optic boom of the late 1990s:
“The rush to build ever more data centers reflects a simple economic reality: today’s AI boom is extraordinarily capital-intensive. Training and running large models requires vast amounts of computing power, so tech companies are racing to secure capacity before their rivals do – much as firms did with fiber-optic cable during the late-1990s internet bubble. The difference is that, so far, the productivity gains from AI look more muted than the hype suggests, while the macroeconomic risks – from financial leverage to energy demand – are more pronounced. We may again be laying down essential infrastructure, but we should not assume the returns will automatically justify the trillions now being spent.”
His warning lines up with what we see in financial coverage: Reuters flags AI data centers as a new “debt hotspot”, as companies and utilities finance multi-billion-dollar projects with long maturities and uncertain cash flows. Bloomberg points to rising strains on grids, water resources, and capital markets that look less like a smooth tech upgrade and more like a stress test of infrastructure and regulation.
Prof. Frey also zeroes in on the regional stakes:
“For states like Wyoming, the key question is not just how many data centers are built, but on what terms. These facilities bring construction jobs, some high-skilled positions, and a larger tax base. Yet they are also voracious consumers of land, water, and electricity, and most of the economic rents may accrue to distant shareholders rather than local communities. The challenge for policymakers is to ensure that generous subsidies and infrastructure upgrades translate into durable local benefits – in the form of resilient grids, workforce development, and broader diversification.”
That point is echoed almost everywhere – from the House of Commons briefing in the UK to Rest of World’s Chile reporting to the Janesville memo in Wisconsin: without carefully crafted deals, most of the upside goes to global firms; most of the risk sits with local grids, landscapes, and residents.
Dr. Michael Mandel, Chief Economist and Vice President at the Progressive Policy Institute, by contrast, leans into the idea that this build-out is more like building railroads than building Pets.com:
“We’ve gone through a long period where “physical” industries such as agriculture, construction, manufacturing, and much of mining have stagnated compared to digital industries. This stagnation in physical industries has especially hurt states such as Wyoming, which has barely grown since 2019.
AI has the potential to transform physical industries, boosting productivity and incomes and opening up new markets. AI will be especially beneficial to states such as Wyoming, which has shown no productivity growth over the past 15 years.
The growth of AI requires investment in large-scale data centers. Data centers are necessary, both to train the underlying models and to power the applications. This investment is no different, conceptually, from laying down rails for trains or drilling for oil. You need to spend on technology to get the benefits of technology, especially when dealing with the complications of the real world.
Indeed, China is pouring hundreds of billions into advanced technology industries, including AI. In this context, the US wave of data center construction and grid modernization looks like a necessity rather than an optional choice.”

Looked at through Dr. Mandel’s lens, the Columbia and ITIF analyses make sense: if you assume that AI really will improve productivity in manufacturing, mining, logistics, and agriculture, then not building data centers is the bigger risk – and the policy challenge becomes how to align grid, water, and workforce investments with that trajectory. Dr. Mandel has also explored the politics of data centers, most recently in “An AI Innovation Toolbox for Governors.”
The danger is that “necessity” easily becomes a blank check. As the Energy Institute blog notes, even if the overall investment is justified, there is still plenty of room to get the cost allocation wrong and stick households with higher bills to subsidize corporate balance sheets.
Daniel Bresette, President of the Environmental and Energy Study Institute (EESI), offers a kind of pragmatic middle ground:
“Some environmental concerns about data centers stem from their energy and water consumption, which can put added strain on local infrastructure. And in the case of energy, demand from data centers that is met with non-renewable sources could lead to more pollution and greenhouse gas emissions. Investments in renewable energy and energy efficiency come to mind as cost-effective options to address these concerns. So does more transparency about the real and potential environmental impacts of data centers to encourage better siting and decision-making. Policymakers at the federal, state, and local levels are undoubtedly trying to learn as much as possible about environmental impacts from data centers. The tech industry moves fast, and nobody setting policy or crafting regulations wants to be in a position of constantly trying to catch up.”

(Rodrigo Arangua / AFP / Getty Images)
His emphasis on transparency and siting matches what we see on the ground:
Brookings and the Wired piece on “where to build data centers to keep emissions down” both stress that location matters: building in cool climates with clean grids and robust water systems can dramatically reduce the environmental footprint per teraflop.
The Guardian’s Latin America reporting and Inside Climate News’ coverage of US opposition show that secrecy and vague promises are exactly what turn communities against projects; detailed, verified data on water, diesel, and grid impacts are the starting point for trust.
Bresette’s call for investments in renewables and efficiency also fits with the microgrid trend and with Meta’s and others’ use of long-term power purchase agreements to add new carbon-free generation. But, as CIEL and HBR warn, those tools have to genuinely displace fossil generation, not just layer nice-sounding offsets on top of a growing fleet of gas plants.
Finally, Dr. Ermengarde Jabir, a Senior Economist at Moody’s Analytics, brings the commercial-real-estate view – basically, where do these things go, and what do they do to local economies?
“Building in urban cores, particularly in central business districts near offices, is no longer absolutely essential for most data center operations given that latency issues are not as prevalent as they once were given rapidly accelerating technological developments. Thus, there is more opportunity to build on less expensive land outside of cities. However, there is of course the need to consider existing infrastructure (access to power and cooling sources), which doesn’t always make building in remote areas viable. The tradeoff when building in cities then becomes competition for land between more essential real estate uses, such as housing, which is facing tremendous affordability pressures, and more profitable endeavors in the immediate term, like data centers.
While there is a lot of NIMBYism around data center development, municipalities that have had their tax bases adversely affected by obsolescence in office and retail are actively courting data center development via incentives that include expedited permitting. There is also the topic of geography, where if one municipality restricts additional data center construction due to public concern, neighboring municipalities will usually make themselves amenable to development. This is the case at the moment in Northern Virginia, where development is now moving further south and west in the state as well as across the Potomac to Southern Maryland.
Generally speaking, data centers are not job creators, at least not in the long term. There are the construction jobs created in the immediate term to develop the data center. Once the data center is operational, the few jobs that are onsite are typically for security, as maintenance of data centers is usually performed by specialized teams that travel to the data center.”
Her description matches the data from the UK briefing, Rest of World’s Chile case study, and local reporting in Virginia: data centers chase cheap land and available power, then leapfrog to neighboring jurisdictions when one area pushes back.
It also lines up with the lived experience in Northern Virginia, where Inside Climate News traces the evolution from a handful of facilities in the 1990s to today’s sprawling “data center capital of the world,” and where residents now describe feeling boxed in by high-voltage lines, substations, and low-slung concrete boxes.

Jabir’s bottom line – that these are not long-term job engines – is also backed by the UK government’s GVA and job estimates and the Chilean permit filings.
Put all of this together, and the Great Build-Out looks less like a simple “good or bad” story and more like a design problem.
The infrastructure is probably coming either way. AI workloads are growing; China and other rivals are racing to build their own data center backbones; Big Tech has already committed to hundreds of billions in capex.
The real questions are on what terms and with which safeguards:
Who pays for the grid? Stronger rules like Michigan’s – requiring developers to shoulder more upfront costs – can reduce the risk that ordinary ratepayers subsidize expensive new plants that mainly serve a few hyperscale tenants.
Where do we put them? Following the guidance from Brookings and Wired – prioritizing cleaner grids, cooler climates, and water-secure regions – can slash environmental impacts per unit of compute.
What do communities get back? Community benefits agreements, transparency on water and power use, and long-term commitments to local infrastructure and workforce development are key if towns aren’t going to feel like sacrifice zones. Examples from Chile, Latin America, and Northern Virginia show what happens when that social contract breaks.
How do we manage risk? Regulators and investors need to treat data centers like the heavy industry they are – with stress tests for debt, realistic demand scenarios, and clear plans for decommissioning or repurposing facilities if AI demand doesn’t live up to the hype.
For a state like Wyoming, that might mean something like this:
Welcome data centers – but tie tax incentives to verifiable local benefits, like grid upgrades that improve reliability for everyone, not just the campus; training programs for local workers; and genuine diversification into AI-enabled services and manufacturing.

Insist on full transparency about energy, water, and emissions, including who pays for which upgrades and which emissions are being offset versus actually reduced.
Coordinate with neighboring states and utilities to avoid the “race to the bottom” where companies play jurisdictions off one another for the cheapest energy and weakest rules, leaving communities stuck with stranded plants and higher bills.
The Great Build-Out isn’t just about servers and GPUs. It’s the next big test of whether we can build critical, carbon-intensive infrastructure in a way that actually shares the gains, limits the harms, and prepares economies like Wyoming’s for a more automated, electrified future.
If we get it right, today’s noisy construction sites could look, in hindsight, like the early days of the interstate highway system or the rural electrification push – messy, expensive, but ultimately transformative. If we get it wrong, they’ll look more like the overbuilt fiber of the dot-com bubble: useful, sure, but bought at a price that many communities will still be paying for decades.









