The Clean Option Nobody Mentions
How fast-tracked AI data center approvals quietly shift environmental and energy costs onto communities

When Elon Musk recently announced the megamerger of his SpaceX and xAI entities to build data centers in space, experts weighed in immediately. The idea of launching solar-powered AI data centers into orbit sounds futuristic, but it quietly exposes a more uncomfortable truth closer to home. If we already know how difficult, expensive, and risky it would be to run large-scale computing infrastructure in space, the more important question is why we are failing to build it responsibly on Earth, where the physics are easier and the tools already exist.
Across the United States, data centers are being approved at a pace that local governments have rarely seen, often with limited scrutiny and sweeping promises about jobs, innovation, and economic growth. What rarely appears in those proposals is a clear explanation of how these facilities will avoid straining power grids, draining water supplies, locking in fossil fuel generation, or shifting long-term costs onto surrounding communities.
A harder question than orbital data centers
This absence is not due to ignorance or a lack of technical solutions. Engineers, grid planners, and environmental analysts already know how to power high-demand computing with cleaner energy, reduce water consumption through advanced cooling, and design facilities that work with the grid rather than overwhelm it. Those options are well documented, technically feasible, and increasingly common in smaller or more tightly regulated projects.
The real story is why these safeguards so often disappear before proposals ever reach the public. To understand that, you have to look not at the technology itself, but at the approval system that governs it, and at how that system quietly rewards speed, scale, and silence over long-term accountability.
What Responsible AI Infrastructure Actually Looks Like
Responsible AI infrastructure is not hypothetical, experimental, or unknowable. The technical components required to reduce environmental harm from large-scale computing are already understood and, in many cases, already deployed in limited or well-regulated settings.
The problem is not a lack of technical solutions
At its core, responsible design begins with location. Data centers that minimize environmental impact are sited where the electrical grid is already low-carbon, where water resources are stable, and where ambient temperatures reduce cooling energy requirements. This alone can dramatically lower emissions and water use before a single server is installed.
Power procurement is the next decisive factor. A genuinely responsible facility does not rely on annual renewable accounting that allows fossil fuel use during peak hours while claiming green credentials on paper. Instead, it pairs long-term clean energy contracts with firm generation or storage and schedules flexible computing workloads to run when the grid is cleanest, reducing stress rather than adding to it.
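To make that scheduling point concrete, here is a minimal sketch. The hourly carbon-intensity figures are invented for illustration rather than taken from any real grid, and the function simply ranks the hours of the day and places deferrable batch work, such as overnight training runs or re-indexing jobs, into the cleanest ones.

```python
# Minimal sketch of carbon-aware workload scheduling.
# The forecast values below are illustrative, not real grid data; in practice
# they would come from the local grid operator or a forecasting service.

from typing import List

def schedule_flexible_jobs(forecast_gco2_per_kwh: List[float],
                           flexible_job_hours: int) -> List[int]:
    """Return the hours of the day with the lowest forecast carbon intensity."""
    ranked = sorted(range(len(forecast_gco2_per_kwh)),
                    key=lambda h: forecast_gco2_per_kwh[h])
    return sorted(ranked[:flexible_job_hours])

# Example: a day where midday solar makes hours 10-15 the cleanest.
forecast = [520, 510, 500, 490, 480, 470, 430, 380,
            300, 240, 200, 180, 170, 175, 190, 230,
            300, 380, 450, 500, 520, 530, 530, 525]

print(schedule_flexible_jobs(forecast, flexible_job_hours=6))
# -> [10, 11, 12, 13, 14, 15]: run deferrable work in these hours, hold it elsewhere
```

Nothing about this is exotic. The hard part is contractual rather than computational: committing to actually defer load when the grid is dirty, instead of treating flexibility as a talking point.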
Cleaner power, smarter cooling, and fewer external costs
Cooling design is equally central. Modern direct-to-chip liquid cooling and closed-loop systems can remove heat efficiently without consuming large volumes of water or discharging thermal pollution into nearby ecosystems. These approaches are widely documented, commercially available, and already in use in environments where water constraints are taken seriously.
Perhaps most overlooked is heat itself. Large data centers generate enormous amounts of waste heat, which can be captured and reused for district heating, industrial processes, or nearby buildings when facilities are designed with that purpose from the start. Treating heat as a resource rather than a nuisance can turn a data center from a net burden into a contributor to local energy resilience.
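The scale of that resource is easy to underestimate. As a rough, back-of-the-envelope illustration, using assumed numbers rather than figures from any specific facility, nearly all of the electricity delivered to servers leaves the building as low-grade heat:

```python
# Back-of-the-envelope sketch of recoverable waste heat, using assumed numbers.
# Nearly all electricity delivered to IT equipment ends up as low-grade heat.

it_load_mw = 100          # assumed average IT load for a large campus
hours_per_year = 8760
recovery_fraction = 0.7   # assumed share capturable with liquid cooling and heat exchangers

recoverable_mwh = it_load_mw * hours_per_year * recovery_fraction
print(f"{recoverable_mwh:,.0f} MWh of heat per year")   # 613,200 MWh
# On the order of the annual space-heating demand of tens of thousands of homes,
# if a district heating network exists nearby to absorb it.
```

The point is not the precise number, which depends on load, cooling design, and local heating demand, but that the heat is large enough to matter when a facility is designed to capture it from the start.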
Designing data centers to work with the grid
Finally, responsible facilities are designed as grid partners rather than grid predators. They include demand-response capabilities, on-site storage, and clear curtailment commitments that enable them to reduce load during emergencies rather than forcing utilities to compensate with new fossil fuel capacity.
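What a curtailment commitment means in practice can also be sketched simply. The figures and decision logic below are illustrative assumptions, not any utility's actual demand-response protocol: during a declared grid emergency, the facility caps its grid draw, discharges on-site storage, and defers whatever flexible load remains.

```python
# Minimal sketch of a curtailment decision during a declared grid emergency.
# The numbers and the decision logic are illustrative assumptions, not a real
# utility's demand-response protocol.

def grid_draw_during_event(it_load_mw: float,
                           battery_available_mw: float,
                           committed_cap_mw: float) -> dict:
    """Split load between the grid, on-site storage, and deferred work."""
    from_grid = min(it_load_mw, committed_cap_mw)
    remaining = it_load_mw - from_grid
    from_battery = min(remaining, battery_available_mw)
    deferred = remaining - from_battery    # flexible workloads paused or rescheduled
    return {"from_grid_mw": from_grid,
            "from_battery_mw": from_battery,
            "deferred_mw": deferred}

# Example: a 90 MW facility with a 40 MW emergency cap and 30 MW of storage.
print(grid_draw_during_event(it_load_mw=90, battery_available_mw=30, committed_cap_mw=40))
# -> {'from_grid_mw': 40, 'from_battery_mw': 30, 'deferred_mw': 20}
```

The design choice that matters is the binding cap: without it, the same hardware exists, but the utility has no enforceable way to count on it during an emergency.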
None of these practices requires breakthroughs in physics or speculative technology. They require something more mundane and more difficult: binding commitments, upfront transparency, and a regulatory process that demands a full view of the system before granting approval.
What Shows Up in Real Proposals
When data center proposals reach local planning boards and zoning commissions, they rarely resemble the responsible model described above. Instead, they are framed narrowly around land use compliance, projected tax revenue, and generalized claims about efficiency, while the most consequential environmental details remain vague or deferred.
Vague language where specifics matter most
Energy use is typically described in aggregate terms that obscure timing and source. Developers often state that facilities will be “powered by the grid” or “supported by renewable energy,” without specifying whether that power will be available during peak demand, whether it requires new fossil generation, or whether clean energy contracts correspond to actual operating hours. These omissions matter because grids do not experience stress annually or abstractly; they experience it hour by hour, during heat waves, cold snaps, and equipment failures.
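The gap between annual and hourly accounting is easy to show with a toy example. The two 24-hour profiles below are made up for illustration, not measured data, but they capture the pattern: a contract large enough to look like 100 percent renewable power on an annual basis can still leave most operating hours running on whatever the grid happens to be burning.

```python
# Minimal sketch of why annual renewable accounting can hide peak-hour fossil use.
# Both 24-hour profiles are invented for illustration, not measured data.

demand_mwh = [100] * 24                    # flat data center load, MWh per hour
solar_mwh  = [0] * 7 + [150, 300, 400, 450, 450, 450, 400, 300, 150] + [0] * 8

annual_claim = sum(solar_mwh) >= sum(demand_mwh)                  # "100% renewable" on paper
hourly_clean = sum(min(d, s) for d, s in zip(demand_mwh, solar_mwh))
hourly_share = hourly_clean / sum(demand_mwh)

print(annual_claim)            # True  -> passes the annual accounting test
print(round(hourly_share, 3))  # 0.375 -> but only ~38% of consumption is matched
                               #          in the hours it actually occurs
```

Proposals that speak only in annual terms never have to confront that second number, which is the one the grid, and the neighbors, actually experience.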
Water, cooling, and the fine print
Water use is frequently minimized through careful wording rather than clear accounting. Proposals may cite low average consumption without explaining how cooling systems behave during extreme temperatures, drought conditions, or maintenance cycles. In many cases, the most water-intensive scenarios are classified as contingencies and excluded from headline figures presented to the public.
Cooling technology itself is often described at a high level, with references to “industry standard” systems that offer little insight into water dependency, thermal discharge, or adaptability under climate stress. Advanced liquid cooling or closed-loop systems may be mentioned as possibilities, but they are rarely presented as binding design requirements subject to enforcement.
Backup power is another area where detail quietly disappears. Diesel generators are typically treated as emergency infrastructure with minimal discussion of runtime frequency, emissions, or cumulative impact, even in regions where grid instability is already a concern. The assumption embedded in these proposals is that backup systems will remain invisible, despite evidence from existing facilities that they are used more often than advertised.
What never makes it into the application
Perhaps most telling is what is not proposed at all. Commitments to heat reuse, demand-response participation, or enforceable water caps are largely absent, even though these measures are technically feasible and increasingly common in other industrial contexts. Their absence is not explained or justified; it is simply normalized.
What communities are asked to approve, then, is not a comprehensive infrastructure system, but a partial description that treats the most significant impacts as externalities. The proposal satisfies the minimum requirements of the process, while the broader consequences are left for utilities, regulators, and residents to absorb later.
How the Approval System Makes These Omissions Normal
The recurring gaps in data center proposals are not accidental, and they are not the result of a few careless developers. They are produced by an approval system that fragments responsibility across multiple agencies, each empowered to review only a narrow slice of the project.
No single regulator sees the whole system
Local zoning boards are typically tasked with land use, building height, setbacks, and traffic, but not with grid capacity or long-term energy sourcing. Utility commissions focus on reliability and rate impacts, but they are often barred from questioning individual customer demand once a connection request is deemed lawful. Environmental agencies review permits in isolation, assessing emissions, water withdrawals, or stormwater impacts separately rather than as part of an integrated system.
Because no single body is responsible for evaluating the full lifecycle impact of a large data center, developers are rarely required to present it. Each omission is defensible within the scope of a single review, even if the combined effect creates significant environmental and economic consequences.
Why silence moves faster than safeguards
This fragmentation also shapes incentives. A proposal that includes firm clean power commitments, water caps, or heat reuse plans triggers additional scrutiny, coordination, and delay. A proposal that avoids those commitments moves faster, encounters fewer objections, and preserves flexibility for future expansion. Over time, the system teaches developers which details to include and which to leave out.
Public participation does little to correct this imbalance. Hearings are often scheduled late in the process, technical documents are released with limited time for review, and critical questions are deferred to later permitting phases, after zoning or tax approvals are already secured. By the time impacts become concrete, the political leverage of local communities has largely evaporated.
Speed as a political asset
Economic development agencies further reinforce this dynamic by emphasizing job creation and tax base growth while treating environmental safeguards as negotiable add-ons. When projects are framed as competitive wins that might go elsewhere, regulators are encouraged to prioritize speed over scrutiny, even when long-term costs are likely to remain local.
The result is a process that does not require deception to fail the public. It only requires silence. Developers are rewarded for proposing the least constrained version of a project that still satisfies formal requirements, while regulators are shielded from accountability by the limits of their jurisdiction.
In this system, the absence of responsible design is not flagged as a problem. It is treated as the default.
The Kitchen-Table Costs That Never Appear Together
For residents living near proposed data centers, the consequences of these omissions do not arrive as abstract environmental metrics. They show up gradually, through higher utility bills, tighter water restrictions, and public infrastructure costs that were never mentioned during approval hearings.
When grid strain becomes a household expense
Electricity is often the first pressure point. Large data centers draw power continuously, and when that demand coincides with peak usage during heat waves or cold snaps, utilities are forced to respond. In many regions, the fastest response is new natural gas generation or extended operation of existing fossil plants, costs that are spread across all ratepayers rather than assigned to the facility driving the demand.
Water systems absorb what proposals omit
Water impacts are similarly diffuse. Even when a data center’s average water use appears modest on paper, peak cooling demand during extreme heat can strain local systems. Municipalities respond by raising rates, restricting residential use, or investing in expanded treatment and delivery infrastructure, expenses that residents absorb long after the project has been approved.
Noise, air quality, and reliability concerns often follow. Backup generators that were described as rarely used begin operating during grid instability or testing cycles, bringing diesel exhaust and constant mechanical noise into nearby neighborhoods. These effects are technically permitted and individually compliant, even as their cumulative impact becomes impossible to ignore.
Costs that are defensible one at a time
The economic tradeoffs promised at the outset rarely balance these costs. Data centers employ relatively few permanent workers once construction is complete, and many of those positions require specialized skills that are not sourced locally. Meanwhile, tax abatements negotiated to attract the project can limit the public revenue available to address the very infrastructure strain the facility creates.
What makes these impacts difficult to contest is not their severity, but their fragmentation. No single bill, permit, or hearing captures the full picture. Each cost is small enough to defend in isolation, even as the combined burden reshapes local budgets and household finances.
By the time residents recognize that their community is subsidizing someone else’s computing scale, the approvals are final, the infrastructure is sunk, and the leverage is gone. The costs are real, ongoing, and largely invisible at the moment when meaningful choices could have been made.
Why Speed and Scale Are Treated as Virtues
The persistence of this approval model is rooted in how economic success is defined and rewarded. Data centers are seen as symbols of technological leadership and future-facing growth, making their rapid approval a political asset rather than a regulatory concern.
Competition between jurisdictions, not caution
State and local officials often compete with neighboring jurisdictions for investment. In that environment, speed becomes a selling point, and restraint is treated as a liability. Developers understand this dynamic and structure proposals to minimize friction, knowing that any condition perceived as slowing deployment can be portrayed as a reason a project might relocate.
Urgency as a justification for skipping questions
The broader AI arms race intensifies this pressure. As governments and corporations emphasize national competitiveness and innovation leadership, infrastructure that supports large-scale computing is granted an implicit urgency. That urgency discourages questions about whether growth is being pursued in the least harmful way, because delay itself is framed as a strategic failure.
Safeguards reframed as optional
Regulatory agencies are not insulated from this narrative. Staff are often under-resourced, timelines are compressed, and review standards are shaped by past practice rather than emerging scale. When a project resembles what has already been approved elsewhere, it is treated as routine, even if its cumulative impact is unprecedented.
This framing also allows environmental and community safeguards to be recast as optional enhancements rather than baseline requirements. Measures like firm clean power commitments, heat reuse, or enforceable water caps are described as aspirational goals to be pursued later, instead of conditions for approval now. The burden of proof is reversed, placing the responsibility on the public to demonstrate harm rather than on developers to demonstrate restraint.
In this context, silence is not a failure of communication. It is a rational response to a system that rewards velocity over completeness and scale over sustainability. Projects that move fastest are celebrated, while projects that pause to integrate long-term protections are quietly disadvantaged.
As long as speed remains the primary metric of success, responsible design will continue to be treated as a delay rather than a necessity.
The Question That Should Come Before Approval
The debate over AI infrastructure often centers on whether the technology itself is inevitable. That framing misses the more important point. The environmental and economic harms associated with large data centers are not inevitable outcomes of computing demand; they result from choices made or avoided during the approval process.
The costs are not inevitable
We already know how to build high-demand computing infrastructure that reduces grid stress, limits water use, and prevents long-term cost shifting onto surrounding communities. Those designs are not speculative or radical. They are simply more complete than what the current approval system requires.
The reason they rarely appear in proposals is not technical uncertainty, but procedural design. Developers are not asked to prove that their project represents the least harmful option available. Regulators are not empowered to demand a full-system accounting. Communities are not given the information they would need to weigh short-term benefits against long-term obligations.
As a result, the most important decisions are made by omission. What is not proposed becomes what is not debated, and what is not debated becomes what is absorbed later through higher bills, environmental strain, and public infrastructure costs that never appeared on the agenda.
The question that changes the outcome
If there is a single question that communities should insist on before granting approval, it is not whether a data center is innovative or competitive. It is whether the developer can demonstrate, in binding terms, that the project is designed to meet today’s computing needs without quietly transferring tomorrow’s costs to everyone else.
Until that question is required, the cleanest, most responsible options will remain absent from the room, and the consequences will continue to arrive long after the permits are signed.
Support Independent Media
This story exists because someone took the time to follow what gets left out of public approvals. Most coverage never reaches that layer, even though those omissions shape household costs and long-term environmental damage.
The Coffman Chronicle is reader-supported so it can keep doing this work. Paid subscribers make it possible to ask the questions developers are not required to answer.
If this piece helped you understand how decisions are made by omission, consider becoming a paid subscriber. Your support keeps this reporting independent and accountable.
Support independent media. Become a paid subscriber.
Sources:
“AI Data Center.” Wikipedia.
“U.S. Data Centers Could Consume as Much Water as 10 Million Americans by Decade’s End.” Yale Environment 360, November 17, 2025.
“The Cloud Next Door: Investigating the Environmental and Socioeconomic Strain of Datacenters on Local Communities.” arXiv, June 3, 2025.
“Data Centers as a Driving Force for the Renewable Energy Transition.” Energies 19, no. 1 (2025).
“Drained by Data: The Cumulative Impact of Data Centers on Regional Water Stress.” Ceres, September 23, 2025.
“The Environmental Impact of Chicago Data Centers and Their Sustainability Practices.” DataBank, updated May 6, 2025.


