Data centers currently use about 1.5% of global electricity. By 2030, that number could hit 3%. Maybe higher.
That might not sound like much until you realize we're talking about the electricity consumption of entire countries—and it's all happening while the world is supposedly trying to reduce emissions.
I've been digging into the numbers on AI's energy footprint, and the more I learn, the more concerned I get. Not in a "robots are taking over" way, but in a very practical "we might not have the infrastructure to support this" way.
The Scale Is Hard to Grasp
Let me put some numbers in context:
Training GPT-4 is estimated to have drawn roughly 30 megawatts of power, sustained for the length of the run. That's enough to power about 20,000 homes. For one training run.
OpenAI's planned Stargate infrastructure would draw 10 gigawatts—as much power as New York City demands on a hot summer day.
And training is just the beginning. Every ChatGPT query, every Midjourney image, every AI video generation—all of that requires compute. All of that requires electricity.
The International Energy Agency projects that data center electricity consumption will more than double by 2030, reaching around 945 terawatt-hours. For reference, that's slightly more than Japan's total electricity consumption.
Let me say that again: within five years, data centers, driven largely by AI, are projected to use as much electricity as an entire industrialized nation.
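If you want to sanity-check those comparisons, the back-of-envelope math is simple. The sketch below uses my own rough assumptions for average household power draw (about 1.2 kW) and Japan's annual electricity consumption (around 920 TWh); the 30 MW and 945 TWh figures are the ones quoted above.

```python
# Rough sanity check on the scale comparisons above.
# Assumptions (mine, not official statistics): an average US home draws
# about 1.2 kW on average (~10,500 kWh/year), and Japan uses roughly
# 920 TWh of electricity per year.

GPT4_TRAINING_DRAW_MW = 30        # sustained draw of the training cluster (quoted above)
AVG_HOME_DRAW_KW = 1.2            # assumed average household draw

homes = (GPT4_TRAINING_DRAW_MW * 1000) / AVG_HOME_DRAW_KW
print(f"GPT-4 training cluster ~ {homes:,.0f} average homes")   # roughly 25,000

DATA_CENTERS_2030_TWH = 945       # IEA projection quoted above
JAPAN_ANNUAL_TWH = 920            # assumed ballpark

print(f"Projected data center demand ~ {DATA_CENTERS_2030_TWH / JAPAN_ANNUAL_TWH:.2f}x Japan")
```

The exact household figure doesn't matter much; whatever reasonable number you pick, a single training cluster lands in small-city territory.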
Why AI Is Different
People keep comparing this to previous tech booms, but AI is fundamentally different from cloud computing or streaming.
A data center running traditional workloads might use 10-20 kilowatts per rack. An AI data center can use 100+ kilowatts per rack. That's 5-10x more power density in the same physical space.
And unlike your laptop that you close at night, AI data centers run 24/7/365. They can't rely on intermittent renewable energy. Solar and wind are great, but they don't work when it's dark and calm. AI needs constant, reliable power.
This is why we're seeing AI companies striking deals with nuclear power plants. Constellation Energy just signed a 20-year power purchase agreement with Meta. Nuclear is one of the few carbon-free sources that can supply baseload power around the clock.
But here's the problem: we're not building nuclear plants fast enough to keep up with demand. Not even close.
The Fossil Fuel Problem
A Goldman Sachs analysis estimates that about 60% of the increasing electricity demand from data centers will be met by burning fossil fuels.
Let me be clear about what that means: AI is going to increase global carbon emissions by roughly 220 million tons. For context, driving a gas-powered car for 5,000 miles produces about 1 ton of CO2.
So we're talking about the equivalent of 220 million cars driving 5,000 miles. From AI. Per year.
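For anyone who wants the arithmetic spelled out, it's a single division, but it's worth seeing how sensitive it is to the per-car assumption. The 1-ton-per-5,000-miles figure implies a fairly efficient car (about 200 g of CO2 per mile); a typical US vehicle is closer to 400 g per mile, which would halve the car count without changing the order of magnitude.

```python
# Car-equivalence arithmetic for the emissions figure above.
# Assumption (mine): 1 ton of CO2 per 5,000 miles corresponds to roughly
# 200 g CO2/mile, i.e. a fairly efficient gasoline car.

ADDED_EMISSIONS_TONS = 220_000_000   # estimated added CO2 from data centers, per year
TONS_PER_CAR = 1.0                   # CO2 per car per 5,000 miles, as used in this post

car_equivalents = ADDED_EMISSIONS_TONS / TONS_PER_CAR
print(f"Equivalent to {car_equivalents:,.0f} cars each driving 5,000 miles a year")
```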
This directly contradicts the climate pledges that tech companies made just a few years ago. Google committed to carbon-free energy 24/7 by 2030. Microsoft pledged to be carbon negative by 2030.
Neither of those targets looks remotely achievable now. Microsoft's greenhouse gas emissions are up roughly 30% from its 2020 baseline, largely because of AI-driven data center buildout. Google hasn't fared much better.
The companies building AI are the same ones who marketed themselves as climate leaders. That tension is getting harder to ignore.
The Water Problem Everyone's Ignoring
It's not just electricity. Data centers use enormous amounts of water for cooling.
The Great Lakes region is seeing a flurry of data center development because of available water. But the Great Lakes are a shared resource—eight U.S. states and two Canadian provinces all draw from them.
How much more water can they provide for data centers and the power plants needed to run them?
In Georgia, residents reported well water problems after a nearby data center was built. That's the kind of local impact that doesn't make headlines but affects real people.
And nobody's being transparent about the scale. Tech companies worry that sharing water usage data gives rivals a competitive edge, so they don't disclose it. Which means communities can't make informed decisions about data center development in their area.
I don't know, maybe I'm old-fashioned, but I think if you're going to use a community's resources, people should at least know how much you're taking.
Where This Gets Political
Here's where things get messy: AI energy demand is colliding with energy policy in ways that make everyone uncomfortable.
Utilities are facing unprecedented demand. Some regions are projecting huge increases that would require massive infrastructure build-outs. But utilities are also being shopped around: AI companies are taking the same big projects to multiple providers to see who can deliver power fastest.
This creates a weird dynamic where utilities aren't sure if the demand is real or if they're being used as leverage. GridUnity CEO Brian Fitzsimons noted they're "starting to see similar projects that look exactly to have the same footprint being requested in different regions across the country."
Are we building infrastructure for projects that won't happen? Are we not building infrastructure for projects that will?
And then there's the political dimension. More than 90% of projects waiting for grid connection are solar, wind, or battery storage—the fastest to deploy. But the current political environment in the U.S. is hostile to renewables. So what gets built?
If we can't build enough renewable energy, and we can't build nuclear fast enough, then what? More natural gas plants? Coal?
The Community Opposition
Google recently canceled a data center project after community opposition. When the company confirmed it was pulling the proposal, residents literally cheered.
Similar fights are happening across the U.S. Communities are pushing back on data centers because:
- They consume massive amounts of water and electricity
- They deliver few local jobs (they're highly automated)
- They can drive up electricity prices for residential users
- They increase strain on local infrastructure
From a local government perspective, data centers are a weird deal. They pay property taxes (good), but they don't employ many people, they stress infrastructure, and residents hate them (not good).
The NIMBY response to data centers could genuinely slow AI development. If communities keep rejecting projects, where do they get built?
The Efficiency Argument Falls Apart
AI optimists keep pointing to efficiency improvements. Better chips, better algorithms, better cooling systems—all of which will reduce energy consumption per operation.
And they're right! Nvidia claims its newest GPUs are up to 25x more energy efficient than the previous generation for the same workload. That's genuinely impressive.
But here's the thing about efficiency gains: they get eaten by increased usage. This is called Jevons Paradox, after a 19th-century economist who noticed that as steam engines became more efficient, Britain's coal consumption actually increased.
More efficient AI doesn't mean less energy usage. It means more AI.
When GPT-4 got cheaper to run, people didn't use it the same amount—they used it way more. The efficiency gains got completely overwhelmed by increased usage.
So yeah, we're getting better at energy-efficient AI. But we're also deploying AI at exponentially increasing rates. The math doesn't work out in our favor.
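To make the Jevons dynamic concrete, here's a toy model with entirely made-up numbers: per-query energy halves every year while query volume triples. Over four years, efficiency improves 16x and total consumption still rises roughly 5x.

```python
# Toy illustration of Jevons Paradox. All numbers are invented for
# illustration; none are real measurements of any AI system.

energy_per_query_wh = 3.0        # assumed starting energy per query (Wh)
queries_per_day = 100_000_000    # assumed starting daily query volume

EFFICIENCY_FACTOR = 0.5          # energy per query halves each year
USAGE_FACTOR = 3.0               # query volume triples each year

for year in range(5):
    total_gwh = energy_per_query_wh * queries_per_day / 1e9
    print(f"year {year}: {total_gwh:.2f} GWh/day")
    energy_per_query_wh *= EFFICIENCY_FACTOR
    queries_per_day *= USAGE_FACTOR
```

Swap in whatever growth and efficiency numbers you like; as long as usage grows faster than efficiency improves, total energy keeps climbing.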
What Actually Needs to Happen
The solutions exist, but they require coordination and political will:
Geographic optimization: Build data centers where renewable energy is abundant. Iceland, Pacific Northwest, etc.
Workload shifting: Cloud providers moving AI tasks to wherever renewable energy is available at any given time.
Nuclear revival: Seriously investing in next-generation nuclear as baseload power for AI.
Demand management: Actually asking whether every use case needs cutting-edge AI or if older, less energy-intensive models would work.
Transparency requirements: Mandating that companies disclose energy and water consumption so communities can make informed decisions.
The problem is these solutions require things like "long-term planning" and "prioritizing sustainability over speed" and "accepting that maybe some AI development should slow down," which are not exactly Silicon Valley's strong suits.
My Extremely Unpopular Take
I think we need to have a serious conversation about whether all AI development should happen at the pace it's currently happening.
Not because AI isn't useful—it clearly is. But because the infrastructure required to support it is massive, the environmental impact is real, and we're making trillion-dollar bets on infrastructure before we've figured out if the demand is actually there.
Some AI workloads are genuinely transformative. Medical research, climate modeling, materials science—these could justify significant energy usage because the benefits are enormous.
But does every startup need to train its own foundation model? Does every company need AI in every product? Does every query need the most powerful model, or would a smaller, more efficient one work fine?
Right now we're in this weird phase where "AI" is a magic word that justifies any resource usage. Data center needs 10 gigawatts? It's for AI, so that's fine. Need to build a new natural gas plant? AI demands it.
At some point we might want to, I don't know, prioritize? Maybe have some framework for deciding which AI applications are worth the environmental and infrastructure costs?
The Timeline Is the Problem
Here's what keeps me up at night: the timeline mismatch.
AI is advancing on a 6-12 month cycle. New models, new capabilities, new use cases.
Energy infrastructure operates on a 5-10 year cycle. Planning, approvals, construction, deployment.
Climate change operates on a "we needed to act 20 years ago" cycle. Every ton of CO2 matters, and we're adding hundreds of millions from AI alone.
These timelines don't line up. We're making irreversible infrastructure decisions based on projected AI demand that may or may not materialize, while simultaneously blowing past climate targets we set when we thought we could decarbonize.
And the kicker is that if we build all this fossil fuel infrastructure to support AI, we're locked in for decades. You can't just shut down a natural gas plant you built in 2026 when better alternatives become available in 2030.
The decisions we make in the next year or two will shape energy systems for the next 20-30 years. That's terrifying, because I'm not convinced we're making these decisions with full understanding of the implications.
We're sprinting toward an AI future without knowing if we can actually power it sustainably. And if we can't, we're going to have to make some very uncomfortable choices about what gets prioritized.
I hope someone smarter than me is figuring this out. Because right now it feels like we're all just hoping it works out.