Global Environmental Impact of AI
AI is reshaping global energy demand. Data centers consumed an estimated 415 TWh in 2024, and consumption is projected to reach 945 TWh by 2030, roughly equivalent to Japan's entire annual electricity consumption.
How AI Compares
AI's carbon footprint is growing faster than any other tech sector. Here's how it stacks up against familiar benchmarks.
vs Global Aviation
AI emissions are ~13% of global aviation (2024 est.), but growing much faster
vs Bitcoin Mining
AI estimated to emit ~1.7x more CO₂ than Bitcoin mining (both figures carry uncertainty)
vs Video Streaming
AI inference estimated at ~2x more than all global video streaming
AI Emissions Over Time
AI workloads emitted an estimated 120 Mt CO₂ in 2024 — more than Belgium's entire annual emissions. By 2030, that could reach 300 Mt on current trajectory.
Chart series: AI Emissions (Mt); All Data Centers (Mt).
Source: IEA, Goldman Sachs, Ember Climate. Dashed line marks the present — everything after is projected.
Projected Energy Demand
Data center electricity is projected to reach 945 TWh by 2030, more than doubling from 2024's 415 TWh in just six years.
Chart series: AI Share (TWh); Total Data Center (TWh).
Source: IEA, Goldman Sachs projections
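The growth implied by these projections can be sanity-checked with a quick compound-rate calculation. A minimal sketch, using the 2024 and 2030 figures cited above (the six-year window is taken from the chart's endpoints):

```python
# Implied compound annual growth rate (CAGR) for data-center
# electricity demand, from the 2024 and 2030 figures above.
BASE_TWH = 415       # estimated 2024 consumption
PROJECTED_TWH = 945  # projected 2030 consumption
YEARS = 6

cagr = (PROJECTED_TWH / BASE_TWH) ** (1 / YEARS) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 15% per year
```

A sustained ~15% annual growth rate is what turns 415 TWh into 945 TWh by 2030; any slowdown in AI buildout would land well below the projection.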
Where the Energy Goes
Inference — answering queries in real-time — dominates AI's energy footprint.
Training gets the headlines, but inference is the iceberg beneath. Every question you ask an AI contributes to the 85% slice.
Country Carbon Intensity
The same AI query can produce up to 35x more CO₂ depending on data center location. Sorted cleanest to dirtiest.
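Because per-query CO₂ scales linearly with the local grid's carbon intensity, the 35x spread falls directly out of the arithmetic. A sketch with illustrative placeholder values (the grid intensities and the 0.3 Wh per-query energy are assumptions for illustration, not the page's dataset):

```python
# Per-query CO2 = energy per query x grid carbon intensity.
# All numbers below are illustrative assumptions.
QUERY_KWH = 0.0003  # assumed energy per AI query (0.3 Wh)

GRID_G_CO2_PER_KWH = {  # illustrative grid intensities (g CO2/kWh)
    "hydro-heavy grid": 20,
    "nuclear-heavy grid": 55,
    "gas-heavy grid": 450,
    "coal-heavy grid": 700,
}

# Sorted cleanest to dirtiest, matching the chart's ordering.
for grid, intensity in sorted(GRID_G_CO2_PER_KWH.items(), key=lambda kv: kv[1]):
    mg = QUERY_KWH * intensity * 1000  # grams -> milligrams
    print(f"{grid:20s} {mg:6.1f} mg CO2 per query")

spread = max(GRID_G_CO2_PER_KWH.values()) / min(GRID_G_CO2_PER_KWH.values())
print(f"cleanest-to-dirtiest spread: {spread:.0f}x")
```

With these placeholder intensities the identical query emits 35x more CO₂ on the coal-heavy grid than on the hydro-heavy one, which is why data center siting matters as much as model efficiency.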
Future Paths
AI emissions through 2030 depend on policy, technology, and energy choices. Explore four possible scenarios.
Current trajectory with moderate efficiency gains.
The Cost of Training
Training a single frontier model now produces as much CO₂ as thousands of transatlantic flights.
Flights = NY-London return. Sources: Strubell et al., Patterson et al., industry estimates.
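The flight-equivalence figures above reduce to simple division. A sketch with assumed inputs (both numbers are illustrative; published training-footprint estimates for frontier models vary by more than an order of magnitude):

```python
# Rough flight-equivalence arithmetic for one training run.
# Both inputs are assumptions for illustration only.
TRAINING_T_CO2 = 5000  # assumed frontier-model training footprint (tonnes CO2)
FLIGHT_T_CO2 = 1.0     # assumed per-passenger NY-London return (tonnes CO2)

flights = TRAINING_T_CO2 / FLIGHT_T_CO2
print(f"~{flights:,.0f} transatlantic return flights")
```

Swapping in any specific model's published estimate gives its bar on the chart above; the uncertainty in the training figure dominates the result.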
Provider Transparency
Corporate emissions may be up to 7.62x higher than reported, because market-based accounting lets companies offset grid electricity with renewable energy certificates.
Google
Published per-query energy data for Gemini; 33x efficiency gain in one year.
Microsoft
High. Carbon negative by 2030 goal. Launched zero-water cooling designs.
Amazon (AWS)
Medium. Largest corporate renewable buyer five years running.
Meta
Medium. Pursuing nuclear power for data centers.
OpenAI
Low. No dedicated sustainability report as of 2026.
Anthropic
Low. Claude scored highest eco-efficiency on AWS; no emissions data published.
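The gap between market-based and location-based accounting comes from how renewable energy certificates (RECs) are counted. A minimal sketch of the mechanism, with all figures assumed for illustration (they do not reproduce the 7.62x estimate, which depends on real consumption and coverage data):

```python
# Location-based accounting: consumption x local grid intensity.
# Market-based accounting: RECs zero out the covered share.
# All numbers below are illustrative assumptions.
ENERGY_MWH = 1_000_000    # assumed annual electricity consumption
GRID_T_CO2_PER_MWH = 0.4  # assumed local grid carbon intensity
REC_COVERAGE = 0.9        # assumed share of consumption matched with RECs

location_based = ENERGY_MWH * GRID_T_CO2_PER_MWH
market_based = ENERGY_MWH * (1 - REC_COVERAGE) * GRID_T_CO2_PER_MWH

print(f"location-based: {location_based:,.0f} t, "
      f"market-based: {market_based:,.0f} t, "
      f"ratio: {location_based / market_based:.0f}x")
```

With 90% REC coverage the reported figure shrinks 10x even though the same electrons came off the same grid, which is the crux of the transparency debate.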
What's Being Done
The picture isn't all bleak. Efficiency gains, clean energy deals, and regulation are shaping a better trajectory.
33x Efficiency Gain
Google achieved a 33x reduction in Gemini's per-query energy (May 2024 to May 2025) through hardware upgrades, distillation, and inference optimization.
Mixture-of-Experts
MoE architectures like DeepSeek-V3 activate only ~5% of parameters per query, dramatically cutting computation per inference.
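The ~5% figure falls out of the ratio of activated to total parameters. A sketch using DeepSeek-V3's publicly reported parameter counts (the energy saving per query is roughly proportional to this fraction, all else equal, which is an assumption about serving efficiency):

```python
# Mixture-of-Experts models route each token through a small subset
# of experts, so only a fraction of parameters do work per query.
# Figures are DeepSeek-V3's publicly reported parameter counts.
TOTAL_PARAMS_B = 671   # total parameters (billions)
ACTIVE_PARAMS_B = 37   # parameters activated per token (billions)

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"active per token: {active_fraction:.1%}")  # ~5.5%
```

A dense model of the same total size would run all 671B parameters on every token; the MoE router skips roughly 94% of that compute.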
Nuclear Renaissance
Three Mile Island may reopen in 2028 for Microsoft. Amazon and Google are also buying dedicated nuclear capacity.
Immersion Cooling
Servers submerged in dielectric fluid achieve PUE of 1.03 vs the 1.3 industry average — cutting cooling energy by up to 95%.
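PUE (Power Usage Effectiveness) is total facility energy divided by IT energy, so the non-IT overhead per unit of IT load is simply PUE − 1. A sketch comparing the two PUE figures from the text (note the 95% claim above refers to cooling specifically; the PUE arithmetic below covers all non-IT overhead, including power distribution losses):

```python
# PUE = total facility energy / IT energy.
# Non-IT overhead per unit of IT load is therefore (PUE - 1).
PUE_AIR = 1.30        # industry-average air cooling (from the text)
PUE_IMMERSION = 1.03  # immersion cooling (from the text)

overhead_air = PUE_AIR - 1        # 0.30 units of overhead per IT unit
overhead_imm = PUE_IMMERSION - 1  # 0.03
reduction = 1 - overhead_imm / overhead_air
print(f"non-IT overhead cut: {reduction:.0%}")  # 90% with these PUEs
```

The remaining gap to the quoted 95% is plausible if cooling dominates the overhead at PUE 1.3, since the residual 0.03 also covers non-cooling losses.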
EU Regulation
The EU Energy Efficiency Directive mandates PUE reporting for data centers >500 kW — the first major DC regulatory framework.
Cost Efficiency
Inference costs at GPT-3.5 level dropped 280x in two years (Stanford AI Index), making AI accessible while reducing per-query impact.
Infrastructure Evolution
Key hardware, grid, cooling, and policy milestones shaping AI's emissions trajectory.
NVIDIA V100 (2017): first tensor-core GPU; enabled large-scale AI training. 300W TDP.
NVIDIA A100 (2020): 2.5x training efficiency over V100. Became the workhorse of AI data centers.
NVIDIA H100 (2022): 3x inference throughput over A100 with transformer engine. 700W TDP.
NVIDIA Blackwell B200 (2024): 4x inference throughput over H100; ~2.5x better perf/watt. 1000W TDP (up from 700W).
Google TPU v6e, AWS Trainium2: 2–3x better perf/watt than general GPUs for transformer workloads.
Tech companies become largest corporate buyers of renewable energy globally.
Virginia (the largest data center market) faces power shortages. Irish data centers consume 21% of the national grid.
Microsoft signs Three Mile Island restart deal. Amazon/Google buy nuclear capacity.
Some providers building dedicated gas plants, undermining climate goals.
Small modular reactors targeted for DC-adjacent deployment by 2028–2030.
Direct-to-chip cooling reduces energy use 30–40% vs air cooling for high-density racks.
New designs eliminate evaporative cooling, drastically cutting water consumption.
Servers submerged in dielectric fluid; PUE approaches 1.03 (vs 1.3 industry avg).
Mandates PUE reporting for data centers >500 kW. First major DC regulation.
The EU AI Act includes environmental reporting requirements for large AI training runs.
Projections suggest data centers could consume 3–4% of global electricity by 2030 (up to ~1,300 TWh).
Sources: Stanford AI Index, SemiAnalysis, Uptime Institute, IEA
Sources & References
24 sources cited on this page. All data points are linked to their original source.
Curious about your own impact?
Calculate the energy, carbon, and water cost of your AI usage.
Calculate My Compute