The Global Impact of AI
AI is reshaping global energy demand. Data centers consumed an estimated 415 TWh of electricity in 2024 and are projected to reach 945 TWh by 2030, roughly Japan's entire annual electricity consumption.
| Metric | Value | Context |
|---|---|---|
| Data center electricity (2024) | 415 TWh | ~1.5% of global electricity; more than South Africa's entire annual consumption |
| Projected data center electricity (2030) | 945 TWh | More than doubling in 6 years; equivalent to Japan's total electricity consumption |
| US data center water (2023) | 64 billion liters | Could quadruple by 2028; enough to fill 25,600 Olympic swimming pools |
| Daily AI users | 600 million | More than the population of the European Union |
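The growth rate implied by the two electricity figures above can be checked with a quick calculation: going from 415 TWh (2024) to 945 TWh (2030) requires roughly 15% compound growth per year.

```python
# Back-of-envelope check on the projection above: what annual growth
# rate takes data centers from 415 TWh (2024) to 945 TWh (2030)?

def implied_cagr(start_twh: float, end_twh: float, years: int) -> float:
    """Compound annual growth rate implied by two endpoint values."""
    return (end_twh / start_twh) ** (1 / years) - 1

cagr = implied_cagr(415, 945, 2030 - 2024)
print(f"Implied growth: {cagr:.1%} per year")  # roughly 14.7% per year
```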
Projected Data Center & AI Energy Demand
[Chart: AI Share (TWh) and Total Data Center (TWh). Source: IEA, Goldman Sachs projections]
Carbon Intensity by Region
The same AI query can produce 15x more CO2 depending on where the data center is located.
[Chart: Renewable % and g CO2/kWh by region]
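The 15x spread follows directly from multiplying a fixed per-query energy cost by different grid carbon intensities. A minimal sketch, where the 0.3 Wh per-query figure and the grid intensity values are illustrative assumptions rather than data from the chart:

```python
# Same query energy, different grids: the spread in CO2 comes entirely
# from the grid's carbon intensity. All numbers below are illustrative
# assumptions, not measurements.

QUERY_ENERGY_KWH = 0.0003  # ~0.3 Wh per query (assumed)

GRID_INTENSITY = {         # g CO2 per kWh (illustrative values)
    "hydro-heavy grid": 30,
    "EU average": 250,
    "coal-heavy grid": 450,
}

for region, g_per_kwh in GRID_INTENSITY.items():
    grams = QUERY_ENERGY_KWH * g_per_kwh
    print(f"{region}: {grams * 1000:.1f} mg CO2 per query")
```

With these illustrative values, the coal-heavy grid emits 15x more per query than the hydro-heavy one (450 / 30), matching the multiplier quoted above.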
Hardware & Infrastructure Impact
| Metric | Value | Context |
|---|---|---|
| CO2 per GPU manufactured | 200 kg | Equivalent to driving ~500 miles |
| GPU obsolescence cycle | 2.5 years | Rapid hardware turnover |
| E-waste recycling rate | 25% | 75% of e-waste is not recycled |
| Inference share of AI energy | 85% | Running queries, not training, dominates |
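The driving comparison above can be sanity-checked against the EPA's figure of roughly 0.4 kg CO2 per mile for an average gasoline passenger car:

```python
# Sanity check on the "200 kg CO2 per GPU = driving 500 miles" claim,
# using the EPA's ~0.4 kg CO2 per mile for an average gasoline car.

GPU_MANUFACTURING_CO2_KG = 200
CO2_PER_MILE_KG = 0.4  # EPA average is ~404 g/mile; rounded here

equivalent_miles = GPU_MANUFACTURING_CO2_KG / CO2_PER_MILE_KG
print(f"Equivalent driving: {equivalent_miles:.0f} miles")
```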
AI Provider Sustainability Scorecard
Note: Corporate emissions may be up to 7.62x higher than reported, because market-based accounting (which credits purchased renewable certificates) yields lower figures than location-based accounting of the actual grid mix.
| Provider | Renewable Energy | Emissions | Water | Transparency | Notes |
|---|---|---|---|---|---|
| Google | 100% matched | +50% since 2019 | +88% since 2019 | High | Published per-query energy data for Gemini. 33x efficiency gain in one year. |
| Microsoft | 34 GW contracted | +23.4% since 2020 | WUE improved 39% | High | Carbon negative by 2030 goal. Launched zero-water cooling designs. |
| Amazon (AWS) | 100% matched (claimed) | Rising in 2024 | Not disclosed in detail | Medium | Largest corporate renewable buyer 5 years running. |
| Meta | Grid-based matching | Growing with AI investment | Not disclosed in detail | Medium | Pursuing nuclear power for data centers. |
| OpenAI | Not published | Not published | Not published | Low | No dedicated sustainability report as of 2026. |
| Anthropic | Not published | Not published | Committed to water-efficient cooling | Low | Claude scored highest eco-efficiency on AWS. No emissions data published. |
Key Facts
- Inference dominates: 85% of AI energy goes to running queries, not training models.
- Power demand growth: Goldman Sachs projects a 165% increase in data center power demand by 2030.
- Fossil fuel reliance: ~60% of global data center energy still comes from fossil fuels (30% coal, 26% natural gas).
- Water crisis: US data center water use could quadruple by 2028 from 17 billion gallons (64 billion liters) in 2023.
- Efficiency gains: inference costs at GPT-3.5 level performance dropped 280x in two years (Stanford AI Index).
- Nuclear renaissance: Three Mile Island may reopen in 2028 to power Microsoft data centers.
Where the Energy Goes
Inference — responding to user queries — now accounts for 80–90% of AI's total energy consumption. Every question you ask contributes to the larger slice.
What's Being Done
Google achieved a 33x reduction in Gemini's per-query energy between May 2024 and May 2025, through a combination of hardware upgrades (TPU v5e/v6e), model distillation, and inference optimisation.
Mixture-of-Experts architectures (like DeepSeek-V3) activate only ~5% of their parameters per query, dramatically reducing computation. This architectural shift is one of the most promising avenues for reducing inference energy.
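The "~5%" figure follows from DeepSeek-V3's published sizes: 671B total parameters, of which 37B are active per token. A minimal sketch of the ratio:

```python
# Why Mixture-of-Experts cuts per-query compute: only a small fraction
# of parameters is active for each token. Sizes are DeepSeek-V3's
# published figures (671B total, 37B active per token).

TOTAL_PARAMS = 671e9
ACTIVE_PARAMS = 37e9

active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Active per token: {active_fraction:.1%}")  # ~5.5%
```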
Hardware efficiency improves approximately 2x every 18 months, but total demand grows faster. This is a classic example of the Jevons paradox — as AI becomes more efficient, we use more of it, and total energy consumption continues to rise.
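The Jevons dynamic described above can be sketched numerically: per-query efficiency doubling every 18 months is equivalent to about 59% improvement per year, so if usage grows any faster than that, total energy still rises. The 100% annual usage growth below is an illustrative assumption, not a forecast.

```python
# Jevons paradox sketch: efficiency doubles every 18 months, but if
# query volume grows faster (here an assumed 100%/year, i.e. doubling
# annually), total energy consumption still climbs.

def total_energy(years: float, usage_growth: float = 1.0) -> float:
    """Relative total energy vs today (1.0 = current level)."""
    efficiency_gain = 2 ** (years / 1.5)   # 2x every 18 months
    usage = (1 + usage_growth) ** years    # compounding query volume
    return usage / efficiency_gain

for y in (0, 3, 6):
    print(f"Year {y}: {total_energy(y):.2f}x today's energy")
```

Under these assumptions, total energy doubles by year 3 and quadruples by year 6 despite a 16x efficiency improvement over the same period.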
Curious about your own impact?
Calculate your AI footprint →