o3 (Reasoning) Environmental Impact
Advanced reasoning model — very high energy use
- Architecture: Transformer with chain-of-thought reasoning (decoder-only)
- Context: 200,000 tokens
- Provider: OpenAI
Energy per query
21.4 Wh
71x more than a Google search (0.3 Wh)
CO2 per query
10.0 g
US East (Virginia) grid (450 gCO₂/kWh)
Water per query
100 mL
~10 queries to fill 1 litre
Processing location
Azure US East / Sweden
Provider
OpenAI
Category
Text / Chat
Grid carbon intensity
450 g CO2/kWh (25% renewable)
How does o3 (Reasoning) compare?
Detailed Breakdown
Energy Consumption
The o3 reasoning model consumes approximately 21.4 Wh per query — about 71x more energy than a Google search. Reasoning models perform multiple internal inference passes ("chain-of-thought" steps) before producing a final answer, which multiplies GPU compute time significantly compared to standard chat models.
Power Source & Carbon
Runs on Microsoft Azure's infrastructure, primarily in the US East region. The Virginia data center corridor, where many Azure facilities are concentrated, has a carbon intensity of approximately 450 g CO2/kWh — significantly above the US average of 380 g CO2/kWh — due to reliance on natural gas and remaining coal generation.
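The per-query carbon figure follows directly from the two numbers above: energy per query times grid carbon intensity. A minimal sketch using the values stated on this page (the rounding of ~9.6 g up to the 10.0 g headline figure is an assumption about how that number was derived):

```python
# Per-query CO2 estimate: energy (Wh) x grid carbon intensity (g CO2/kWh).
ENERGY_WH_PER_QUERY = 21.4    # o3 energy per query (from this page)
GRID_G_CO2_PER_KWH = 450      # Azure US East grid intensity (from this page)

co2_g_per_query = ENERGY_WH_PER_QUERY / 1000 * GRID_G_CO2_PER_KWH
print(f"{co2_g_per_query:.2f} g CO2 per query")  # prints "9.63 g CO2 per query"
```

This lands at ~9.6 g, consistent with the ~10.0 g per-query figure quoted above.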
Water Usage
With ~100 mL per query, o3 uses roughly a small glass of water for each interaction. This high water usage is a direct consequence of the extended GPU runtime required for multi-step reasoning.
About o3 (Reasoning)
o3 represents a fundamentally different kind of AI energy consumption. Unlike standard chat models that respond in under a second, o3 'thinks' — running extended chains of internal reasoning that can take 30 seconds or more. That deliberation is what makes it capable of PhD-level maths and complex code generation, but it comes at a staggering 50-100x energy premium over models like GPT-4o. It is the clearest example of a tradeoff that will define AI sustainability: the most capable models are often the most expensive to run, and the gap is widening.
o3 (Reasoning) in Context
The efficiency alternative
Gemini Nano performs the same type of task using just 0.01 Wh per query, over 99.9% less energy than o3 (Reasoning). For a user sending 25 queries per day, switching would save roughly 195.2 kWh per year.
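The annual savings figure can be reproduced from the two per-query numbers. A quick sketch (the 25 queries/day usage pattern is the one assumed on this page):

```python
O3_WH = 21.4      # o3 energy per query (from this page)
NANO_WH = 0.01    # Gemini Nano energy per query (from this page)
QUERIES_PER_DAY = 25

annual_saving_kwh = QUERIES_PER_DAY * 365 * (O3_WH - NANO_WH) / 1000
print(f"{annual_saving_kwh:.1f} kWh saved per year")  # prints "195.2 kWh saved per year"
```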
At global scale
With an estimated 10M+ daily users averaging 10 queries each, o3 (Reasoning) consumes roughly 2,140 MWh of electricity per day, enough to power about 71,000 homes.
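The global-scale estimate is straightforward multiplication. A sketch using the page's stated assumptions (10M daily users, 10 queries each); the ~30 kWh/day figure for an average US home is an additional assumption, not from this page:

```python
USERS = 10_000_000        # estimated daily users (from this page)
QUERIES_PER_USER = 10     # average queries per user per day (from this page)
WH_PER_QUERY = 21.4       # o3 energy per query (from this page)

daily_mwh = USERS * QUERIES_PER_USER * WH_PER_QUERY / 1_000_000  # Wh -> MWh
print(f"{daily_mwh:.0f} MWh per day")  # prints "2140 MWh per day"

# Assumed average US household consumption of ~30 kWh/day (not from this page).
homes_powered = daily_mwh * 1000 / 30
print(f"~{homes_powered:,.0f} homes")  # on the order of 71,000 homes
```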
Your yearly o3 (Reasoning) footprint
At 25 queries per day, your annual o3 (Reasoning) usage consumes about 195.3 kWh, a meaningful fraction of typical household electricity. That produces roughly 91.3 kg of CO₂.
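The yearly footprint follows from the per-query figures quoted elsewhere on this page (21.4 Wh and ~10.0 g CO₂ per query, at 25 queries per day):

```python
QUERIES_PER_DAY = 25
WH_PER_QUERY = 21.4        # o3 energy per query (from this page)
G_CO2_PER_QUERY = 10.0     # o3 CO2 per query (from this page)

queries_per_year = QUERIES_PER_DAY * 365
annual_kwh = queries_per_year * WH_PER_QUERY / 1000       # Wh -> kWh
annual_co2_kg = queries_per_year * G_CO2_PER_QUERY / 1000 # g -> kg
print(f"{annual_kwh:.1f} kWh, {annual_co2_kg:.1f} kg CO2 per year")
# prints "195.3 kWh, 91.2 kg CO2 per year"
```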
Key Insights
OpenAI o3 Family
How energy efficiency has evolved across versions.
What does your o3 (Reasoning) usage cost the planet?
Use our calculator to estimate your personal environmental footprint based on how often you use o3 (Reasoning).
Calculate My Compute
Frequently Asked Questions
How much energy does o3 (Reasoning) use per query?
Each o3 (Reasoning) query consumes approximately 21.4 Wh of energy. This is 71x more than a traditional Google search (~0.3 Wh).
What is o3 (Reasoning)'s carbon footprint?
Based on the carbon intensity of Azure US East / Sweden, each query produces approximately 10.0 g of CO2. The grid in this region has a carbon intensity of 450 g CO2/kWh with 25% renewable energy.
How much water does o3 (Reasoning) use?
Each query consumes approximately 100 mL of water, primarily used for cooling the data centers that process the request.
How does o3 (Reasoning) compare to a Google search?
An o3 (Reasoning) query uses 71x more energy than a Google search: a Google search consumes approximately 0.3 Wh, while o3 (Reasoning) uses 21.4 Wh.
Technical Details
Architecture
Transformer with chain-of-thought reasoning (decoder-only)
Context window
200,000 tokens
Release date
2025-01-31
Open source
No
Training data cutoff
2024-10