Text / Chat

o3-pro Environmental Impact

Very heavy (Estimated)

Extended reasoning version of o3 for complex problems

Architecture
Transformer with extended chain-of-thought reasoning
Context
200,000 tokens
Provider
OpenAI
35.0 Wh
Energy per query
15.3 g
CO₂ per query
130 mL
Water per query
117x more than
vs Google search

Energy per query

35.0 Wh

117x more than a Google search (0.3 Wh)

CO₂ per query

15.3 g

US East (Virginia) grid (450 gCO₂/kWh)

Water per query

130 mL

~8 queries to fill 1 litre

Processing location

Azure / Multi-cloud

Provider

OpenAI

Category

Text / Chat

Grid carbon intensity

450 g CO₂/kWh (25% renewable)
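The headline figures above follow from simple arithmetic on the page's stated assumptions (35.0 Wh per query, a 450 g CO₂/kWh grid, a 0.3 Wh Google-search baseline, 130 mL of cooling water). A minimal sketch; note that a straight 450 g/kWh calculation gives 15.75 g, slightly above the page's 15.3 g estimate, which implies a somewhat lower effective intensity:

```python
# Back-of-envelope per-query figures for o3-pro, using the page's
# stated assumptions (these are estimates, not measurements).
ENERGY_WH = 35.0          # energy per query (Wh)
GRID_G_PER_KWH = 450      # US East grid carbon intensity (g CO2/kWh)
GOOGLE_SEARCH_WH = 0.3    # baseline Google search energy (Wh)
WATER_ML = 130            # cooling water per query (mL)

co2_g = ENERGY_WH / 1000 * GRID_G_PER_KWH   # grams CO2 per query
vs_google = ENERGY_WH / GOOGLE_SEARCH_WH    # energy ratio vs a search
queries_per_litre = 1000 / WATER_ML         # queries to use 1 L of water

print(f"{co2_g:.2f} g CO2, {vs_google:.0f}x a Google search, "
      f"~{queries_per_litre:.0f} queries per litre")
```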

How does o3-pro compare?

Ranked #132 of 152 models by energy per query

[Bar chart: energy per query, 0 Wh to 36 Wh, comparing LLaMA 3.2 1B, Gemini 1.5 Pro, GPT-4.1 Nano, and o3-pro, with Google search (0.3 Wh) as a reference line]

Detailed Breakdown

Energy Consumption

o3-pro uses significantly extended chain-of-thought reasoning — potentially running for minutes on a single query. At ~35 Wh per complex query, it is among the most energy-intensive models per query. It achieves the highest scores on mathematical and scientific reasoning benchmarks, but at enormous computational cost.

Power Source & Carbon

Runs on dedicated high-compute GPU clusters. The extended thinking time means sustained GPU utilisation far beyond typical query patterns.

Water Usage

At ~130 mL per query, o3-pro consumes roughly a small glass of water per complex reasoning task.

About o3-pro

o3-pro is a text and chat model from OpenAI, released on June 10, 2025, as an extended reasoning version of o3 for complex problems. Each query uses an estimated 35.0 Wh of energy and produces 15.3 g of CO₂. That's 117x the energy of a Google search, reflecting the computational demands of extended reasoning.

These figures are estimates derived from hardware specifications and API benchmarks — OpenAI has not published official energy data for o3-pro. Actual consumption may vary significantly depending on batching, quantisation, and infrastructure optimisations that we cannot observe from outside.

o3-pro in Context

99.97%
potential savings

The efficiency alternative

Gemini Nano performs the same type of task using just 0.01 Wh per query, roughly 99.97% less energy than o3-pro (about 3,500x less). For a user sending 25 queries per day, switching would save about 319.3 kWh per year.
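The savings figure is straightforward to check under the page's assumptions (25 queries/day, 35.0 Wh for o3-pro, 0.01 Wh for the lighter alternative):

```python
# Annual savings from switching to a lighter model, under the page's
# assumed usage of 25 queries/day (all figures are estimates).
O3_PRO_WH = 35.0
ALT_WH = 0.01            # Gemini Nano per-query energy (Wh), as stated
QUERIES_PER_DAY = 25

yearly_kwh = QUERIES_PER_DAY * 365 * O3_PRO_WH / 1000
alt_yearly_kwh = QUERIES_PER_DAY * 365 * ALT_WH / 1000
savings_kwh = yearly_kwh - alt_yearly_kwh
savings_pct = 100 * (1 - ALT_WH / O3_PRO_WH)

print(f"~{savings_kwh:.1f} kWh/yr saved ({savings_pct:.2f}% less energy)")
```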

3,500 MWh
estimated daily

At global scale

With an estimated 10M+ daily users averaging 10 queries each, o3-pro consumes roughly 3,500 MWh of electricity per day, enough to power about 116,667 homes.

319.4 kWh
per year

Your yearly o3-pro footprint

At 25 queries per day, your annual o3-pro usage consumes 319.4 kWh — a meaningful fraction of household electricity. That produces 139.6 kg of CO₂.
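The yearly figures are the per-query estimates scaled up, using the page's 15.3 g CO₂ per query rather than a fresh grid-intensity calculation:

```python
# Personal annual footprint at 25 queries/day (estimates).
QUERIES_PER_DAY = 25
WH_PER_QUERY = 35.0
CO2_G_PER_QUERY = 15.3    # page's per-query CO2 estimate (g)

annual_kwh = QUERIES_PER_DAY * 365 * WH_PER_QUERY / 1000
annual_co2_kg = QUERIES_PER_DAY * 365 * CO2_G_PER_QUERY / 1000

print(f"{annual_kwh:.1f} kWh/yr, {annual_co2_kg:.1f} kg CO2/yr")
```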

Key Insights

Uses 11x more energy than the category average — reasoning models are inherently compute-intensive

OpenAI o3 Family

What does your o3-pro usage cost the planet?

Use our calculator to estimate your personal environmental footprint based on how often you use o3-pro.

Calculate My Compute

Frequently Asked Questions

How much energy does o3-pro use per query?

Each o3-pro query consumes approximately 35.0 Wh of energy. This is 117x more than a traditional Google search (~0.3 Wh).

What is o3-pro's carbon footprint?

Based on the carbon intensity of the grid serving its Azure / Multi-cloud data centers, each query produces approximately 15.3 g of CO₂. The grid in this region has a carbon intensity of 450 g CO₂/kWh with 25% renewable energy.

How much water does o3-pro use?

Each query consumes approximately 130 mL of water, primarily used for cooling the data centers that process the request.

How does o3-pro compare to a Google search?

An o3-pro query uses 117x more energy than a Google search: a Google search uses approximately 0.3 Wh, while o3-pro uses 35.0 Wh.

Technical Details

Architecture

Transformer with extended chain-of-thought reasoning

Context window

200,000 tokens

Release date

2025-06-10

Open source

No

Training data cutoff

2025-05