Code Assistant

GPT-5.2 Codex Environmental Impact

Impact: Heavy (estimated)

Agentic coding model — 56.4% SWE-Bench Pro

Architecture: Code-optimised Transformer with agentic reasoning
Context: 256,000 tokens
Provider: OpenAI
Energy per query: 4.0 Wh
CO₂ per query: 1.7 g
Water per query: 15 mL
vs Google search: 13x more energy

Energy per query: 4.0 Wh (13x more than a Google search at 0.3 Wh)

CO₂ per query: 1.7 g, on the US East (Virginia) grid at 450 g CO₂/kWh

Water per query: 15 mL (~69 queries to fill 1 litre)

Processing location: Azure / Multi-cloud

Provider: OpenAI

Category: Code Assistant

Grid carbon intensity: 450 g CO₂/kWh (25% renewable)

How does GPT-5.2 Codex compare?

Ranked #107 of 152 models by energy per query

[Chart: energy per query, 0–4 Wh scale, comparing Codestral, Cursor, GitHub Copilot, and GPT-5.2 Codex against Google search (0.3 Wh)]

Detailed Breakdown

Energy Consumption

GPT-5.2 Codex is optimised for autonomous software engineering, achieving 56.4% on SWE-Bench Pro and 64.0% on Terminal-Bench 2.0. At ~4.0 Wh per agentic step, it performs multi-file edits, test execution, and debugging in extended sessions that may span dozens of steps.
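Because agentic sessions multiply the per-step cost, total session energy scales linearly with step count. A minimal sketch, using this page's ~4.0 Wh per-step estimate (the step counts below are illustrative assumptions, not measured values):

```python
# Per-step energy estimate from this page; not an official OpenAI figure.
PER_STEP_WH = 4.0

def session_energy_wh(steps: int, per_step_wh: float = PER_STEP_WH) -> float:
    """Estimated total energy (Wh) for an agentic session of `steps` steps."""
    return steps * per_step_wh

# A multi-file bug fix might span dozens of steps:
print(session_energy_wh(10))  # 40.0 Wh
print(session_energy_wh(30))  # 120.0 Wh
```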

Power Source & Carbon

Queries run on OpenAI's multi-cloud infrastructure, primarily Azure, where the regional grid averages 450 g CO₂/kWh. Extended coding sessions can sustain high GPU utilisation for minutes at a time.

Water Usage

Each agentic step consumes ~14.5 mL of water, primarily for data-centre cooling. A full bug-fix session with 10+ steps can consume over 140 mL.

About GPT-5.2 Codex

GPT-5.2 Codex is a code assistant model from OpenAI, released on January 14, 2026. It is an agentic coding model scoring 56.4% on SWE-Bench Pro. Each query uses 4.0 Wh of energy and produces 1.7 g of CO₂, 13x the energy of a Google search, reflecting the computational demands of agentic code assistance.

These figures are estimates derived from hardware specifications and API benchmarks — OpenAI has not published official energy data for GPT-5.2 Codex. Actual consumption may vary significantly depending on batching, quantisation, and infrastructure optimisations that we cannot observe from outside.

GPT-5.2 Codex in Context

85%
potential savings

The efficiency alternative

Devstral 2 performs the same type of task using just 0.60 Wh per query — 85% less energy than GPT-5.2 Codex. For a user sending 25 queries per day, switching would save 31.0 kWh per year.
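The savings figure above follows directly from the two per-query estimates. A quick sketch of the arithmetic (all inputs are this page's estimated values):

```python
# Per-query energy estimates from this page (Wh).
GPT52_CODEX_WH = 4.0
DEVSTRAL2_WH = 0.60
QUERIES_PER_DAY = 25

saving_wh_per_day = (GPT52_CODEX_WH - DEVSTRAL2_WH) * QUERIES_PER_DAY
saving_kwh_per_year = saving_wh_per_day * 365 / 1000
print(round(saving_kwh_per_year, 1))  # 31.0 kWh saved per year
```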

400 MWh
estimated daily

At global scale

With an estimated 10M+ daily users averaging 10 queries each, GPT-5.2 Codex consumes roughly 400 MWh of electricity per day — enough to power about 13,300 homes.
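The global-scale figure is the same per-query estimate multiplied out. A sketch, where the user count, query rate, and per-home consumption (assumed ~30 kWh/day) are this page's assumptions rather than measured data:

```python
# Assumed inputs from this page's global-scale estimate.
USERS = 10_000_000
QUERIES_PER_USER_PER_DAY = 10
WH_PER_QUERY = 4.0
HOME_KWH_PER_DAY = 30  # assumed average household consumption

daily_kwh = USERS * QUERIES_PER_USER_PER_DAY * WH_PER_QUERY / 1000
print(daily_kwh / 1000)                      # 400.0 MWh per day
print(round(daily_kwh / HOME_KWH_PER_DAY))   # ~13333 homes
```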

36.5 kWh
per year

Your yearly GPT-5.2 Codex footprint

At 25 queries per day, your annual GPT-5.2 Codex usage consumes 36.5 kWh — roughly what a fridge uses in a month. That produces 15.9 kg of CO₂.
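The yearly figure can be reproduced from the per-query estimates. Note that the rounded 1.7 g per-query value yields ~15.5 kg of CO₂; the page's headline 15.9 kg presumably uses an unrounded per-query figure:

```python
# Per-query estimates from this page.
WH_PER_QUERY = 4.0
G_CO2_PER_QUERY = 1.7  # rounded; the page's 15.9 kg implies a slightly higher value
QUERIES_PER_DAY = 25

yearly_kwh = QUERIES_PER_DAY * WH_PER_QUERY * 365 / 1000
yearly_kg_co2 = QUERIES_PER_DAY * G_CO2_PER_QUERY * 365 / 1000
print(yearly_kwh)               # 36.5 kWh
print(round(yearly_kg_co2, 1))  # 15.5 kg (vs 15.9 kg headline, rounding difference)
```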

What does your GPT-5.2 Codex usage cost the planet?

Use our calculator to estimate your personal environmental footprint based on how often you use GPT-5.2 Codex.

Calculate My Compute

Frequently Asked Questions

How much energy does GPT-5.2 Codex use per query?

Each GPT-5.2 Codex query consumes approximately 4.0 Wh of energy. This is 13x more than a traditional Google search (~0.3 Wh).

What is GPT-5.2 Codex's carbon footprint?

Based on the carbon intensity of the grid serving OpenAI's Azure / multi-cloud regions, each query produces approximately 1.7 g of CO₂. The grid in this region has a carbon intensity of 450 g CO₂/kWh with 25% renewable energy.

How much water does GPT-5.2 Codex use?

Each query consumes approximately 15 mL of water, primarily for cooling the data centres that process the request.

How does GPT-5.2 Codex compare to a Google search?

A GPT-5.2 Codex query uses roughly 13x as much energy as a Google search: approximately 0.3 Wh for a search versus 4.0 Wh for GPT-5.2 Codex.

Technical Details

Architecture

Code-optimised Transformer with agentic reasoning

Context window

256,000 tokens

Release date

2026-01-14

Open source

No

Training data cutoff

2025-12