
GPT-5.3 Codex Environmental Impact

Heavy · Estimated

Most capable agentic coding model — SoTA on SWE-Bench Pro

Architecture
Code-optimised Transformer with agentic reasoning
Context
256,000 tokens
Provider
OpenAI
Energy per query
4.5 Wh
CO₂ per query
2.0 g
Water per query
16 mL
vs Google search
15x more

Energy per query

4.5 Wh

15x more than a Google search (0.3 Wh)

CO₂ per query

2.0 g

US East (Virginia) grid (450 g CO₂/kWh)

Water per query

16 mL

~62 queries to fill 1 litre

Processing location

Azure / Multi-cloud

Provider

OpenAI

Category

Code Assistant

Grid carbon intensity

450 g CO₂/kWh (25% renewable)

How does GPT-5.3 Codex compare?

Ranked #110 of 152 models by energy per query

Chart: energy per query (0–8 Wh) for Codestral, Cursor, GitHub Copilot, and GPT-5.3 Codex, with a Google search (0.3 Wh) reference line.

Detailed Breakdown

Energy Consumption

GPT-5.3 Codex combines the Codex and GPT-5 stacks for state-of-the-art agentic coding. At ~4.5 Wh per agentic step, it is 25% faster than GPT-5.2 Codex and adds improved interactive steering, so users can redirect the agent while it works. The Spark variant runs at 1000+ tokens/sec on Cerebras hardware.

Power Source & Carbon

Runs on OpenAI's multi-cloud infrastructure. The Spark variant uses Cerebras wafer-scale chips for near-instant inference.

Water Usage

Each agentic step consumes ~16.2 mL of water, primarily for cooling the data centres that process the request.

About GPT-5.3 Codex

GPT-5.3 Codex is a code assistant model from OpenAI, released on February 5, 2026. It is OpenAI's most capable agentic coding model, with state-of-the-art results on SWE-Bench Pro. Each query uses 4.5 Wh of energy and produces 2.0 g of CO₂. That's 15x the energy of a Google search, reflecting the computational demands of agentic code assistance.

These figures are estimates derived from hardware specifications and API benchmarks — OpenAI has not published official energy data for GPT-5.3 Codex. Actual consumption may vary significantly depending on batching, quantisation, and infrastructure optimisations that we cannot observe from outside.
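As a rough illustration of how such an estimate can be built, the sketch below multiplies assumed accelerator power, the share of a serving replica attributable to one query, per-step latency, and data-centre overhead. Every number in it is an illustrative assumption rather than a published figure; it simply shows one way a value near 4.5 Wh could arise.

```python
# Illustrative per-query energy estimate from hardware assumptions.
# None of these figures are published by OpenAI; they are placeholders
# chosen to show the shape of the calculation.
gpu_power_w = 700        # assumed board power per accelerator, watts
gpus_per_replica = 8     # assumed accelerators serving one model replica
query_share = 0.12       # assumed fraction of the replica's work for one query (batching)
latency_s = 20           # assumed wall-clock seconds for one agentic step
pue = 1.2                # assumed data-centre power usage effectiveness

energy_wh = gpu_power_w * gpus_per_replica * query_share * (latency_s / 3600) * pue
print(f"~{energy_wh:.1f} Wh per query")  # ~4.5 Wh with these assumptions
```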

GPT-5.3 Codex in Context

87%
potential savings

The efficiency alternative

Devstral 2 performs the same type of task using just 0.60 Wh per query — 87% less energy than GPT-5.3 Codex. For a user sending 25 queries per day, switching would save 35.6 kWh per year.
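The savings figure follows directly from the per-query numbers quoted above; a minimal check using the page's 4.5 Wh and 0.60 Wh estimates and 25 queries per day:

```python
# Energy saved by switching from GPT-5.3 Codex (4.5 Wh) to Devstral 2 (0.60 Wh).
codex_wh, devstral_wh = 4.5, 0.60
queries_per_day = 25

saving_pct = (codex_wh - devstral_wh) / codex_wh * 100                          # ~87%
saving_kwh_per_year = (codex_wh - devstral_wh) * queries_per_day * 365 / 1000   # ~35.6 kWh

print(f"{saving_pct:.0f}% less energy, {saving_kwh_per_year:.1f} kWh saved per year")
```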

41.1 kWh
per year

Your yearly GPT-5.3 Codex footprint

At 25 queries per day, your annual GPT-5.3 Codex usage consumes 41.1 kWh — roughly what a fridge uses in a month. That produces 17.9 kg of CO₂.
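The annual figure is straightforward arithmetic on the per-query estimate; a quick check at 25 queries per day:

```python
# Annual energy from the 4.5 Wh per-query estimate at 25 queries/day.
wh_per_query = 4.5
queries_per_year = 25 * 365

annual_kwh = wh_per_query * queries_per_year / 1000
print(f"{annual_kwh:.1f} kWh per year")  # ~41.1 kWh
```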

What does your GPT-5.3 Codex usage cost the planet?

Use our calculator to estimate your personal environmental footprint based on how often you use GPT-5.3 Codex.


Frequently Asked Questions

How much energy does GPT-5.3 Codex use per query?

Each GPT-5.3 Codex query consumes approximately 4.5 Wh of energy. This is 15x more than a traditional Google search (~0.3 Wh).

What is GPT-5.3 Codex's carbon footprint?

Based on the carbon intensity of the grid serving OpenAI's Azure / Multi-cloud infrastructure, each query produces approximately 2.0 g of CO₂. This grid has a carbon intensity of 450 g CO₂/kWh with 25% renewable energy.
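The per-query emissions figure is simply the energy estimate multiplied by the grid's carbon intensity; a one-line check with the numbers above:

```python
# CO₂ per query = energy per query × grid carbon intensity.
wh_per_query = 4.5
grid_g_co2_per_kwh = 450

g_co2_per_query = wh_per_query / 1000 * grid_g_co2_per_kwh
print(f"{g_co2_per_query:.1f} g CO₂ per query")  # ~2.0 g
```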

How much water does GPT-5.3 Codex use?

Each query consumes approximately 16 mL of water, primarily used for cooling the data centres that process the request.
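The "~62 queries to fill 1 litre" figure quoted earlier is just the reciprocal of this per-query estimate:

```python
# Queries needed to consume one litre of water at ~16 mL per query.
ml_per_query = 16
queries_per_litre = 1000 / ml_per_query
print(f"~{queries_per_litre:.0f} queries per litre")  # ~62
```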

How does GPT-5.3 Codex compare to a Google search?

A GPT-5.3 Codex query uses roughly 15x more energy than a Google search. A Google search uses approximately 0.3 Wh, while GPT-5.3 Codex uses 4.5 Wh.

Technical Details

Architecture

Code-optimised Transformer with agentic reasoning

Context window

256,000 tokens

Release date

2026-02-05

Open source

No

Training data cutoff

2026-01