Energy per query: 0.55 Wh
CO2 per query: 0.19 g
Water per query: 1 mL
Processing location: AWS US East (Virginia) / US West (Oregon)
Provider: Anthropic
Category: Text / Chat
Grid carbon intensity: 450 g CO2/kWh (25% renewable)

How does Claude 3.5 Haiku compare?

[Bar chart: energy per query (Wh, scale 0–0.6) for LLaMA 3.2 1B, Gemini 1.5 Pro, GPT-4.1 Nano, and Claude 3.5 Haiku]

Detailed Breakdown

Energy Consumption

Claude 3.5 Haiku is Anthropic's most efficient model, consuming approximately 0.55 Wh per query, roughly one fifth the energy of Claude 3.7 Sonnet. Its API pricing is estimated at ~0.2x Sonnet's, suggesting proportionally lower compute requirements.
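
A quick sanity check on that ratio, using only the figures quoted above (the Sonnet number is implied by the "roughly 5x" claim on this page, not independently measured):

```python
# Implied Claude 3.7 Sonnet energy from this page's "roughly 5x" claim.
haiku_wh = 0.55                   # per-query energy stated above
implied_sonnet_wh = haiku_wh * 5  # ~2.75 Wh per query (implied, not measured)
```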

Power Source & Carbon

Claude 3.5 Haiku runs on the same AWS infrastructure as other Claude models. Its lower compute requirements make it a significantly greener choice for tasks that do not require the full capabilities of Sonnet or Opus.

Water Usage

At approximately 1 mL per query, Claude 3.5 Haiku's water footprint is negligible and comparable to that of small open-source models.

What does your Claude 3.5 Haiku usage cost the planet?

Use our calculator to estimate your personal environmental footprint based on how often you use Claude 3.5 Haiku.

Calculate My Compute
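
For a rough back-of-the-envelope version of that calculator, here is a minimal Python sketch using this page's per-query figures (`estimate_footprint` and the monthly scaling are our own illustration, not the site's actual implementation):

```python
# Rough personal-footprint estimate based on this page's per-query figures.
# estimate_footprint is a hypothetical helper, not the site's calculator.
PER_QUERY = {"energy_wh": 0.55, "co2_g": 0.19, "water_ml": 1.0}

def estimate_footprint(queries_per_day: float, days: int = 30) -> dict:
    """Scale the per-query figures over a usage period (default: one month)."""
    n = queries_per_day * days
    return {
        "queries": n,
        "energy_kwh": round(n * PER_QUERY["energy_wh"] / 1000, 3),
        "co2_kg": round(n * PER_QUERY["co2_g"] / 1000, 3),
        "water_l": round(n * PER_QUERY["water_ml"] / 1000, 3),
    }

# Example: 50 Claude 3.5 Haiku queries per day for a month
print(estimate_footprint(50))
# {'queries': 1500, 'energy_kwh': 0.825, 'co2_kg': 0.285, 'water_l': 1.5}
```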

Frequently Asked Questions

How much energy does Claude 3.5 Haiku use per query?

Each Claude 3.5 Haiku query consumes approximately 0.55 Wh of energy. This is roughly 1.8x the energy of a traditional Google search (~0.3 Wh).

What is Claude 3.5 Haiku's carbon footprint?

Based on the carbon intensity of AWS US East (Virginia) / US West (Oregon), each query produces approximately 0.19 g of CO2. The grid in this region has a carbon intensity of 450 g CO2/kWh with 25% renewable energy.
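The stated figures line up if the 25% renewable share is treated as carbon-free, so that only the remaining 75% of each query's energy is charged at the grid's intensity. That offset interpretation is our assumption, not stated explicitly on this page, but it reproduces the headline number:

```python
# Reconcile the per-query CO2 figure with the energy and grid numbers above.
# Assumption: the 25% renewable share counts as carbon-free, so only 75%
# of each query's energy draws on the 450 g CO2/kWh grid mix.
energy_wh = 0.55            # energy per query, Wh
grid_g_per_kwh = 450.0      # grid carbon intensity, g CO2/kWh
renewable_share = 0.25      # fraction assumed carbon-free

co2_g = (energy_wh / 1000) * grid_g_per_kwh * (1 - renewable_share)
print(f"{co2_g:.2f} g CO2 per query")  # -> 0.19 g, matching the figure above
```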

How much water does Claude 3.5 Haiku use?

Each query consumes approximately 1 mL of water, primarily used for cooling the data centers that process the request.

How does Claude 3.5 Haiku compare to a Google search?

A Claude 3.5 Haiku query uses roughly 1.8x the energy of a Google search: approximately 0.55 Wh versus 0.3 Wh.

Technical Details

Architecture: Dense Transformer (decoder-only)
Context window: 200,000 tokens
Release date: 2024-11-04
Open source: No