
Mistral Large 3 Environmental Impact


Leading open-weight frontier model — 675B total / 41B active, Apache 2.0

Architecture: Sparse Mixture-of-Experts (multimodal, multilingual)
Parameters: 675B
Context: 256,000 tokens
Provider: Mistral AI

Energy per query

1.5 Wh

5x more than a Google search (0.3 Wh)

CO2 per query

0.95 g

France grid (50 gCO₂/kWh)

Water per query

6 mL

~161 queries to fill 1 litre

Processing location

Mistral AI (France) / self-hosted

Provider

Mistral AI

Category

Text / Chat

Grid carbon intensity

50 g CO2/kWh (90% renewable)

How does Mistral Large 3 compare?

Ranked #78 of 152 models by energy per query

[Chart: energy per query (0–1.6 Wh) comparing LLaMA 3.2 1B, Gemini 1.5 Pro, GPT-4.1 Nano, and Mistral Large 3, with Google search (0.3 Wh) as a baseline]

Detailed Breakdown

Energy Consumption

Mistral Large 3 is the leading open-weight frontier model at 675B total parameters, with only 41B active per token thanks to its sparse Mixture-of-Experts architecture. At ~1.5 Wh per query, it is remarkably efficient for its capability level. The model is multimodal, multilingual, and licensed under Apache 2.0.
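The efficiency claim above comes down to the sparse activation ratio. A minimal sketch of the arithmetic, using the parameter counts from this page (the dense-model comparison is a rough FLOPs-scaling assumption, not a measured figure):

```python
# Why sparse MoE inference is cheaper than a dense model of the same
# size: only a fraction of the parameters are active per token.
# Parameter counts are from this page; the dense comparison is an
# illustrative first-order FLOPs estimate.
TOTAL_PARAMS_B = 675   # total parameters (billions)
ACTIVE_PARAMS_B = 41   # parameters active per token (billions)

active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
dense_flop_ratio = TOTAL_PARAMS_B / ACTIVE_PARAMS_B

print(f"Active per token: {active_fraction:.1%} of total parameters")
print(f"A dense 675B model would need ~{dense_flop_ratio:.0f}x the compute per token")
```

Roughly 6% of the weights do the work on each token, which is why per-query energy stays low despite the 675B total size.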

Power Source & Carbon

Mistral's own API runs from France (50 g CO2/kWh — among the cleanest grids globally due to nuclear power). Self-hosted deployments benefit from the MoE architecture requiring less active compute.

Water Usage

Water use is roughly 6.2 mL per query, primarily for data-centre cooling. European data centres generally have good water efficiency.

About Mistral Large 3

Mistral Large 3 is an open-source text and chat model from Mistral AI, released on 2 December 2025, that runs well below the category average for energy consumption at 1.5 Wh per query. Because its weights are publicly available, it can be self-hosted on any infrastructure, meaning its carbon footprint depends entirely on where and how you choose to run it. With 675B total parameters and 41B active per token, it is the leading open-weight frontier model, licensed under Apache 2.0.

Mistral Large 3 benefits from running in France, one of the cleaner grid regions in our dataset at 50 gCO₂/kWh with 90% renewable energy. The same model running in a coal-heavy region would produce significantly more carbon per query.
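A sketch of that grid sensitivity, assuming per-query operational emissions scale linearly with grid carbon intensity. The non-French intensities are assumed, illustrative values, and this covers grid electricity only; the page's headline 0.95 g estimate presumably folds in additional lifecycle factors.

```python
# Operational CO2 per query under different grid carbon intensities,
# assuming emissions are proportional to electricity consumed.
ENERGY_WH_PER_QUERY = 1.5  # from this page

# g CO2 per kWh. France's figure is from this page; the others are
# assumed, illustrative values for comparison.
grids = {
    "France (nuclear-heavy)": 50,
    "EU average (assumed)": 250,
    "Coal-heavy grid (assumed)": 700,
}

co2_per_query = {
    region: ENERGY_WH_PER_QUERY / 1000 * intensity
    for region, intensity in grids.items()
}

for region, grams in co2_per_query.items():
    print(f"{region}: {grams:.3f} g CO2 per query")
```

Under these assumptions the same query on a coal-heavy grid emits about 14x what it does on France's grid.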

These figures are estimates derived from hardware specifications and API benchmarks — Mistral AI has not published official energy data for Mistral Large 3. Actual consumption may vary significantly depending on batching, quantisation, and infrastructure optimisations that we cannot observe from outside.

Mistral Large 3 in Context

13.7 kWh
per year

Your yearly Mistral Large 3 footprint

At 25 queries per day, your annual Mistral Large 3 usage consumes 13.7 kWh — roughly what a fridge uses in a month. That produces 8.7 kg of CO₂.
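The annual figures above can be reproduced directly from the per-query estimates on this page:

```python
# Annual footprint from per-query estimates (all figures from this page).
QUERIES_PER_DAY = 25
ENERGY_WH = 1.5   # Wh per query
CO2_G = 0.95      # g CO2 per query
WATER_ML = 6.0    # mL per query

queries_per_year = QUERIES_PER_DAY * 365
energy_kwh = queries_per_year * ENERGY_WH / 1000
co2_kg = queries_per_year * CO2_G / 1000
water_l = queries_per_year * WATER_ML / 1000

print(f"Energy: {energy_kwh:.1f} kWh/year")  # ~13.7 kWh
print(f"CO2:    {co2_kg:.1f} kg/year")       # ~8.7 kg
print(f"Water:  {water_l:.1f} L/year")       # ~55 L
```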

Key Insights

Runs on a 90% renewable grid — among the cleanest AI inference locations
Open-source weights — can be self-hosted on infrastructure you control

What does your Mistral Large 3 usage cost the planet?

Use our calculator to estimate your personal environmental footprint based on how often you use Mistral Large 3.


Frequently Asked Questions

How much energy does Mistral Large 3 use per query?

Each Mistral Large 3 query consumes approximately 1.5 Wh of energy. This is 5x more than a traditional Google search (~0.3 Wh).

What is Mistral Large 3's carbon footprint?

Based on the carbon intensity of the French grid where Mistral AI processes queries, each query produces approximately 0.95 g of CO2. The grid in this region has a carbon intensity of 50 g CO2/kWh with 90% renewable energy. For self-hosted deployments, the footprint depends on the local grid.

How much water does Mistral Large 3 use?

Each query consumes approximately 6 mL of water, primarily used for cooling the data centers that process the request.

How does Mistral Large 3 compare to a Google search?

A Mistral Large 3 query uses 5x more energy than a Google search: approximately 1.5 Wh versus 0.3 Wh.

Technical Details

Architecture

Sparse Mixture-of-Experts (multimodal, multilingual)

Parameters

675B

Context window

256,000 tokens

Release date

2025-12-02

Open source

Yes

Training data cutoff

2025-11