AI's Water Footprint Explained

6 min read

How AI data centres consume water for cooling, the difference between direct and indirect water use, and what's being done to reduce the impact.

Why AI needs water

Computers generate heat. The more powerful the computer, the more heat it produces. A modern AI GPU like NVIDIA's H100 has a thermal design power of 700 watts — it converts 700 joules of electrical energy into heat every second. Pack thousands of these chips into a data centre, and you face a serious cooling challenge.
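The scale of the problem is easy to estimate from the TDP figure above. A quick back-of-the-envelope calculation (the 700 W figure is from the text; the cluster size is an illustrative assumption):

```python
# Rough heat output of a hypothetical GPU cluster.
# 700 W TDP for an H100 is cited in the text above;
# the 10,000-GPU cluster size is an illustrative assumption.
GPU_TDP_WATTS = 700
NUM_GPUS = 10_000  # hypothetical cluster

heat_watts = GPU_TDP_WATTS * NUM_GPUS
print(f"Heat output: {heat_watts / 1e6:.1f} MW")  # 7.0 MW
```

Seven megawatts of continuous heat, roughly the output of a few thousand domestic ovens, all of which must be removed from the building.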

Water is one of the most efficient heat-transfer media available. It can absorb far more heat per unit volume than air, which is why many data centres rely on water-based cooling systems, particularly evaporative cooling towers, which work by allowing water to evaporate, carrying heat away in the process. The water that evaporates is consumed: it does not return to the local water system.

Two types of water use

AI's water footprint has two distinct components, and conflating them leads to misleading conclusions.

Direct water (on-site)

This is the water consumed at the data centre itself, primarily through evaporative cooling towers. Warm water from the facility's cooling loop passes through these towers, where a portion of it evaporates, carrying heat away. The water is drawn from local municipal supplies, rivers, or wells, and the evaporated fraction is permanently consumed: it does not return to the source.

This is measured using site Water Usage Effectiveness (WUE), expressed in litres per kilowatt-hour. Google's fleet-wide site WUE was 0.74 L/kWh in 2023, meaning 0.74 litres of water were consumed on-site for every kilowatt-hour of energy used. Some providers, including those using immersion cooling or operating in cool climates, report near-zero direct water use.
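The site WUE figure makes direct water use straightforward to estimate. A minimal sketch, using the 0.74 L/kWh figure cited above and a hypothetical 1 MWh workload:

```python
# Direct (on-site) water from energy use and site WUE.
# 0.74 L/kWh is Google's 2023 fleet-wide figure from the text;
# the 1 MWh (1,000 kWh) workload is an illustrative assumption.
SITE_WUE_L_PER_KWH = 0.74
energy_kwh = 1_000  # hypothetical workload: 1 MWh

direct_water_litres = energy_kwh * SITE_WUE_L_PER_KWH
print(f"{direct_water_litres:.0f} L consumed on-site")  # 740 L
```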

Indirect water (off-site)

The electricity powering the data centre must be generated somewhere, and many power plants — coal, natural gas, and nuclear — use water for cooling in their own steam cycles. This "upstream" water consumption is often larger than the direct use at the data centre itself.

This is measured using source WUE, which includes both on-site and off-site water. In coal-heavy grids, source WUE can exceed 4 L/kWh. In regions powered by wind, solar, or hydroelectric generation, off-site water consumption approaches zero — though hydroelectric dams do lose water to evaporation from their reservoirs.

Regional variation matters enormously

The water impact of an AI query depends heavily on where the data centre is located, for two reasons: the local climate affects how much direct cooling water is needed, and the regional electricity grid determines how much indirect water is consumed for power generation.

A data centre in Phoenix, Arizona, where summer temperatures regularly exceed 40 °C, requires far more evaporative cooling than one in Stockholm, Sweden, where ambient air can cool the servers for most of the year. Nordic data centres often run with near-zero direct water consumption.

At the same time, a data centre powered by coal (as in parts of China, India, or the US Midwest) carries a large indirect water burden from the power plant. One powered by wind and solar has essentially no indirect water footprint.

This is why our calculator lets you select a region: the same model can have a 10x difference in water impact depending on where it runs.
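The regional spread can be sketched with source WUE figures, since source WUE already includes on-site use and so represents the total. The per-region values and the per-query energy below are illustrative assumptions, not measured data; they are chosen only to show how a roughly 10x gap arises:

```python
# Hypothetical per-region source WUE values (illustrative, not measured),
# showing how total water per query can vary ~10x between regions.
# Source WUE includes both on-site and off-site water.
regions = {
    "Nordic, renewable grid": 0.4,   # near-zero direct, low indirect
    "Temperate, mixed grid":  1.8,
    "Hot climate, coal grid": 4.5,   # heavy evaporative cooling + thermal plants
}

energy_kwh = 0.001  # assumed ~1 Wh per query (illustrative)
for region, source_wue in regions.items():
    ml_per_query = energy_kwh * source_wue * 1000  # litres -> millilitres
    print(f"{region}: {ml_per_query:.2f} mL per query")
```

Under these assumed numbers the same query consumes over ten times as much water in the worst region as in the best one.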

The aggregate scale

A single ChatGPT query consumes roughly 2 to 5 millilitres of water — barely a sip. But scale matters. OpenAI serves hundreds of millions of queries per day across all its products. Google processes billions of searches, increasingly augmented by AI overviews.

In 2023, US data centres consumed an estimated 64 billion litres of water, according to research from the University of California, Riverside. Microsoft reported a 34% increase in water consumption between 2021 and 2022, which the company attributed largely to AI workloads. Google's water consumption rose 20% over the same period.
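Scaling the per-query figure up shows why the aggregate numbers get large. The 2 to 5 mL range is from the text; the daily query volume is a rough illustrative assumption standing in for "hundreds of millions per day":

```python
# Scaling a per-query water estimate to a daily total.
# The 2-5 mL range is from the text above; the query volume
# is an illustrative assumption, not a reported figure.
QUERIES_PER_DAY = 300_000_000  # hypothetical volume

low_litres = 2 * QUERIES_PER_DAY / 1000   # mL -> litres
high_litres = 5 * QUERIES_PER_DAY / 1000
print(f"{low_litres / 1e6:.1f} to {high_litres / 1e6:.1f} million litres per day")
# 0.6 to 1.5 million litres per day
```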

These numbers are growing. The International Energy Agency projects that global data centre electricity consumption will more than double by 2030, and water consumption will follow a similar trajectory, barring significant changes in cooling technology.

What providers are doing about it

Water-positive commitments

Google has pledged to become "water positive" by 2030, meaning it will replenish more water than it consumes. Microsoft has made a similar commitment. These programmes typically involve investing in watershed restoration, water recycling infrastructure, and access to clean water in water-stressed regions. While these commitments are meaningful, they do not eliminate local impacts: the water consumed in Phoenix still comes from the Arizona water supply.

Immersion and liquid cooling

Immersion cooling submerges servers in a non-conductive liquid that absorbs heat directly, eliminating the need for water-based evaporative cooling entirely. The liquid circulates in a closed loop, losing no water to evaporation. Microsoft, NVIDIA, and several data centre operators are investing heavily in this technology. While adoption is still limited, it could dramatically reduce direct water consumption as it scales.

Climate-aware siting

Newer data centres are increasingly built in cool climates where "free cooling" — using ambient air instead of water — is possible for most of the year. Finland, Sweden, Norway, and Iceland have become popular locations. Google's Hamina data centre in Finland uses seawater from the Gulf of Finland for cooling, avoiding freshwater consumption entirely.