Here we explain how to calculate yours

Artificial intelligence (AI) systems consume up to 500 milliliters of water, about one individual bottle, for each brief conversation a user has with the GPT-3 version of OpenAI's ChatGPT. They use roughly the same amount of water to write a 100-word email.

This figure includes the water used to cool the data center's servers and the water consumed at the power plants that generate the electricity needed to run them.

But the study behind these estimates also noted that the water consumption of AI systems can vary considerably, depending on where and when the computer answering the query is running.

For me, as an academic librarian and education professor, understanding AI is not just about knowing how to write prompts. It also means understanding the infrastructure, the trade-offs and the civic decisions that surround it.

Many people assume that AI is inherently harmful, especially given headlines decrying its enormous energy and water footprint. Those effects are real, but they are only part of the story.

When people stop seeing AI as a simple drain on resources and understand its real impact, where its effects come from, how they vary and what can be done to reduce them, they are much better prepared to make decisions that balance innovation with sustainability.

Two hidden streams

Behind every AI query are two streams of water consumption.

The first is the on-site cooling of servers, which generate enormous amounts of heat. This usually relies on evaporative cooling towers: giant misters that spray water over hot pipes or open basins. Evaporation carries the heat away, but that water is drawn from the local supply, such as a river, a reservoir or an aquifer. Other cooling systems can use less water but more electricity.

The second stream is used by the power plants that generate the electricity feeding the data center. Coal, gas and nuclear plants use large volumes of water for their steam and cooling cycles.

Hydroelectric power also consumes significant amounts of water through evaporation from reservoirs. Concentrated solar power plants, which operate much like traditional steam power plants, can consume a lot of water if they rely on wet cooling.

Wind turbines and solar panels, by contrast, consume virtually no water once built, apart from occasional cleaning.


Climate and timing matter for AI water consumption

Water consumption varies dramatically with location. A data center in cool, wet Ireland can often rely on outside air or chillers and operate for months with minimal water use. A data center in Arizona in July, by contrast, may rely heavily on evaporative cooling. The hot, dry air makes that method very effective, but it also consumes large volumes of water, since evaporation is what carries the heat away.

Timing matters too. A study by the University of Massachusetts Amherst found that a data center could use only half as much water in winter as in summer. At midday during a heat wave, cooling systems work overtime; at night, demand is lower.

Newer approaches offer promising alternatives. Immersion cooling, for example, submerges servers in fluids that do not conduct electricity, such as synthetic oils, which nearly eliminates evaporative water use.

A new Microsoft design claims to use no water for cooling: it circulates a special liquid through sealed pipes directly over the computer chips. The liquid absorbs heat and releases it through a closed-loop system without evaporation. Data centers would still use drinking water for restrooms and other staff facilities, but the cooling itself would no longer draw on the local water supply.

However, these solutions are not yet widespread, mainly because of cost, maintenance complexity and the difficulty of retrofitting existing data centers. Most operators still rely on evaporative systems.

A simple skill you can use

The type of AI model you query also matters, because models differ in complexity, hardware and the processing power they require. Some models can consume far more resources than others. One study found that certain models can consume more than 70 times as much energy and water as ultra-efficient ones.

You can estimate AI's water footprint yourself in just three steps, with no advanced math required.

Step 1: Look for reliable research or official publications. Independent analyses estimate that a GPT-5 response of average length, between 150 and 200 words of output, or roughly 200 to 300 tokens, consumes about 19.3 watt-hours. A GPT-4o response of similar length consumes about 1.75 watt-hours.

Step 2: Use a practical estimate of the amount of water per unit of electricity, combining cooling and power-generation use.

Independent researchers and industry reports suggest that a reasonable range today is roughly 1.3 to 2.0 milliliters per watt-hour. The lower end reflects efficient facilities with modern cooling and cleaner grids. The upper end represents more typical facilities.

Step 3: Now put the pieces together. Take the energy value from Step 1 and multiply it by the water factor from Step 2. This gives the water footprint of a single AI response.

This is the one-line formula you will need:

Energy per request (watt-hours) × water factor (milliliters per watt-hour) = water per request (in milliliters).

For a medium-length query to GPT-5, the calculation uses 19.3 watt-hours and 2 milliliters per watt-hour: 19.3 × 2 ≈ 39 milliliters of water per response.

For a medium-length query to GPT-4o, the calculation is 1.75 watt-hours × 2 milliliters per watt-hour = 3.5 milliliters of water per response.

If you assume the data centers are more efficient and use 1.3 milliliters per watt-hour, the figures drop: about 25 milliliters for GPT-5 and 2.3 milliliters for GPT-4o.
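If you would rather see the arithmetic spelled out, here is a minimal Python sketch of the same three-step estimate. It simply plugs the energy figures and water factors cited above into the one-line formula; the function and variable names are illustrative, not part of any official tool.

```python
# Minimal sketch of the three-step estimate, using the figures cited above.
# The energy values (Step 1) and water factors (Step 2) are the article's
# cited estimates, not measured values.

def water_per_response(energy_wh: float, water_ml_per_wh: float) -> float:
    """Energy per request (Wh) x water factor (ml/Wh) = water per request (ml)."""
    return energy_wh * water_ml_per_wh

ENERGY_WH = {"GPT-5": 19.3, "GPT-4o": 1.75}          # Step 1: energy per response
WATER_FACTORS = {"typical": 2.0, "efficient": 1.3}   # Step 2: ml of water per Wh

for model, energy in ENERGY_WH.items():
    for facility, factor in WATER_FACTORS.items():
        ml = water_per_response(energy, factor)      # Step 3: multiply
        print(f"{model}, {facility} facility: {ml:.1f} ml per response")

# GPT-5, typical facility: 38.6 ml per response   (about 39 ml)
# GPT-5, efficient facility: 25.1 ml per response (about 25 ml)
# GPT-4o, typical facility: 3.5 ml per response
# GPT-4o, efficient facility: 2.3 ml per response
```

Updating the estimate as new figures are published is just a matter of swapping in different energy values or water factors.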

A recent Google technical report indicated that a median text request to its Gemini system uses only 0.24 watt-hours of electricity and about 0.26 milliliters of water, roughly the volume of five drops. However, the report does not specify the length of that request, so it cannot be compared directly with GPT water consumption.

These varying estimates, from 0.26 milliliters to 39 milliliters, show how much efficiency, the choice of AI model and the electricity-generation infrastructure matter.

Comparisons can provide context

To really grasp how much water these queries consume, it helps to compare them with other common water uses.

Multiplied across billions of requests, the water consumption of AI queries adds up. OpenAI reports around 2.5 billion requests per day. That figure includes queries to its GPT-4o, GPT-4 Turbo, GPT-3.5 and GPT-5 systems, with no public breakdown of how many queries go to each model.

Using independent estimates and Google's official figures gives a sense of the possible range:

  • All median-length Google Gemini requests: about 650,000 liters per day.
  • All medium-length GPT-4o requests: about 8.8 million liters per day.
  • All medium-length GPT-5 requests: about 97.5 million liters per day.
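To reproduce those daily totals, here is a short Python sketch under the same simplifying assumption as the list above: roughly 2.5 billion requests per day applied to each per-request figure. This is illustrative only, since real traffic is split across models and Gemini's volume belongs to Google rather than OpenAI.

```python
# Rough sketch of how the daily totals above scale up. Applying ~2.5 billion
# requests per day to every per-request estimate is a simplification used
# here purely for illustration.

REQUESTS_PER_DAY = 2_500_000_000  # OpenAI's reported daily request volume

WATER_ML_PER_REQUEST = {          # per-request figures from the steps above
    "Gemini (median request)": 0.26,
    "GPT-4o (medium length)": 3.5,
    "GPT-5 (medium length)": 39.0,
}

for model, ml in WATER_ML_PER_REQUEST.items():
    liters_per_day = ml * REQUESTS_PER_DAY / 1000   # milliliters -> liters
    print(f"{model}: about {liters_per_day:,.0f} liters per day")

# Gemini: about 650,000 liters per day
# GPT-4o: about 8,750,000 liters per day  (~8.8 million)
# GPT-5: about 97,500,000 liters per day  (~97.5 million)
```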

By way of comparison, Americans use around 34 billion liters per day to water residential lawns and gardens. A liter is roughly a quarter of a gallon.

Generative AI uses water, but, at least for now, its daily totals are low compared with other common uses such as lawns, showers and laundry.

However, its water demand is not fixed. Google's disclosure shows what can be achieved when systems are optimized with specialized chips, efficient cooling and smart workload management. Recycling water and siting data centers in cooler, more humid regions can also help.

Transparency also matters: when companies publish their data, the public, policymakers and researchers can see what is achievable and compare providers fairly.

*Leo S. Lo is dean of libraries, advisor to the provost for AI literacy, and professor of education at the University of Virginia.

This article was originally published in The Conversation.
