The GenerIA Blog

AI and environment (2/3): water, a critical issue!


Artificial intelligence - at what cost to our water resources? Just like its carbon footprint, conventional AI's consumption of cooling water is becoming a real ecological threat.

Did you know? An ordinary interaction with ChatGPT, i.e. between 10 and 50 conversational turns, consumes around half a liter of fresh water (seawater is too corrosive for data center equipment). This may seem insignificant, but multiplied by millions of daily users, these 50 small centiliters turn out to be quite problematic.
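To make the order of magnitude concrete, here is a minimal back-of-envelope sketch in Python. The per-conversation figure comes from the paragraph above; the number of daily conversations is a purely hypothetical assumption.

```python
# Back-of-envelope estimate of aggregate water use, based on the figure
# quoted above (~0.5 L per conversation of 10-50 turns). The number of
# daily conversations is an illustrative assumption, not a reported one.

LITERS_PER_CONVERSATION = 0.5      # fresh water evaporated per session
DAILY_CONVERSATIONS = 10_000_000   # hypothetical

daily_liters = LITERS_PER_CONVERSATION * DAILY_CONVERSATIONS
yearly_m3 = daily_liters * 365 / 1_000   # 1 m³ = 1,000 L

print(f"Daily:  {daily_liters:,.0f} L")    # -> 5,000,000 L
print(f"Yearly: {yearly_m3:,.0f} m³")      # -> ~1,825,000 m³
```

Under these assumptions, the "insignificant" half liter already adds up to nearly two million cubic meters a year.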

Particularly if we consider them as a trend. Google officially reports a 20% annual increase in its water consumption since 2021, reaching around 6.1 billion liters in 2023. To put that in perspective, that is 150% of the annual consumption of the entire Canadian population. Thanks, Gemini! Microsoft faces a 34% increase over the same period, at around 6.4 billion liters, bearing in mind, for example, that a single GPT-3 training run can evaporate 700,000 liters of generally potable water.

Depending on calculation methodologies, global AI use could thus be responsible for the withdrawal of between 4.2 and 6.6 billion cubic meters of water by 2027. That is between 4 and 6 times the total annual withdrawal of Denmark, or half that of the UK (Northern Ireland included). In the USA, again by 2027, AI-related requirements could represent between 0.5 and 0.7% of annual water withdrawal. Just for AI, excluding every other "IT" field. That's a lot.

How AI consumes water

The data centers that host AI systems generate an enormous amount of heat. They require permanent cooling to maintain an optimum operating temperature of between 10°C and 27°C. Typically, this process involves pumping water from a natural or artificial source (a lake, for example), conveying it through the building's ducts and into the servers' immediate surroundings. Once hot, the water is mechanically redirected to a cooling tower, where evaporation dissipates the heat into the air. This targeted air conditioning is itself a source of diffuse heat. To make matters worse, these systems must be purged regularly to rid them of accumulated salts and other deposits - a process which also requires clean water.
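The scale of these evaporative losses can be sketched from first principles: every kilogram of water that evaporates carries away its latent heat of vaporization. The figures below (a latent heat of roughly 2.4 MJ/kg near cooling-tower temperatures, a 10 MW heat load) are illustrative assumptions, and the all-evaporative model is a deliberate upper-bound simplification.

```python
# Rough physics of evaporative cooling. Every kilogram of water that
# evaporates removes about 2.4 MJ of heat (approximate latent heat of
# vaporization near cooling-tower temperatures).

LATENT_HEAT_J_PER_KG = 2.4e6

def evaporation_l_per_hour(heat_load_mw: float) -> float:
    """Water evaporated per hour to reject `heat_load_mw` megawatts of heat,
    assuming all heat leaves by evaporation (an upper-bound simplification)."""
    kg_per_second = heat_load_mw * 1e6 / LATENT_HEAT_J_PER_KG
    return kg_per_second * 3600   # 1 kg of water is about 1 L

# A hypothetical 10 MW AI cluster:
print(f"{evaporation_l_per_hour(10):,.0f} L/h")   # -> 15,000 L/h
```

In other words, a modest cluster can plausibly evaporate a swimming pool's worth of water every few hours.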

For most industrial uses of water, there are two types of impact on the resource:

  1. Withdrawal, i.e. the volume of water taken from underground or surface sources. Withdrawal implies dependence. It is therefore a cause of competition between industrial players, economic sectors or neighboring countries. Geopolitical experts predict the first real "water wars" in the 2030s.
  2. Consumption, i.e. the difference in volume between water withdrawn and water discharged (see the sketch after this list). Consumption has a major impact on downstream availability, which also causes major inter-regional tensions. It is also crucial for assessing the severity of shortages at watershed level.
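A minimal numeric illustration of the difference between the two metrics, with entirely hypothetical volumes:

```python
# Withdrawal vs. consumption, as defined above. All volumes are
# hypothetical, in cubic meters per day.

withdrawal_m3 = 10_000   # pumped from the source (a lake, for example)
discharge_m3 = 2_000     # returned after cooling and blowdown

consumption_m3 = withdrawal_m3 - discharge_m3   # lost to evaporation
consumption_ratio = consumption_m3 / withdrawal_m3

print(f"Consumption: {consumption_m3:,} m³/day ({consumption_ratio:.0%})")
# -> Consumption: 8,000 m³/day (80%)
```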

Data centers contribute significantly to both of these impacts. Unlike residential use, where water is discharged into the sewer and then treated before reuse, data center cooling towers evaporate water into the atmosphere, where it can remain for up to twelve months before falling back as precipitation - and only partly over the region it came from.

A growing number of initiatives try to recover ICT-dissipated heat, for example to heat buildings or swimming pools. But however commendable these efforts may be, they are far from offsetting the expense, let alone ensuring a minimum of sustainability, especially as their benefits do not scale with consumption.

In 2011, the concept of WUE (Water Usage Effectiveness) was introduced to better understand the problem. This metric assesses a data center's water consumption per kilowatt-hour (kWh) of IT energy. It is to water what the better-known PUE (Power Usage Effectiveness) is to energy. Meta reports an average WUE of 0.26 L/kWh across all its computing infrastructures. Microsoft's official communication reports a WUE of 0.49 globally, and 0.1 in the EMEA zone. These are quite remarkable figures, which reflect real efforts. But there is still a long way to go.
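For readers who want the metric itself, here it is in code form: annual site water usage divided by the energy delivered to the IT equipment. The facility figures are made up for the example.

```python
# WUE = annual site water usage (L) / annual IT equipment energy (kWh).

def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    """Water Usage Effectiveness, in liters per kilowatt-hour."""
    return annual_water_liters / annual_it_energy_kwh

# A hypothetical facility: 50 million liters per year, 100 GWh of IT load.
print(f"WUE = {wue(50e6, 100e6):.2f} L/kWh")   # -> WUE = 0.50 L/kWh
```

That hypothetical 0.50 L/kWh would sit right at Microsoft's reported global average, and well above Meta's 0.26.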

For this massive consumption of water by AI is part of an already alarming context of global attrition. According to the United Nations, nearly 700 million people are currently affected by water shortages, some of them very severe. By 2025, 1.8 billion people will be living in regions affected by extreme water scarcity. By 2030, almost half the world's population will be suffering from permanent water stress, including near-total shortages for at least thirty to sixty consecutive days each year.

Major challenges and immediate common sense

These forecasts bring the AI industry face to face with its responsibilities, starting with two major challenges that will likely prove difficult to overcome.

The first concerns the balance between green energy and water consumption. Sunny regions allow more solar energy, but require more water for cooling. Building in cooler or even polar regions could reduce water consumption, but this is not always possible, especially as logistical problems are compounded by the consequences of remoteness and isolation (network flows, etc.).

The second challenge concerns the publication of transparent data, so that remedial strategies can be put in place. For example, figures on periods of intensive cooling-water use are lacking, which prevents the introduction of off-peak hours for model producers or users whose inference results do not need to be immediate.
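As a thought experiment, such off-peak scheduling could look like the sketch below. The off-peak window and the submission function are assumptions for illustration; no such standard API exists today.

```python
from datetime import datetime, timedelta

# Hours during which cooling demand - and thus evaporative water loss -
# is assumed to be lower. Site-specific, hypothetical values (22:00-06:00).
OFF_PEAK_HOURS = set(range(22, 24)) | set(range(0, 6))

def next_off_peak(now: datetime) -> datetime:
    """Return `now` if already off-peak, otherwise the next 22:00."""
    if now.hour in OFF_PEAK_HOURS:
        return now
    start = now.replace(hour=22, minute=0, second=0, microsecond=0)
    return start if start > now else start + timedelta(days=1)

def submit(job: str, urgent: bool, now: datetime | None = None) -> None:
    """Run urgent jobs immediately; defer the rest to the off-peak window."""
    now = now or datetime.now()
    run_at = now if urgent else next_off_peak(now)
    print(f"{job!r} scheduled for {run_at:%Y-%m-%d %H:%M}")

submit("nightly-batch-summarization", urgent=False)
```

The point is not the code but the precondition: without published data on when cooling demand actually peaks, nobody can define the off-peak window in the first place.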

The literature and various industry-sponsored studies point to several ways of tackling these challenges:

  • Improve liquid cooling efficiency
  • Implement water recycling and conservation measures
  • Explore alternative cooling methods that consume less water
  • Integrate water footprints (like carbon footprints) into AI model scorecards (see the sketch below)
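On the last point, a scorecard entry extended with water figures might look like this sketch. The class and its fields are hypothetical; only the training-water value echoes the 700,000 L GPT-3 figure quoted earlier.

```python
from dataclasses import dataclass

# Hypothetical model scorecard extended with water-footprint fields,
# alongside the carbon figures already common in model cards.

@dataclass
class ModelScorecard:
    name: str
    params_billion: float
    training_co2_tonnes: float       # already common in model cards
    training_water_m3: float         # proposed: water evaporated by training
    inference_ml_per_query: float    # proposed: per-request estimate

card = ModelScorecard(
    name="example-llm",              # illustrative values throughout
    params_billion=7.0,
    training_co2_tonnes=25.0,
    training_water_m3=700.0,         # 700 m³ = the 700,000 L quoted above
    inference_ml_per_query=15.0,
)
print(card)
```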

While these recommendations are all laudable, we at GenerIA believe that the first discipline is to remain frugal from end to end, from model building to inference consumption. The concept of overall frugality covers production methods, algorithmic approaches, the careful selection of the data that serves as a model's foundation, and the control of usage (legitimate business activity vs. personal convenience). We recommend that each of these dimensions be adjusted on a per-use-case basis. Before any metrology, avoiding waste is simply a matter of common sense.

Conclusion

We can no longer ignore the fact that AI's water footprint is reaching levels that will ultimately harm humanity. To continue to benefit from its power, we must not only innovate to minimize its environmental impact, but also reconsider how it is created and used, giving its water consumption the same critical importance as its carbon emissions.

References

The Green Grid

Google's 2024 Environmental Report

Microsoft's 2024 Environmental Sustainability Report

Meta's 2023 Sustainability Report

Center for Data Innovation: Rethinking Concerns About Energy Use

UNESCO's Recommendation on the Ethics of Artificial Intelligence

United Nations Report: Drought in Numbers 2022

Making AI Less "Thirsty"

In the GenerIA blog:


Regulating Frugal AI: Between Progress and Challenges...

Frugality is a radical shift in the way businesses and governments think about AI. But how do we regulate a technology that promises both performance and a sustainable environmental footprint? Let's take a look at how three major regions - Canada, Europe and the United States - are approaching the problem...


AFNOR SPEC 2314: Best Practices in Frugal AI

From project design to end-user acculturation, frugal AI is above all a matter of best practices. Numerous and complementary, these BPs are detailed in AFNOR SPEC 2314. Here is a thematic summary.


Frugal AI: A Gentle Introduction to the AFNOR SPEC 2314 Framework

Fostering innovation without hastening the attrition of natural resources. This is the rationale behind frugal artificial intelligence, whose definition, contours and practices AFNOR intends to normalize.


Telemetry, an essential component of the best AIs

Extensive telemetry brings a great deal to enterprise artificial intelligence. Performance, behavior, response biases, prompt injections... Everything that can be observed contributes to continuous optimization, thereby guaranteeing the full success of AI projects.


AI and environment (3/3): the systemic risks

Overloaded power grids, the return of fossil fuels, non-recycled electronic waste, skyrocketing social costs... Conventional AI's systemic and societal indicators are all red.


AI and environment (1/3): alarming numbers

Insatiable for energy and a major producer of CO2, conventional artificial intelligence looks more and more like an environmental dead end. Is there any hope of sustainability? Everywhere, the numbers suggest otherwise...