The rise of digital usage—cloud computing, artificial intelligence, streaming, edge computing—has been accompanied by a rapid expansion of the infrastructures that make these technologies possible: data centers. Today, there are estimated to be over 11,000 worldwide, with an annual growth rate of nearly 15%. Behind this dynamic lies an environmental issue that has received little media attention: their massive water consumption.
Every online request, every video streamed, and every algorithmic computation mobilizes servers that generate significant heat. To prevent failures, this equipment must be cooled continuously, and that cooling frequently requires substantial amounts of water.
According to a study published in the journal Nature, a small data center with a capacity of 1 megawatt consumes an average of 25.5 million liters of water annually for its cooling needs, roughly the daily water use of about 300,000 people. Globally, the International Energy Agency (IEA) estimates that data centers consumed nearly 560 billion liters of water in 2023, a volume that could exceed 1,200 billion liters per year by 2030.
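As a quick plausibility check, that figure can be related to per-person use. The minimal sketch below is my own back-of-the-envelope calculation, and the 85-liter daily allowance per person is an assumption rather than a figure from the study:

```python
# Back-of-the-envelope check (the per-person figure is an assumption, not from
# the cited study): how many people's daily domestic water use corresponds to
# the 25.5 million liters a 1 MW data center is said to use for cooling each year?

ANNUAL_COOLING_LITERS = 25_500_000   # cited annual cooling consumption, 1 MW site
DAILY_USE_PER_PERSON_L = 85          # assumed basic domestic use, liters/person/day

people_equivalent = ANNUAL_COOLING_LITERS / DAILY_USE_PER_PERSON_L
print(f"Equivalent to the daily use of {people_equivalent:,.0f} people")  # ~300,000
```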
It is important to distinguish between consumed water and withdrawn (abstracted) water. The former refers to water that evaporates or is otherwise not returned to the natural environment, while the latter covers all volumes taken from the environment for construction, operation, and maintenance, including water that is eventually returned. According to the IEA, total withdrawals related to data centers were estimated at nearly 5,000 billion liters in 2023 and could exceed 9,000 billion liters by 2030.
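To make the distinction concrete, the short sketch below divides the consumption figures quoted above by the corresponding withdrawal figures; the ratios are my own calculation from those numbers, not an IEA statistic:

```python
# Share of withdrawn water that is actually consumed (i.e., not returned to the
# environment), computed from the figures quoted above. Illustrative only.

consumed_2023, withdrawn_2023 = 560, 5_000     # billion liters, 2023 estimates
consumed_2030, withdrawn_2030 = 1_200, 9_000   # billion liters, 2030 projections

print(f"2023: {consumed_2023 / withdrawn_2023:.0%} of withdrawals consumed")  # ~11%
print(f"2030: {consumed_2030 / withdrawn_2030:.0%} of withdrawals consumed")  # ~13%
```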
This escalation comes amid growing pressure on water resources. According to Nature Finance, nearly 45% of data centers are located in river basins exposed to a high risk of water scarcity. Some major tech companies concentrate a significant portion of their infrastructure in regions already under water stress, exacerbating the risks of groundwater depletion, competition with agricultural and domestic uses, and the weakening of local ecosystems.
The rise of artificial intelligence further intensifies this pressure. Large-scale AI models require substantial computational power, increasing energy demand and, consequently, the volume of water needed for cooling. Some projections suggest that by 2027, water consumption related to AI could reach levels comparable to the total annual water use of several European countries.
The primary reason for this water intensity lies in cooling technologies. Traditional systems rely on air conditioning, free cooling using outside air when temperatures allow, or liquid cooling solutions. The latter, whether indirect, direct, or immersion based, often rely on water or other fluids, much of which is ultimately lost to evaporation as heat is rejected. According to industry estimates, nearly 80% of the water used in cooling towers evaporates, necessitating constant replenishment.
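The order of magnitude of these losses follows from the physics of evaporation: turning one kilogram of water into vapor absorbs roughly 2,260 kJ of heat. The sketch below estimates the make-up water an evaporative system would need for a given IT load, under the simplifying assumption (mine, not the article's) that all server heat is rejected through evaporation:

```python
# Rough, illustrative estimate of evaporative water loss for a given IT heat
# load. Assumes the entire load is rejected through evaporation, which
# overstates losses at real sites that combine several cooling modes.

LATENT_HEAT_KJ_PER_KG = 2_260   # heat absorbed by evaporating 1 kg (~1 liter) of water
HOURS_PER_YEAR = 8_760

def evaporative_loss_liters_per_year(it_load_kw: float) -> float:
    """Liters of water evaporated per year if the full heat load drives evaporation."""
    heat_kj_per_year = it_load_kw * 3_600 * HOURS_PER_YEAR  # kW -> kJ over a year
    return heat_kj_per_year / LATENT_HEAT_KJ_PER_KG         # kg of water ~ liters

# A 1 MW (1,000 kW) load evaporates on the order of 14 million liters per year
# under these assumptions, the same order of magnitude as the 25.5-million-liter
# figure cited earlier (which also reflects indirect and upstream water use).
print(f"{evaporative_loss_liters_per_year(1_000):,.0f} liters/year")
```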
In the face of these challenges, optimizing consumption has become both a strategic and an environmental imperative. Operators have several levers at their disposal: closed-loop systems that recycle non-evaporated water, alternative water sources (greywater, rainwater, condensates), separation of hot and cold air flows to improve thermal efficiency, overnight storage of chilled water, and wider use of free cooling in suitable climates.
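To give a sense of how much the last of these levers can matter, the sketch below reuses the per-hour evaporation figure implied by the previous estimate to gauge the water avoided when free cooling covers part of the year; the 40% share and the hourly figure are assumptions, not operator data:

```python
# Minimal sketch of water savings from free cooling: during hours when outside
# air does the job, no evaporation is needed. All figures are illustrative.

EVAP_LITERS_PER_HOUR_PER_MW = 1_600   # rough hourly loss implied by the estimate above
HOURS_PER_YEAR = 8_760

def water_saved_liters(free_cooling_share: float, it_load_mw: float = 1.0) -> float:
    """Water not evaporated during the hours covered by free cooling."""
    free_hours = HOURS_PER_YEAR * free_cooling_share
    return EVAP_LITERS_PER_HOUR_PER_MW * it_load_mw * free_hours

# With free cooling available 40% of the year, a 1 MW site avoids roughly
# 5.6 million liters of evaporation annually under these assumptions.
print(f"{water_saved_liters(0.40):,.0f} liters/year saved")
```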
Regulations are also evolving. In Europe, the Energy Efficiency Directive requires data centers above a certain capacity to report their energy and water consumption annually, increasing transparency and encouraging more responsible practices.
Beyond environmental considerations, the water issue also impacts business continuity. In regions where water is becoming scarce and average temperatures are rising, ensuring effective cooling becomes a critical operational challenge.
As the global economy digitizes, the sustainability of the infrastructures that support it emerges as a central challenge. The future of data centers will hinge not only on their technological performance but also on their ability to balance computational power with water efficiency.


