
The Whole Truth About AI and Water Consumption

In 2025, data centers in Texas used about 90 billion liters of water, counting both cooling and the water tied to electricity generation. By 2030, estimates reach up to 600 billion liters per year.

In just a few years, Artificial Intelligence has started competing for water with agriculture, cities, and industry. Computing systems generate heat. Heat must be removed. Some data centers can use up to 1.5 million liters of water per day just for cooling.

You might think: water evaporates and comes back as rain. So what is the problem?

Water does return, but not where it is needed and not when it is needed. A data center draws fresh water from a local basin. Part of it evaporates and leaves that watershed. It may fall elsewhere, even over the ocean, or months later. Meanwhile, the local community has less water available, often during the hottest weeks, when demand is already high.

There is another issue. Demand grows fast. Water recharge is slow. Aquifers are not infinite. If withdrawals exceed natural recharge, water tables drop, wells must go deeper, costs rise, and drought risk increases. A global water cycle does not fix a local shortage created in a few years.

There is also indirect water use. Generating electricity often consumes water. That indirect share can represent up to 75% of the total footprint, which would make the full footprint roughly four times what a facility reports for on-site cooling alone. The data center reports one number. The energy-producing region carries another, often larger, burden.

AI infrastructure expands where land and energy are cheaper and permits move faster. These areas are often already exposed to heat and water stress. Multiply that pattern across regions and the pressure on local water systems increases.

Water does not disappear. Availability, in the right place at the right time, becomes scarcer. What do you think?

#ArtificialDecisions #MCC
