A new study from the University of California, Riverside and the California Institute of Technology (Caltech) reveals that the rapid expansion of artificial intelligence is placing unprecedented demand on municipal water supplies. As cloud computing grows, servers require massive amounts of cooling water during periods of extreme heat, creating sudden spikes in demand that local systems cannot currently support. Upgrading water infrastructure to meet these extreme short-term needs is becoming a challenge for communities nationwide.
The study shows that if current trends continue, US facilities could require up to 1.45 billion gallons of new peak water capacity per day by 2030. That is equivalent to roughly the entire daily supply of New York City. Building the necessary treatment plants, storage tanks, and pipelines to handle this surge could cost anywhere from $10 billion to $58 billion. In February 2026 alone, major technology firms committed nearly $1 billion to secure multi-million-gallon daily water supplies across Virginia, Louisiana, and Indiana.
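The figures above can be sanity-checked with simple arithmetic. A minimal sketch, assuming New York City's daily water delivery is roughly 1.0 billion gallons (a widely reported approximate figure, not from the study itself); the 1.45 billion gallon projection and the $10-$58 billion cost range are taken from the text:

```python
# Back-of-envelope check of the study's scale comparison.
# Assumption: NYC's daily supply is ~1.0 billion gallons/day (approximate
# public figure); the other numbers come from the study as reported above.

PROJECTED_PEAK_GPD = 1.45e9    # new peak capacity needed by 2030, gallons/day
NYC_DAILY_SUPPLY_GPD = 1.0e9   # approximate NYC daily delivery, gallons/day

ratio = PROJECTED_PEAK_GPD / NYC_DAILY_SUPPLY_GPD
print(f"Projected new peak demand is about {ratio:.2f}x NYC's daily supply")

# Implied infrastructure cost per gallon/day of new peak capacity,
# spread across the cited $10B-$58B range:
low_cost, high_cost = 10e9, 58e9
print(f"Implied cost: ${low_cost / PROJECTED_PEAK_GPD:.2f} to "
      f"${high_cost / PROJECTED_PEAK_GPD:.2f} per gallon/day of capacity")
```

At roughly $7 to $40 per gallon-per-day of capacity, the range reflects how widely treatment, storage, and pipeline costs can vary by region.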
While power consumption is often cited as the main hurdle for tech expansion, the research emphasizes that water availability is an even more binding constraint. Even with unlimited funding for new infrastructure, natural water sources like reservoirs and snowpack cannot simply be purchased or manufactured during peak demand periods. To mitigate these risks, the study recommends that developers transparently report their peak water needs and either collaborate with local utilities to fund vital upgrades or shift to alternative dry-cooling methods.