Not Enough Water – AI Growth Sparks Water Crisis Fears

25-02-2025 | By Robin Mitchell

While early computing infrastructures relied on relatively simple setups and modest power requirements, the ever-growing demand for data processing and artificial intelligence has brought resource management to the forefront. Recently, concerns about water scarcity and soaring energy consumption have driven researchers and industry leaders to develop new, more efficient cooling systems—particularly fluid-based innovations aimed at minimising water usage and environmental impact.

Key Things to Know:

  • AI growth is increasing data centre water consumption: The UK’s ambition to become a global leader in AI could significantly raise water demand, with large data centres consuming up to 1 million litres per day.
  • Data centre expansion may strain UK water supplies: Experts warn that AI-driven infrastructure growth could worsen water shortages, especially in already vulnerable regions like the South East.
  • Technological advancements may not be enough: While closed-loop cooling systems offer potential solutions, inconsistent adoption across the industry leaves concerns about long-term sustainability unresolved.
  • Calls for stricter regulations and transparency: Policymakers and water companies lack accurate data on data centre water usage, leading to growing demands for mandatory reporting and efficiency standards.

What are the current sustainability challenges faced by modern computing facilities, how do emerging fluid-based cooling technologies propose to solve them, and what implications will these developments have for the future of data centres in an AI-driven world?

The Pioneering Era of Computing - A Pathway to Resource Challenges

As the world entered the digital age, the computing industry underwent a transformation of unprecedented proportions. The early days of computing saw the establishment of facilities that were not only rudimentary in infrastructure but also laid the groundwork for the large-scale computing operations that would follow. The simplicity of these early frameworks belied their significance, as they marked the beginning of a journey that would lead to the complexities of resource management that the industry faces today.

The first computing facilities were often housed in small buildings or even converted garages, with limited infrastructure and minimal staff. Despite their humble beginnings, these early setups paved the way for the massive computing centres that would eventually emerge. The rapid growth of the industry created a snowball effect, with each new development building upon the last, ultimately leading to the sophisticated computing systems that dominate the landscape today.

However, as the industry expanded, so did the demands placed on the infrastructure that supported it. One of the most significant challenges to emerge was the rapidly increasing power requirement of modern computing systems. The high-performance computing centres of the early 2000s were already straining local grids, which were often unaccustomed to such high-volume power consumption. The situation was further exacerbated by the fact that these centres were frequently located in areas with limited access to alternative energy sources, leaving them reliant on the local grid for their power needs.

In addition to the power demands, the increasing complexity of modern data centres also placed a significant strain on the water resources needed for cooling. The advanced cooling technologies used in these facilities required large amounts of water, which often put a strain on local water supply systems. This was particularly problematic in areas where the water supply chain was fragile or vulnerable to disruptions. The combination of high power demands and water requirements created a perfect storm of challenges for data centres, making it increasingly difficult to meet the growing demands of the computing sector while also ensuring sustainability and environmental responsibility.

Not Enough Water - Concern over UK's plans to become AI world leader

The UK government's plan to make the country a world leader in artificial intelligence, as outlined by Prime Minister Keir Starmer, has been met with concern from industry sources over the potential impact on water supplies. The construction of the data centres required to power these AI systems can lead to large-scale water usage, which may exacerbate existing water shortages in areas already at risk due to climate change.

Data Centres and Rising Water Demands

While AI-driven technological advancements promise economic growth and efficiency, they come with resource-intensive demands that could place additional strain on the UK's already overstretched water infrastructure. Data centres, which require constant cooling, are expected to see a significant expansion in line with AI development. Given that the UK’s water supply is already under pressure, particularly in the South East, experts argue that this growth must be balanced with robust water management strategies to prevent exacerbating regional drought risks.

How Much Water Do AI Data Centres Consume?

According to Dr Venkatesh Uddameri, an expert in the field of water resources, a typical large data centre could consume between 1 and 1.1 million litres of water per day, an amount comparable to the daily usage of a town of 50,000 to 100,000 people. Water usage in data centres varies greatly depending on climate, location, and the cooling technology used, and while recent improvements in technology have reduced consumption per facility, overall demand continues to grow: Microsoft's global data centre water usage rose by 33% while it was developing its first AI tools, and in Iowa a cluster of data centres used 5% of a district's total water supply during the testing of OpenAI's GPT-4.

Can Technology Solve the Water Crisis? 

Despite technological advancements aimed at improving cooling efficiency, the rapid growth of AI computing infrastructure means that overall water consumption is still expected to rise. The UK’s target of increasing AI computing power twentyfold by 2030 could drive unprecedented demand for cooling resources, with some estimates suggesting data centre water usage could surpass that of entire cities. While closed-loop cooling technologies offer potential solutions, their implementation remains inconsistent across the industry, leaving concerns about long-term sustainability unresolved.

In the United Kingdom, the government has designated data centres as "critical national infrastructure", allowing them to operate with fewer restrictions. However, this move has raised questions about the long-term sustainability of these facilities. Thames Water, the UK's largest water and sewerage company, has expressed concern about the potential water usage of data centre operations; in 2020, the company stated that it had no knowledge of the water usage of its data centre customers.

Regulatory Challenges and the Need for Transparency 

The absence of comprehensive data on water usage by data centres further complicates efforts to develop effective regulatory frameworks. Without accurate monitoring and reporting, policymakers and water companies face difficulties in assessing the full scale of the problem. Introducing mandatory reporting requirements and industry-wide standards for water efficiency could play a crucial role in ensuring that AI development does not come at the cost of regional water security.

Although the UK government has not given a clear answer on how it plans to address data centre water usage, it has acknowledged that data centres face sustainability challenges, including energy demands, and noted that AI growth zones are intended for areas with existing energy infrastructure. The government also pointed to recent changes to the water sector that will unlock £104 billion in spending by UK water companies over the next five years.

However, simply increasing water sector investment may not be enough to offset the anticipated rise in data centre demand. Experts argue that a more integrated approach is needed—one that aligns AI expansion with sustainable infrastructure planning. The UK government faces a pressing challenge: balancing its ambition to lead in AI while safeguarding critical natural resources. Without strategic intervention, the unchecked growth of data centres could intensify existing environmental and energy concerns, putting the nation’s net-zero commitments at risk.

Forging a Sustainable Future: Emerging Fluid-Based Cooling Systems and the Future of Computing

As we move forward in the evolution of computing facilities, it is clear that the challenges faced by the industry will only continue to intensify. The increasing demands for power, water resources and other critical infrastructure will require innovative solutions to ensure the sustainability of modern facilities. One area of particular focus is the development of new cooling systems that can effectively manage the heat generated within these facilities while minimising the environmental impact.

AI Data Centre Growth vs. Resource Consumption

The rapid expansion of AI-driven data centres is leading to a significant rise in both energy and water consumption. This table presents projected trends in AI data centre resource usage, highlighting the increasing strain on the UK's infrastructure.

Year          Water Consumption (Million Litres/Day)   Energy Consumption (TWh)
2020          1.1                                       3.6
2025 (Est.)   2.5                                       30
2030 (Est.)   5+                                        72


The data highlights the growing demand for water and energy as AI infrastructure expands. Without improved efficiency and policy interventions, AI growth could strain national resources and impact sustainability goals.
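
To put these projections in perspective, the short sketch below works out the compound annual growth rates implied by the table. The inputs are the table's own estimates (treating "5+" as 5 for simplicity), and the calculation is nothing more than illustrative arithmetic.

    # Implied compound annual growth rates (CAGR) from the projections above.
    # Inputs are the table's own estimates; "5+" is treated as 5 for simplicity.

    def cagr(start: float, end: float, years: int) -> float:
        """Compound annual growth rate between two values."""
        return (end / start) ** (1 / years) - 1

    water_2020, water_2030 = 1.1, 5.0     # million litres per day
    energy_2020, energy_2030 = 3.6, 72.0  # TWh

    print(f"Water:  {cagr(water_2020, water_2030, 10):.1%} per year")   # roughly 16% per year
    print(f"Energy: {cagr(energy_2020, energy_2030, 10):.1%} per year") # roughly 35% per year

On these figures, projected energy demand compounds at roughly twice the rate of water demand, underlining how quickly both resources scale with AI infrastructure.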

Emerging fluid-based solutions have the potential to revolutionise the field by eliminating or greatly reducing water consumption. By utilising closed-loop approaches, these systems recirculate the coolant used to remove heat, removing the need to draw large volumes of water from local sources. This not only helps to alleviate pressure on local resources but also reduces the facility's overall environmental footprint. Novel coolants designed to be more efficient and effective than traditional water-based media can further enhance the performance of these emerging solutions, enabling greater energy savings and reduced carbon emissions.
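
The scale of the potential saving can be illustrated with a rough comparison. The sketch below is a minimal back-of-envelope estimate rather than a model of any real facility: the IT load and the water usage effectiveness (WUE) values for evaporative and closed-loop cooling are assumptions chosen for illustration, with the evaporative figure in the region of often-cited industry averages.

    # Back-of-envelope comparison of daily cooling water draw for an
    # evaporative system versus a closed-loop system. All inputs are
    # illustrative assumptions, not figures from the article.

    IT_LOAD_MW = 20.0          # assumed facility IT load in megawatts
    WUE_EVAPORATIVE = 1.8      # litres per kWh (assumed, near often-cited averages)
    WUE_CLOSED_LOOP = 0.1      # litres per kWh (assumed top-up/make-up water only)

    def daily_water_litres(it_load_mw: float, wue_l_per_kwh: float) -> float:
        """Daily cooling water consumption for a given IT load and WUE."""
        kwh_per_day = it_load_mw * 1_000 * 24   # MW -> kW, then 24 hours
        return kwh_per_day * wue_l_per_kwh

    evaporative = daily_water_litres(IT_LOAD_MW, WUE_EVAPORATIVE)
    closed_loop = daily_water_litres(IT_LOAD_MW, WUE_CLOSED_LOOP)

    print(f"Evaporative cooling: {evaporative / 1e6:.2f} million litres/day")
    print(f"Closed-loop cooling: {closed_loop / 1e6:.2f} million litres/day")

Under these assumptions, a 20 MW facility on evaporative cooling draws close to the million-litres-a-day figure quoted earlier, while a closed loop cuts that draw by more than an order of magnitude, which is why recirculating designs feature so prominently in sustainability discussions.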

The development of such fluid-based systems will require a multidisciplinary approach, bringing together experts from engineering, materials science, and environmental studies. Collaboration between industry leaders, researchers, and policymakers will be crucial in driving innovation and implementing sustainable solutions that meet the needs of modern society. As demand grows for more efficient computing systems with a reduced environmental impact, the development and deployment of emerging fluid-based technologies will play a critical role in shaping the future of the IT sector.

The impact of emerging cooling technologies will not be limited to the environmental benefits they provide. The increased efficiency and reduced energy consumption of such systems will also lead to significant cost savings for operators, making these facilities more economically viable. Furthermore, the ability to reduce water usage will help to mitigate the effects of droughts and other water scarcity issues, ensuring a stable supply of this critical resource. The long-term benefits of these solutions will also extend to the broader community, contributing to a more sustainable and environmentally conscious society.


By Robin Mitchell

Robin Mitchell is an electronic engineer who has been involved in electronics since the age of 13. After completing a BEng at the University of Warwick, Robin moved into the field of online content creation, developing articles, news pieces, and projects aimed at professionals and makers alike. Currently, Robin runs a small electronics business, MitchElectronics, which produces educational kits and resources.