AI tools consume up to 4 times more water than estimated
A new report shows that artificial intelligence tools, including ChatGPT, use up to four times more water than previously believed. The finding raises concerns about the sustainability of #data_centers as AI continues to expand.
Researchers from the University of California, Riverside found that processing 10 to 50 queries on AI chatbots can consume up to 2 liters of water, far exceeding the earlier estimate of half a liter (►https://www.thetimes.com/uk/technology-uk/article/thirsty-chatgpt-uses-four-times-more-water-than-previously-thought-bc0pqsw). The increase is attributed to the intense cooling needs of data centers, where the servers generate significant heat.
According to Microsoft, the energy and water demands of AI models are much higher than anticipated. Between 2023 and 2024, Google, Microsoft, and Meta reported water usage increases of 17%, 22.5%, and 17% respectively, further highlighting AI's growing environmental footprint.
This is not just a U.S. issue. In the U.K., planned data centers are expected to consume as much water as a city the size of Liverpool. Meanwhile, in Ireland, data centers now account for 21% of the country’s electricity consumption.
OpenAI CEO Sam Altman recently presented a proposal to the White House to build at least five massive data centers, alongside an unprecedented expansion of energy capacity. Critics counter that generating the energy needed for AI remains inefficient, with 60% of resources wasted.
While tech companies pledge to offset their water usage by 2030, critics warn that these efforts may not sufficiently address water scarcity in regions where AI data centers are located.
▻https://san.com/cc/ai-tools-consume-up-to-4-times-more-water-than-estimated
#eau #chatgpt #IA #AI #intelligence_artificielle #centre_de_données