Generative AI’s environmental costs are soaring — and mostly secret


https://www.nature.com/articles/d41586-024-00478-x

Most experts agree that nuclear fusion won’t contribute significantly to the crucial goal of decarbonizing by mid-century to combat the climate crisis. The most optimistic estimate from Helion, the fusion start-up backed by OpenAI chief executive Sam Altman, is that by 2029 it will produce enough energy to power 40,000 average US households; one assessment suggests that ChatGPT, the chatbot created by OpenAI in San Francisco, California, is already consuming the energy of 33,000 homes. It’s estimated that a search driven by generative AI uses four to five times the energy of a conventional web search. Within years, large AI systems are likely to need as much energy as entire nations.
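The household comparisons above can be turned into a rough annual-energy figure. A minimal back-of-envelope sketch, assuming an average US household consumes about 10,600 kWh per year (an assumed figure, roughly in line with published US averages, not stated in the article):

```python
# Assumed average annual electricity use of a US household, in kWh.
# This number is an assumption for illustration; the 33,000- and
# 40,000-home figures come from the article itself.
AVG_US_HOUSEHOLD_KWH_PER_YEAR = 10_600

def homes_to_gwh_per_year(n_homes: int) -> float:
    """Convert a 'powers N homes' claim into annual energy in GWh."""
    return n_homes * AVG_US_HOUSEHOLD_KWH_PER_YEAR / 1e6

# ChatGPT's estimated current draw vs. Helion's 2029 production target.
chatgpt_gwh = homes_to_gwh_per_year(33_000)
helion_gwh = homes_to_gwh_per_year(40_000)

print(f"ChatGPT estimate: ~{chatgpt_gwh:.0f} GWh/year")
print(f"Helion 2029 target: ~{helion_gwh:.0f} GWh/year")
```

Under that assumption, a single chatbot's estimated draw is already within about 20% of what the fusion start-up hopes to produce at the end of the decade, which is the comparison the article is drawing.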

And it’s not just energy. Generative AI systems need enormous amounts of fresh water to cool their processors and generate electricity. In West Des Moines, Iowa, a giant data-centre cluster serves OpenAI’s most advanced model, GPT-4. A lawsuit by local residents revealed that in July 2022, the month before OpenAI finished training the model, the cluster used about 6% of the district’s water. As Google and Microsoft prepared their Bard and Bing large language models, both had major spikes in water use: increases of 20% and 34%, respectively, in one year, according to the companies’ environmental reports. One preprint [1] suggests that, globally, the demand for water for AI could be half that of the United Kingdom by 2027. In another [2], Facebook AI researchers called the environmental effects of the industry’s pursuit of scale the “elephant in the room”.