Understanding Energy and Water Usage in Large Language Models (LLMs) and AI
As artificial intelligence, particularly large language models (LLMs) like GPT, continues to grow in capability and popularity, questions around the environmental impact of these technologies have become increasingly urgent. Two critical resources involved in the lifecycle of AI systems are energy and water. These resources are consumed at different stages of development and deployment, each with distinct implications for sustainability.
Energy Consumption in AI
Energy usage in AI is primarily concentrated in two phases: training and inference.
- Training: Training a state-of-the-art LLM involves massive parallel computations over several weeks or months. For example, training GPT-3 is estimated to have consumed around 1,287 MWh of electricity—enough to power an average American home for over 120 years. Most of this energy goes into operating data centers, cooling infrastructure, and specialized hardware like GPUs and TPUs.
- Inference: Once trained, running the model (inference) consumes far less energy per query, but at scale inference can outweigh training in total energy consumption: billions of queries per day across cloud platforms make the cumulative energy cost substantial (a rough back-of-the-envelope comparison follows this list).
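To put these magnitudes in perspective, here is a minimal back-of-the-envelope sketch in Python. The 1,287 MWh training figure is the estimate cited above; the household consumption, per-query energy, and daily query volume are illustrative assumptions rather than measured values.

```python
# Back-of-the-envelope comparison of training vs. inference energy.
# The 1,287 MWh training estimate is the figure cited above; the home
# consumption, per-query energy, and query volume are illustrative
# assumptions, not measured values.

TRAINING_MWH = 1_287             # estimated GPT-3 training energy (from the text)
HOME_MWH_PER_YEAR = 10.7         # assumed average U.S. household consumption
WH_PER_QUERY = 3.0               # assumed energy per inference query (ballpark)
QUERIES_PER_DAY = 1_000_000_000  # hypothetical global query volume

home_years = TRAINING_MWH / HOME_MWH_PER_YEAR
daily_inference_mwh = QUERIES_PER_DAY * WH_PER_QUERY / 1e6  # Wh -> MWh
days_to_match_training = TRAINING_MWH / daily_inference_mwh

print(f"Training ~ {home_years:.0f} home-years of electricity")
print(f"Inference at {QUERIES_PER_DAY:,} queries/day ~ {daily_inference_mwh:,.0f} MWh/day")
print(f"Cumulative inference matches the training cost after ~ {days_to_match_training:.1f} days")
```

Under these assumptions, cumulative inference energy overtakes the one-time training cost within days, which is why per-query efficiency matters so much at scale.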
Water Usage and AI
Less discussed but equally important is the water footprint of AI. Water is used extensively in data centers for cooling, particularly when servers operate under high computational loads.
- According to a 2023 University of California, Riverside study, training GPT-3 in Microsoft's U.S. data centers is estimated to have directly consumed roughly 700,000 liters of clean freshwater, comparable to the water needed to manufacture several hundred cars.
- Inference also contributes to water use, especially in regions that rely on evaporative cooling systems. The same study estimates that GPT-3 consumes roughly 500 ml of water for every 10 to 50 medium-length responses, depending on when and where the model is deployed (a rough scaling sketch appears after this list).
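The per-response range above can be turned into a rough scaling estimate. This sketch uses the ~500 ml per 10 to 50 responses figure from the study; the daily query volume is a hypothetical input chosen only for illustration.

```python
# Rough scaling of the inference water footprint, using the
# ~500 ml per 10-50 medium-length responses range cited above.
# The daily query volume is a hypothetical assumption.

ML_PER_BOTTLE = 500              # ml of water per 10-50 responses (cited range)
RESPONSES_PER_BOTTLE = (10, 50)  # low and high end of the range
QUERIES_PER_DAY = 100_000_000    # hypothetical daily query volume

per_response_ml = sorted(ML_PER_BOTTLE / n for n in RESPONSES_PER_BOTTLE)  # [10.0, 50.0]
daily_liters = [QUERIES_PER_DAY * ml / 1000 for ml in per_response_ml]

print(f"Per response: {per_response_ml[0]:.0f}-{per_response_ml[1]:.0f} ml")
print(f"At {QUERIES_PER_DAY:,} queries/day: "
      f"{daily_liters[0]:,.0f}-{daily_liters[1]:,.0f} liters of water per day")
```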
Geographic and Temporal Variability
The environmental impact of LLMs also varies significantly depending on:
- Data center location: Water usage and energy sources differ by region. Data centers in dry or drought-prone areas can place additional stress on local water supplies.
- Time of day: Some companies have begun scheduling AI training at times when electricity grids are greener and water availability is higher.
- Cooling technology: Facilities using direct air or liquid immersion cooling tend to have different resource profiles than those using evaporative methods (the sketch after this list illustrates how this translates into water per query).
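One way to reason about these differences is through Water Usage Effectiveness (WUE), commonly reported as liters of water consumed per kWh of IT energy. The sketch below multiplies an assumed per-query energy figure by illustrative WUE values; both the energy figure and the WUE values are placeholder assumptions, not published measurements for any particular facility.

```python
# Illustrative effect of cooling technology and location on per-query water use.
# WUE (Water Usage Effectiveness) = liters of water consumed per kWh of IT energy.
# All numbers are placeholder assumptions chosen for illustration only.

WH_PER_QUERY = 3.0  # assumed energy per inference query (Wh)

wue_l_per_kwh = {
    "evaporative cooling, hot/dry region": 2.0,
    "evaporative cooling, temperate region": 1.0,
    "liquid or dry cooling, minimal evaporation": 0.2,
}

for setup, wue in wue_l_per_kwh.items():
    liters = (WH_PER_QUERY / 1000) * wue  # kWh per query * L/kWh
    print(f"{setup:45s}: ~{liters * 1000:.1f} ml of water per query")
```

The same query can thus carry a very different water cost depending on where and how the serving hardware is cooled, which is why siting and cooling choices feature so prominently in sustainability discussions.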
Toward Sustainable AI
Reducing the resource intensity of AI requires a multi-pronged approach. Leading companies are investing in energy-efficient chips, renewable energy sourcing, and model optimization to decrease carbon and water footprints. Additionally, efforts to improve model efficiency—such as smaller distilled models and dynamic computation—offer promising paths forward.
As AI continues to scale globally, addressing its energy and water demands will be crucial for ensuring that progress in machine intelligence aligns with the broader goals of environmental sustainability and responsible innovation.
📚 Further Reading on AI's Environmental Impact
- Explained: Generative AI’s Environmental Impact – MIT News
A clear breakdown of how generative AI models like ChatGPT consume electricity and water, and what that means for sustainability.
- As Use of A.I. Soars, So Does the Energy and Water It Requires – Yale E360
A comprehensive overview of AI’s growing energy and water demands, highlighting the need for sustainable practices.
- AI Energy And Water Usage Calculator
A calculator for estimating how much energy and water a single LLM response consumes.
- AI Has an Environmental Problem. Here's What the World Can Do – UNEP
Explores the environmental challenges posed by AI, including electronic waste and energy consumption, and suggests actionable solutions.
- AI Is Accelerating the Loss of Our Scarcest Natural Resource: Water – Forbes
Discusses the significant water consumption associated with AI technologies and the implications for global water resources.
- Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models – arXiv
An academic study revealing the substantial water usage of AI models and proposing methods to mitigate this hidden environmental cost.
- Exploring the Sustainable Scaling of AI Dilemma – arXiv
Analyzes the challenges of scaling AI sustainably, emphasizing the need for coordinated efforts across the AI value chain.
- The Hidden Cost of AI Energy Consumption – Wharton
Highlights the often-overlooked energy costs of AI and the importance of proactive measures to address them.
- Artificial Intelligence: How Much Energy Does AI Use? – UNRIC
Provides insights into AI's energy consumption patterns and the global efforts to make AI more energy-efficient.
- AI's Environmental Impact: Energy Consumption and Water Use – Planet Detroit
Examines the environmental footprint of AI, focusing on energy and water usage, and the potential strain on infrastructure.
- Can We Mitigate AI's Environmental Impacts? – Yale School of the Environment
Discusses strategies for reducing AI's environmental impact, including enhancing energy efficiency and environmental monitoring.