Generative AI promises unprecedented advancements, but its rapid development brings significant environmental challenges, including surging energy and water consumption.
Key Points at a Glance
- Training generative AI models like GPT-4 requires immense computational power, driving increased electricity and water usage.
- Data centers supporting AI demand have doubled their energy consumption in recent years, rivaling the electricity use of entire nations.
- Generative AI inference—the daily operations and user queries that follow training—carries ongoing environmental costs even after a model is built.
- Short AI model lifecycles exacerbate resource waste, as newer, larger models quickly replace older versions.
Generative AI’s potential to revolutionize industries—from healthcare to creative arts—has been met with widespread enthusiasm. Yet, behind its rapid growth lies a significant environmental cost. Powerful models like OpenAI’s GPT-4 require vast computational resources to train and operate, contributing to increased energy use, carbon emissions, and water consumption.
“When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in,” says Elsa A. Olivetti, professor of materials science and engineering at MIT. “There are much broader consequences that go out to a system level.”
In a 2024 paper published by MIT researchers, Olivetti and her colleagues explore the climate and sustainability implications of generative AI. Their findings emphasize the urgent need to address these challenges as the technology becomes a cornerstone of modern applications.
Generative AI relies heavily on data centers, temperature-controlled facilities housing servers, storage drives, and network equipment. These centers power tasks ranging from model training to user queries. However, their energy requirements are staggering.
Scientists estimate that global electricity consumption by data centers reached 460 terawatt-hours in 2022, placing them among the world’s top electricity consumers. Driven by generative AI demand, this figure is expected to climb to 1,050 terawatt-hours by 2026.
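The projected rise is steep. As a rough illustration, the two totals cited above imply the following growth rate (the derived figures here are back-of-envelope arithmetic, not from the researchers):

```python
# Back-of-envelope check on the data center electricity figures above.
# Both totals (460 TWh in 2022, a projected 1,050 TWh in 2026) come from
# the article; the growth rate is derived here purely for illustration.
twh_2022 = 460
twh_2026 = 1050

growth_factor = twh_2026 / twh_2022         # total growth over four years
annual_rate = growth_factor ** (1 / 4) - 1  # implied compound annual rate

print(f"Total growth 2022-2026: {growth_factor:.2f}x")
print(f"Implied annual growth: {annual_rate:.0%}")
```

In other words, the projection amounts to data center electricity demand more than doubling in four years, compounding at roughly a fifth per year.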
“Generative AI training clusters consume seven or eight times more energy than typical computing workloads,” explains Noman Bashir, lead author of the MIT impact paper. Such clusters are responsible for sharp increases in data center construction and operation, often powered by fossil fuel-based energy grids.
Training OpenAI’s GPT-3, for instance, consumed an estimated 1,287 megawatt-hours of electricity, enough to power 120 average U.S. homes for a year. The process generated approximately 552 tons of carbon dioxide, underscoring the environmental trade-offs inherent in AI innovation.
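The household comparison can be sanity-checked with simple arithmetic. The 1,287 MWh and 552-ton figures are from the study cited above; the average annual consumption of a U.S. home (roughly 10,500 kWh) is an outside assumption based on approximate federal statistics:

```python
# Sanity-check the GPT-3 training figures cited above. The energy and
# carbon numbers come from the article; the ~10,500 kWh/year average for
# a U.S. home is an assumed approximate figure.
training_mwh = 1287
training_tons_co2 = 552
avg_home_kwh_per_year = 10_500  # assumed U.S. household average

homes_powered = training_mwh * 1000 / avg_home_kwh_per_year
carbon_intensity = training_tons_co2 * 1000 / (training_mwh * 1000)  # kg CO2/kWh

print(f"Equivalent homes powered for one year: ~{homes_powered:.0f}")
print(f"Implied grid carbon intensity: {carbon_intensity:.2f} kg CO2/kWh")
```

The result lands near the 120-home figure quoted in the article, and the implied carbon intensity of roughly 0.43 kg CO2 per kWh is consistent with a grid mix that leans heavily on fossil fuels.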
Data centers also require vast amounts of water for cooling: roughly two liters for every kilowatt-hour of energy consumed. This demand strains municipal water supplies and local ecosystems, posing a direct threat to biodiversity.
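Combining the two-liters-per-kilowatt-hour estimate with the GPT-3 training figure cited earlier gives a sense of scale. This is a rough illustration that joins two numbers from the article and ignores site-level variation in cooling technology and climate:

```python
# Rough estimate of the cooling water implied by the figures above:
# ~2 liters per kWh applied to GPT-3's reported 1,287 MWh of training
# electricity. Illustrative only; real usage varies widely by facility.
liters_per_kwh = 2
training_kwh = 1287 * 1000  # 1,287 MWh expressed in kWh

water_liters = liters_per_kwh * training_kwh
print(f"Implied cooling water: ~{water_liters / 1e6:.1f} million liters")
```

Even for a single training run, the implied water footprint runs into the millions of liters, which is why the researchers flag cooling as a direct pressure on municipal supplies.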
Meanwhile, the production of GPUs—the high-performance processors essential for AI workloads—carries its own environmental footprint. Manufacturing GPUs involves resource-intensive processes, including the mining of raw materials and the use of toxic chemicals. In 2023 alone, NVIDIA, AMD, and Intel shipped nearly 3.85 million GPUs to data centers, a sharp rise from 2022 figures.
“Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud,” Bashir notes. “Data centers are present in our physical world and have direct implications for water usage and biodiversity.”
The environmental toll of generative AI does not end after model training. Each user interaction, such as querying ChatGPT, requires energy-intensive computations. A single ChatGPT query consumes about five times more electricity than a simple web search.
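The per-query gap matters because inference happens at enormous scale. The five-fold multiplier is from the article; the per-search baseline and the daily query volume below are hypothetical assumptions chosen only to illustrate how small per-query differences compound:

```python
# Illustrative scaling of the per-query comparison above. The 5x multiplier
# comes from the article; the ~0.3 Wh web-search baseline and the daily
# query volume are hypothetical assumptions for illustration only.
web_search_wh = 0.3           # assumed energy per conventional web search
ai_query_wh = web_search_wh * 5
daily_queries = 100_000_000   # hypothetical daily query volume

daily_kwh = ai_query_wh * daily_queries / 1000
print(f"Energy per AI query: {ai_query_wh:.1f} Wh")
print(f"Hypothetical daily total: {daily_kwh:,.0f} kWh")
```

Under these assumed numbers, a day of queries would consume on the order of a hundred megawatt-hours, illustrating why inference, not just training, features in the researchers' accounting.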
“As a user, I don’t have much incentive to cut back on my use of generative AI,” says Bashir, emphasizing the disconnect between user convenience and environmental awareness. Moreover, the short shelf-life of AI models leads to resource waste, as companies release larger, more energy-hungry models at an accelerated pace.
The MIT team advocates for a holistic approach to mitigate generative AI’s environmental impact. This includes:
- Comprehensive impact assessments: Evaluating the trade-offs of AI applications against their perceived benefits.
- Transparent reporting: Encouraging companies to disclose the energy and water usage of their AI systems.
- Sustainable innovation: Investing in energy-efficient algorithms and alternative cooling technologies.
“We need a contextual way to systematically understand the implications of new developments in this space,” Olivetti concludes. By addressing these challenges, the AI industry can align its growth with global sustainability goals.