Artificial intelligence (AI) has rapidly permeated various aspects of modern life, from powering digital assistants and streamlining online shopping to enabling complex data analysis and scientific breakthroughs. However, this technological revolution comes with a hidden environmental cost: a rising carbon footprint that demands careful consideration.
The environmental impact of AI arises primarily from the substantial energy consumed to train and operate deep learning models. These models, especially large language models (LLMs) and generative AI systems, require enormous computational resources: training can involve thousands of graphics processing units (GPUs) running continuously for months, driving high electricity consumption. Training a single large AI model can produce around 626,000 lbs (roughly 284 metric tons) of carbon dioxide; the training of GPT-3, for instance, is estimated to have released 552 metric tons of carbon dioxide into the atmosphere.
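Training emissions like these are typically estimated from a simple chain: GPU count, per-GPU power, runtime, a data-center overhead factor (PUE), and the grid's carbon intensity. The sketch below illustrates that arithmetic; every input value is an illustrative assumption, not a measurement of any real training run.

```python
def training_emissions_tons(num_gpus, gpu_power_kw, hours, pue, grid_kg_co2_per_kwh):
    """Rough training-emissions estimate in metric tons of CO2.

    energy (kWh) = GPUs x per-GPU power (kW) x hours x PUE
    emissions (t) = energy x grid intensity (kg CO2/kWh) / 1000
    """
    energy_kwh = num_gpus * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh / 1000

# Hypothetical run: 1,000 GPUs at 0.4 kW each for 90 days (2,160 h),
# PUE of 1.2, and a grid emitting 0.4 kg CO2 per kWh -- all assumed values.
print(round(training_emissions_tons(1000, 0.4, 2160, 1.2, 0.4), 1), "t CO2")
```

Even with these modest assumed inputs, the estimate lands in the hundreds of tons, which is why the published figures for large models reach into the hundreds of metric tons.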
Data centers, the hubs of AI activity, are significant consumers of electricity and water. Globally, data centers consumed an estimated 460 terawatt-hours (TWh) of electricity in 2022, and by 2030 their demand is projected to more than double to around 945 TWh, exceeding Japan's total electricity consumption. In the United States, data centers could account for almost half of the growth in electricity demand between now and 2030. The demand for high-performance computing hardware also carries indirect environmental impacts from manufacturing and transport, and the advanced cooling systems in AI data centers consume large volumes of water, a serious concern in regions facing water scarcity.
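The 460 TWh and 945 TWh figures come from the text; the quick calculation below just makes the trajectory concrete, under the simplifying assumption of smooth year-over-year growth between 2022 and 2030.

```python
# Figures from the text; the constant-growth assumption is ours.
start_twh, end_twh, years = 460, 945, 8  # 2022 -> 2030

ratio = end_twh / start_twh
annual_growth = ratio ** (1 / years) - 1
print(f"{ratio:.2f}x overall, ~{annual_growth:.1%} implied annual growth")
```

A sustained growth rate near 9% per year is far above typical overall electricity-demand growth, which is why data centers loom so large in demand forecasts.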
The energy source powering AI infrastructure significantly affects its carbon footprint. AI workloads running in regions that rely predominantly on fossil fuels carry a much higher carbon footprint than the same workloads in areas with cleaner energy sources. Many tech companies claim to offset this usage by purchasing energy from renewable sources.
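The effect of grid mix can be shown with one multiplication: the same workload's emissions scale linearly with the grid's carbon intensity. The intensity values below are rough illustrative lifecycle figures, not measurements of any specific grid.

```python
# Same hypothetical 1 GWh workload on two grids.
# Intensities (kg CO2/kWh) are rough illustrative values, not measurements.
energy_kwh = 1_000_000
intensities = {"coal-heavy grid": 0.82, "hydro/wind-heavy grid": 0.02}

for grid, kg_per_kwh in intensities.items():
    print(f"{grid}: {energy_kwh * kg_per_kwh / 1000:.0f} t CO2")
```

Under these assumptions the identical workload emits roughly 40 times more CO2 on the fossil-heavy grid, which is why data-center siting matters so much.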
The use of AI to make predictions, known as inference, also contributes to the overall carbon footprint. Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search. While the energy consumption per prompt may seem small, the cumulative impact of billions of daily queries is enormous. Google estimates that the median Gemini Apps text prompt consumes 0.24 watt-hours (Wh) of energy, roughly equivalent to watching nine seconds of television.
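The cumulative effect is easy to quantify. Using the 0.24 Wh per-prompt figure from the text and a hypothetical volume of one billion queries per day (an assumption for illustration, not a reported number):

```python
wh_per_query = 0.24          # Google's median Gemini figure, from the text
queries_per_day = 1_000_000_000  # hypothetical volume for illustration

mwh_per_day = wh_per_query * queries_per_day / 1_000_000
print(f"{mwh_per_day:.0f} MWh per day")
```

At that assumed scale, individually tiny prompts add up to hundreds of megawatt-hours every day, on the order of the daily consumption of a small town.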
E-waste is another growing concern. GPUs and other high-performance computing (HPC) components have short lifespans, and obsolete or damaged hardware is frequently discarded, adding to a mounting stream of electronic waste.
Despite these challenges, AI also offers potential solutions for mitigating climate change. AI can improve the efficiency of renewable energy systems, predict floods, and optimize traffic flows. It can also be used in satellite monitoring to track global climate change impacts and progress toward sustainability targets. Furthermore, AI can generate complex forecasts that model future scenarios in real time, benefiting climate resilience and adaptation efforts.
To address the rising carbon footprint of AI, several strategies can be adopted: siting data centers in regions with cleaner energy sources, improving software and hardware efficiency, optimizing models to reduce the computation needed per task, and extending the useful life of hardware to curb e-waste.
Google's efforts demonstrate that significant progress is possible. Over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt fell by factors of 33 and 44, respectively, while the quality of responses improved. These gains are attributed to research innovations and to software and hardware efficiency improvements.

The rising carbon footprint of AI presents a significant environmental challenge. However, by understanding the sources of this impact and implementing strategies to mitigate it, we can harness the power of AI while minimizing its environmental consequences.