Generative AI consumes significant amounts of electricity and water to operate, and the problem is projected to grow
Technology never operates in isolation, as the surge in cryptocurrencies over the past couple of years made clear. While many people were reaping substantial profits from investing in bitcoin and its rivals, concerns grew about the environmental damage being driven by these get-rich-quick speculators.
The process of mining cryptocurrency put a significant strain on the environment. The underlying idea was that wealth had to be earned through effort: to produce a bitcoin or other cryptocurrency, the first step was to “mine” it. That required computers to solve complex computational puzzles, and each successful solution added a new entry to the blockchain.
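Those “complex computational puzzles” are, more precisely, proof-of-work hash puzzles: a miner repeatedly hashes a block of data with a changing nonce until the result falls below a difficulty target. The minimal Python sketch below is purely illustrative – the data, difficulty and function are toy values, not bitcoin’s actual parameters.

```python
import hashlib

def mine(block_data: str, difficulty_bits: int = 16):
    """Toy proof-of-work: find a nonce whose SHA-256 hash has `difficulty_bits` leading zero bits."""
    target = 2 ** (256 - difficulty_bits)  # smaller target = harder puzzle = more compute
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if int(digest, 16) < target:
            return nonce, digest  # a valid solution; in bitcoin this earns the right to add a block
        nonce += 1  # every failed attempt is electricity spent for nothing

nonce, digest = mine("example transactions")
print(nonce, digest)
```

The energy cost follows directly from the design: on average, millions (in bitcoin’s case, many trillions) of failed hashes are computed for every one that succeeds.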
With the advent of the crypto craze, people began mining at scale, buying up high-powered computer chips known as GPUs (graphics processing units) that could mine cryptocurrency much faster than ordinary off-the-shelf components. Demand for these chips was so intense that Goldman Sachs estimated 169 industries were affected by the 2022 chip shortage. The chips also required substantial amounts of electricity to run; bitcoin mining alone consumes more electricity than Norway and Ukraine combined.
The environmental impact of the cryptocurrency frenzy is still being assessed, as highlighted by the Guardian in its report this April.
The AI Revolution and Its Environmental Footprint
A rapidly growing area of technology relies on those same GPUs just as heavily as crypto mining – if not more so – yet has received far less scrutiny over its environmental impact: the AI revolution.
Generative AI tools, such as ChatGPT and Google Bard, heavily rely on GPUs – sophisticated computer chips capable of handling billions of calculations per second. Google, for instance, utilizes its own similar technology called tensor processing units (TPUs).
Sasha Luccioni, a researcher specializing in ethical and sustainable AI at Hugging Face – the platform through which Meta recently released its open-source large language model Llama 2, and which has emerged as something of a conscience for the AI industry – emphasizes the need for more discussion about AI’s environmental impact.
Luccioni argues that if AI is to be used to help save the planet, its own environmental impact must be addressed first. It would be counterproductive to contribute to environmental harm, such as deforestation, and then rely on AI merely to track and mitigate the consequences.
Calculating the Carbon Impact
Luccioni is one of several researchers attempting, with difficulty, to quantify AI’s environmental impact. The effort is hindered by numerous challenges, including the reluctance of the companies behind popular AI tools, and of the chipmakers that power them, to disclose details of their energy consumption.
The intangible nature of AI further complicates accurate measurement of its environmental footprint. Unlike physical products, AI is ephemeral, and as a result it is often left out of environmental initiatives and pledges; even companies making genuine efforts on sustainability rarely prioritize it.
This ephemerality extends to end users, too. We can readily see the harm of turning on our cars in the exhaust fumes they produce, but AI’s impact remains largely invisible: the cloud-based servers, the chips doing the processing and the substantial volumes of water used to cool data centers all go unnoticed by users.
Putting Numbers to the Problem
Let’s begin with water consumption. One academic study estimates that training GPT-3 used approximately 3.5 million liters of water through data center cooling – and that assumes the training took place in more efficient US data centers. Had Microsoft’s data centers in Asia been used instead, the figure would rise to nearly 5 million liters.
Before GPT-4 was integrated into ChatGPT, the researchers estimated that the chatbot consumed about 500ml of water – roughly a standard-sized water bottle – for every 20 questions and corresponding answers. With the introduction of GPT-4, they predicted, ChatGPT’s water consumption was likely to rise even further.
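As a rough back-of-the-envelope reading of that estimate, 500ml per 20 exchanges works out to about 25ml per question and answer. The short sketch below simply restates that arithmetic; the daily-usage scaling uses an assumed, illustrative number of exchanges, not any reported figure.

```python
# Back-of-the-envelope arithmetic based on the figures cited above.
WATER_PER_20_EXCHANGES_ML = 500                          # ~one standard water bottle per 20 Q&A exchanges
water_per_exchange_ml = WATER_PER_20_EXCHANGES_ML / 20   # ≈ 25 ml per question-and-answer

# Hypothetical scaling: daily water use for an assumed (not reported) number of exchanges.
assumed_daily_exchanges = 10_000_000
daily_water_liters = assumed_daily_exchanges * water_per_exchange_ml / 1000
print(f"{water_per_exchange_ml:.0f} ml per exchange, "
      f"{daily_water_liters:,.0f} liters/day at {assumed_daily_exchanges:,} exchanges")
```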