The paradox of AI’s environmental impact: what you need to know
- 20 May
- Tineke van Heijst
- Updated 21 May
This is blog 16 in the blog series about Green IT.
On 10 October 2023, an article was published with the fairly disturbing title (translated from Dutch): “Energy-guzzling AI: power consumption could equal that of the Netherlands four years from now”. The article was based on research by Dutch consultant and researcher Alex de Vries, who showed that if artificial intelligence is embraced throughout the world, its energy consumption will rapidly increase to equal that of entire countries – Ireland, the Netherlands or Sweden, for example. Yet despite this threat, AI can also contribute to the energy transition.
The two previous blogs discussed what AI is exactly and how it is used in the heritage sector. In this post, we will talk about the impact of AI on the environment. Because, although AI models offer remarkable opportunities for the heritage sector, this comes with a major price tag: AI already uses tremendous amounts of power, and that will only increase further over the coming years. Is the use of this technology worth the cost? And if we do choose to use it, how can we mitigate its environmental impact?
Anatomy of AI
In 2023, VPRO’s programme Tegenlicht presented a documentary titled De prijs van AI, or The Cost of AI, (1) which shows that behind AI’s success lies a hunger for ever more resources: more data, more computing power and ever-bigger server farms. When you zoom out, say the makers, ‘you see a hungry beast that needs to be fed with the very fastest chips, enormous data sets and poorly paid labour – in the form of silicon mines, endless rows of power-guzzling servers, and Syrian workers labelling data for the next generation of generative AI.’ AI, the makers conclude, is no divine machine: it is an industry fuelled by blood, sweat and metals. A system of extraction and exploitation on an industrial scale, with severe consequences for the planet and humanity.
The documentary refers to the project Anatomy of an AI System, created by Professor Kate Crawford of New York University and Professor Vladan Joler of the University of Novi Sad’s Academy of Arts. (2) Their project shows that although artificial intelligence may often feel like a distant and abstract thing, it is in fact ubiquitous throughout our daily lives. Anatomy of an AI System analyses the enormous networks that support the life cycle of a single smart Amazon Echo speaker. The diagram showing all the elements involved in even a simple request like “Alexa, turn on the light” or “Alexa, what time is it?” covers an entire wall. This data visualisation shows just how many resources are actually needed for the production, distribution and, once the consumer is done with it, disposal of a speaker.
Though invisible to the users, this giant system is what it takes to answer all our questions and satisfy all our needs, and the immense computing power and memory capacity that its systems require consume huge amounts of energy. The next section of this blog focuses specifically on AI’s energy consumption in the training phase and the phase in which users interact with the system.
Training phase: the biggest energy guzzler
Before a smart assistant like Alexa is capable of understanding questions, it has to go through rigorous training. AI models need hours or even weeks of training to be able to perform optimally, during which the lion’s share of the energy is consumed by the graphics processors (GPUs) and computer memory (RAM).
The training of an AI model is an ongoing process in which trial and error plays a leading role. By building, refining and improving the algorithm, it becomes increasingly accurate in its predictions. The training phase has to be constantly repeated to keep the model up to date and to improve its understanding of various contexts.
The training phase of AI models is the most energy-intensive phase, which is why it is the primary focus of scientific research. During this phase, immense data sets are fed into the AI model (e.g. ChatGPT). The model’s parameters, which start out as random, are continually updated and refined. In this way, large language models learn to accurately predict specific words or sentences based on context.
To understand the sheer scope of this training, let’s look at a specific example. A large language model like the one behind ChatGPT is trained on terabytes of data and has 175 billion parameters or more. The energy consumption of a single training run for such a model causes around 500 tonnes of CO2 emissions. (3) Meanwhile, the BLOOM model (4) hosted by Hugging Face uses as much power in a training run as 40 American households consume in an entire year (about 433 MWh). (5)
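A quick back-of-envelope check on that household comparison. Note that the average US household figure of roughly 10,800 kWh per year is our own assumption for illustration, not a number from the cited study:

```python
# BLOOM's reported training consumption (5)
bloom_training_mwh = 433

# Assumed average annual consumption of a US household (~10,800 kWh);
# this figure is an assumption, not taken from the cited study
us_household_kwh_per_year = 10_800

household_years = bloom_training_mwh * 1_000 / us_household_kwh_per_year
print(f"{household_years:.0f} US household-years")  # ≈ 40
```

The result matches the "40 American households" comparison in the text.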
Inference phase: energy consumption through use
After their intensive training, AI models are moved to the production environment for the inference phase. For tools like ChatGPT, this means that the tool provides real-time answers to users’ questions, which further feed the model. Scientific research by people like Alex de Vries suggests that this inference phase plays a significant role in the total energy consumption of an AI model. (6)
According to Google, the inference phase was responsible for no less than 60% of the energy consumed by AI systems between 2019 and 2022. Corporations like Amazon and NVIDIA even estimate that the inference costs can be responsible for up to 90% of the machine learning costs of deployed AI systems. (7) The exact amount of energy required in this phase depends on various factors, such as the frequency of a model’s retraining and the delicate balance between a model’s performance and its energy consumption.
ChatGPT, BLOOM, Bing Chat and Bard (to name a few) have now been available to users for about a year, and already these tools have become indispensable, especially in certain sectors. In early 2023, ChatGPT was already answering some 195 million questions per day (8), resulting in a daily energy consumption of 564 MWh. That’s approximately 2.9 Wh per question – about the same as an LED lamp uses in an hour. And as the demand for AI continues to grow, so will the amount of energy consumed.
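The per-question figure follows directly from the reported daily totals; a minimal check:

```python
daily_questions = 195_000_000   # early-2023 estimate (8)
daily_energy_mwh = 564          # reported daily consumption

wh_per_question = daily_energy_mwh * 1_000_000 / daily_questions
print(f"{wh_per_question:.1f} Wh per question")  # ≈ 2.9 Wh
```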
AI’s secret footprint
In addition to its staggering use of electricity, AI also uses vast amounts of water. A recent study by the University of California, Riverside and the University of Texas at Arlington estimates that Microsoft’s data centres used 700,000 litres of drinking water to train GPT-3. A simple ChatGPT session of twenty to fifty questions easily costs half a litre of water – which may not sound like much, until you realise that this chatbot has more than a hundred million active users. (9)
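To get a feel for the aggregate, here is a deliberately rough sketch. The assumption that every active user runs just one such session is ours, purely for illustration:

```python
litres_per_session = 0.5        # one session of 20-50 questions (9)
active_users = 100_000_000      # reported lower bound on active users

# Hypothetical aggregate if each active user ran a single session
total_litres = litres_per_session * active_users
print(f"{total_litres / 1e6:.0f} million litres")  # 50 million litres
```

Even under this minimal assumption, the aggregate water cost is already tens of millions of litres.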
This is a concerning trend, especially since the United Nations Environment Programme has forecasted that the proportion of people suffering from water shortages for at least one month per year will increase from one third to one half of the world population by 2030. (10)
It should be noted that the big tech companies are trying to find solutions to this issue. Amazon, Meta, Google and Microsoft are all striving to achieve water positivity by 2030, i.e. to restore more water than they consume.
However, they are faced with a difficult challenge: their CO2 footprints and their water footprints are often diametrically opposed. To reduce CO2 emissions, you want to run your computing tasks while the sun is shining as much as possible. But that is also the hottest part of the day, which means you need more water for cooling.
In other words, a balance between the two will have to be found. And it will be necessary to rethink the geographic distribution of computing tasks, for example by training systems in locations that get a lot of sun, but have lower temperatures, like northern Scandinavia.
The study mentioned above is one of the first to investigate AI’s usage of water, so the results have yet to be fully substantiated. What is certain, though, is that more research is needed to better understand the impact of AI on water consumption.
What the crystal ball says about energy consumption
In his paper The growing energy footprint of artificial intelligence, (11) Alex de Vries calculated how energy consumption would increase if Google were to incorporate generative AI into every search operation. He found that Google’s energy consumption could rise to 29.3 TWh per year, approximately the same as the entire country of Ireland.
His conclusions are based on two studies. One, performed by SemiAnalysis, calculated that in order to optimise its search engine with generative AI, Google would need 512,821 Nvidia servers, containing just over 4 million graphics processors (GPUs). This would lead to an energy consumption of 80 GWh per day and 29.2 TWh per year.
The other study, by New Street Research, came up with similar figures: based on a requirement of 400,000 AI servers with a daily consumption of 62.4 GWh, they calculated an annual consumption of 22.8 TWh.
For the sake of comparison: the annual energy consumption of a Dutch household in 2021 was 2,500 kWh. (12) With the amount of energy that Google would then be using, you could power roughly 9 to 12 million Dutch households for an entire year.
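The figures line up; a sketch using the SemiAnalysis scenario (80 GWh per day) and the Milieu Centraal household figure:

```python
daily_gwh = 80                         # SemiAnalysis scenario
annual_twh = daily_gwh * 365 / 1_000   # ≈ 29.2 TWh per year

dutch_household_kwh = 2_500            # annual consumption, 2021 (12)
households_millions = annual_twh * 1e9 / dutch_household_kwh / 1e6
print(f"{annual_twh:.1f} TWh ≈ {households_millions:.1f} million households")
```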
How likely are these doom scenarios?
Both studies may be considered worst-case scenarios. Although the use of generative AI is increasing rapidly, production constraints limit the availability of the required hardware. And the financial investment required for a full-scale rollout of AI is so high that it would pose an excessive risk to the business continuity of even the largest tech companies.
Moreover, technological developments are moving fast. The tech sector realises that an unbridled growth in energy consumption is not feasible, and is therefore committing fully to efficiency. This includes improvements in hardware as well as in the design of AI models, the number of parameters with which the models are trained and retrained, the size of the data sets used and the speed at which the models carry out their tasks. Each of these factors affects energy consumption, which also explains why it is so difficult to realistically estimate current and future energy usage.
The role of hardware innovation
AI models are predicted to become more efficient in the future, but that brings the Jevons paradox into play (as we previously saw with the growing efficiency of data storage). In simple terms, this paradox states that as the efficiency of a technology improves, its usage increases as well. Ultimately, the growing number of applications results in a net increase in energy consumption.
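The paradox is easy to see with toy numbers (illustrative only, not measurements): halving the energy per task does not help if cheaper tasks triple the usage.

```python
# Baseline: 100 tasks at 1 unit of energy each
baseline = 1.0 * 100

# Efficiency doubles (energy per task halves), but usage triples
rebound = (1.0 / 2) * (100 * 3)

print(rebound > baseline)  # True: 150 vs 100 units
```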
The world’s hopes are therefore pinned on hardware innovations. Corporations like IBM are researching how to make computer chips more energy-efficient, for example by using new materials or changing the production processes.
From a Green IT perspective, however, that creates another problem: when new, more efficient hardware becomes available, servers are replaced more quickly. This creates more e-waste, and requires the production of more new servers – and the majority of a server’s CO2 emissions occur during the production phase.
Or is AI part of the solution to the problem?
This blog would be incomplete if we only looked at the downsides of AI. In the first two blog posts on Green IT, we talked about how Green IT has two sides: the negative, which involves growing CO2 and water footprints, and the positive, namely the opportunities that AI offers to accelerate the energy transition and tackle the consequences of climate change.
The European Green Deal has therefore identified AI as one of the major spearheads in the battle against climate change. AI can help achieve the Green Deal’s objectives by rapidly analysing large quantities of data so that we can better understand environmental issues and take targeted action. AI provides faster, better insights that help inform planning, decision-making and monitoring of environmental policy.
AI can also contribute to the energy transition by monitoring and optimising energy consumption. The same goes for transitions in various other sectors, such as agriculture, construction and transportation, as well as the transition to a more circular economy. The report The Role of Artificial Intelligence in the European Green Deal lists countless examples. Commissioned by the European Parliament, this report also shines a light on the negative aspects of AI and calls for due consideration when implementing AI. (13)
In conclusion: a lot of research is still needed to make AI more energy-efficient. Luckily, those efforts are in full swing, and major steps forward are already being taken. Our next blog post discusses the research being done into Green AI.
Sources
1. VPRO Tegenlicht, ‘De prijs van AI’ (‘The Cost of AI’), aired on 8 June 2023.
2. Crawford, K. & Joler, V., ‘Anatomy of an AI System: The Amazon Echo As An Anatomical Map of Human Labor, Data and Planetary Resources’, AI Now Institute and Share Lab, 7 September 2018.
3. NOS Nieuws, ‘Kunstmatige intelligentie vreet stroom, één opdracht hetzelfde als een uur een lamp aan’ (‘Artificial intelligence guzzles power: one task equals an hour of lamplight’), published on 31 May 2023.
4. The BLOOM model, hosted by Hugging Face, is a freely available language model that understands texts in 46 different languages. It was developed by the BigScience community.
5. Luccioni, A. et al., ‘Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model’, published on 3 November 2022.
6. De Vries, Alex, ‘The growing energy footprint of artificial intelligence’, Joule 7, 2191-2194, 18 October 2023, Elsevier.
7. Desislavov, R. et al., ‘Trends in AI inference energy consumption: Beyond the performance-vs-parameter laws of deep learning’, Sustainable Computing: Informatics and Systems, Volume 38, April 2023.
8. De Vries, Alex, ‘The growing energy footprint of artificial intelligence’, Joule 7, 2191-2194, 18 October 2023, Elsevier.
9. Li, Pengfei et al., ‘Making AI Less “Thirsty”: Uncovering and Addressing the Secret Water Footprint of AI Models’, arXiv:2304.03271v3, published on 29 October 2023.
10. Guerrini, Federico, ‘AI’s Unsustainable Water Use: How Tech Giants Contribute to the Global Water Shortages’, published on 14 April 2023.
11. De Vries, Alex, ‘The growing energy footprint of artificial intelligence’, Joule 7, 2191-2194, 18 October 2023, Elsevier.
12. Figures obtained from the Milieu Centraal website: ‘Gemiddeld energieverbruik in Nederland’ (‘Average energy consumption in the Netherlands’), last consulted on 21 November 2023.
13. European Parliament, ‘The role of Artificial Intelligence in the European Green Deal’, May 2021.