As the number of alarmist headlines about AI’s energy usage grows, Ian Hodgkinson and Thomas Jackson consider the questions governments should ask as they grapple with how to enable sustainable artificial intelligence.

Across media outlets, a growing number of news items and opinion pieces highlight the energy intensive nature of artificial intelligence (AI). In most instances, commentators across policy, industry and academia outline the threat AI poses to the global energy infrastructure and its wider negative environmental footprint. This includes both the use of finite natural resources in applications such as cooling, and the associated greenhouse gas emissions, for example from data centres.

The ‘shock and awe’ this generates is essential for raising public awareness. But if the problem is as bad as reported, why are governments not tackling it head on? The answer is that, at present, they lack the means to measure the true energy costs involved.

Policy and government attention has focused on the energy intensive nature of data centres, a key component of the AI infrastructure, and on the need for enhanced efficiency through innovation and technological advancement. Indeed, as part of the UK’s Industrial Decarbonisation Strategy, the Industrial Energy Efficiency Accelerator has driven the development of highly efficient cooling technologies for data centres, with the aim of contributing to net zero.

These actions are important, but the AI energy infrastructure is much larger than data centres and storage alone. Consequently, the future AI energy crisis is even more troubling than recent reports would suggest.

AI and energy consumption

Generative AI is a category of techniques and models of particular concern because, unlike traditional AI systems, generative AI models are trained to learn and mimic patterns from vast amounts of data. Taking ChatGPT as an example, data serves as the backbone for training such a sophisticated AI model, and every piece of data follows its own energy consumption journey across network, compute and storage.

Data are critical inputs to training AI models, and with the increasing use of AI in all walks of life, total data creation is forecast to explode over the next decade, reaching 2,142 zettabytes in 2035. Within the data flows required to train generative AI, every stage, from data acquisition, through its use in building training models, to ongoing storage, generates CO₂. Yet the environmental impact of data remains a hidden contributor to emissions from global energy use.
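
To illustrate what accounting for that journey might look like, the minimal sketch below tallies energy and emissions for a given volume of data across the three stages named above. Every coefficient is a hypothetical placeholder, not a measured value: published estimates of network, compute and storage energy intensity vary by orders of magnitude, which is precisely the measurement gap this article describes.

```python
# A minimal sketch of per-data-unit energy accounting. All coefficients
# are hypothetical placeholders, not measured values: published estimates
# for network, compute and storage energy intensity vary by orders of
# magnitude, which is the measurement gap discussed in this article.

KWH_PER_GB_NETWORK = 0.06      # hypothetical: moving 1 GB across a network
KWH_PER_GB_COMPUTE = 0.50      # hypothetical: processing 1 GB during training
KWH_PER_GB_YEAR_STORED = 0.01  # hypothetical: storing 1 GB for one year
KG_CO2E_PER_KWH = 0.2          # hypothetical grid carbon intensity

def data_lifecycle_footprint(gigabytes: float, years_stored: float) -> dict:
    """Estimate energy (kWh) and emissions (kg CO2e) for one data unit's
    journey across network, compute and storage."""
    energy_kwh = gigabytes * (
        KWH_PER_GB_NETWORK
        + KWH_PER_GB_COMPUTE
        + KWH_PER_GB_YEAR_STORED * years_stored
    )
    return {
        "energy_kwh": energy_kwh,
        "emissions_kg_co2e": energy_kwh * KG_CO2E_PER_KWH,
    }

# Example: a 1 TB training corpus, then kept in storage for five years.
print(data_lifecycle_footprint(gigabytes=1_000, years_stored=5.0))
```

Even a crude model like this makes the governance problem visible: the answer changes dramatically depending on which coefficients one believes.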

The broad areas of concern are:

  • Data sets: large data sets for Large Language Models (LLMs) raise concern because of the significant energy consumed during training, which carries a substantial carbon footprint. These models are designed to process and generate human-like textual responses by learning from vast amounts of text data, and the sheer size and complexity of those data sets necessitate extensive computational resources and energy. Duplicating such data sets exacerbates these concerns, highlighting the need for sustainable practices in AI development
  • Dark data: the vast amounts of data used to build AI models can be forgotten once they have served their initial purpose. This accumulation creates a substantial volume of digital waste consisting of Redundant, Obsolete and Trivial (ROT) data. Storing this data over time not only consumes energy but also poses data management and sustainability challenges, underlining the need for efficient data governance practices to mitigate environmental impact and optimise resource use in AI development
  • Data flow: models like GPT-3 are monumental in size and capacity, representing some of the largest language models yet developed, with an extensive number of parameters. The impact of training these LLMs often goes unnoticed. Incremental training runs, substantial data storage requirements and the compute intensive nature of AI operations collectively contribute significantly to the overall energy consumed in building and running AI engines (a rough estimate is sketched after this list), underscoring the critical need for sustainable practices in AI development
  • Carbon spillovers: the AI energy infrastructure is global, with critical components sometimes located in middle-income and low-income economies. A potential consequence is a new digital divide, driven by where the energy hungry infrastructure is located. Shifting the carbon footprint so that it spills over from one country to another (for example, in pursuit of cheaper energy) is not a genuine way to reduce a corporation’s carbon footprint
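
To make the ‘data flow’ concern concrete, training energy can be roughly approximated using the widely cited rule of thumb that training a transformer takes about six floating-point operations per parameter per training token. The hardware figures below (sustained throughput and power draw) are illustrative assumptions, not measurements, and real training adds cooling and data centre overheads on top.

```python
# A back-of-the-envelope sketch of LLM training energy using the widely
# cited approximation: total FLOPs ~= 6 * parameters * training tokens.
# Hardware throughput and power draw below are illustrative assumptions,
# not measurements; real runs add cooling and data centre overheads (PUE).

def training_energy_kwh(params: float, tokens: float,
                        flops_per_second: float, watts: float) -> float:
    """Estimate training energy in kWh from the ~6*N*D FLOPs rule of thumb."""
    total_flops = 6 * params * tokens
    seconds = total_flops / flops_per_second  # effective accelerator-seconds
    joules = seconds * watts
    return joules / 3.6e6                     # 1 kWh = 3.6 million joules

# GPT-3-scale example: ~175 billion parameters, ~300 billion training tokens,
# assuming hardware that sustains 1e14 FLOP/s at 300 W (hypothetical figures).
print(f"{training_energy_kwh(175e9, 300e9, 1e14, 300):,.0f} kWh")
# -> roughly 262,500 kWh before overheads
```

For comparison, published estimates that account for real hardware efficiency and overheads put GPT-3’s training run at roughly 1,300 MWh, several times the figure above; the spread between such estimates is itself evidence of the measurement problem.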

The energy consumed across the generative AI lifecycle is significantly higher than that of a typical end-user request, which is itself already substantial. Given this, what are governments waiting for? The AI lifecycle involves multiple stages, from model creation to end-user use, each contributing to a substantial carbon footprint, and this complexity makes accurately measuring AI’s overall energy consumption particularly challenging.

The challenge for governments

The central question is: who controls the narrative on AI’s environmental impact? Responsibility is shared among all actors in the AI ecosystem: end users contribute to the data surge, companies introduce previously unforeseen technologies, data firms often lack transparency, and governments struggle to develop effective metrics for policy and regulation.

As a result, governments face significant constraints in addressing AI’s environmental impact, leaving them largely hamstrung. This may explain why, in a recent poll of G7 countries, the ‘sustainability development and energy use’ of generative AI ranked joint last for urgency among the 13 priorities highlighted in the OECD AI Principles. That ranking contrasts sharply with the ‘shock and awe’ narrative often presented in the media.

What the science says

Every unit of data may consume varying amounts of energy throughout its lifecycle: while traversing a network, during processing, and in storage across different mediums. The science, however, offers no consensus on how to calculate these costs, and that lack of consensus is troubling, because such a calculation will be critical in enabling governments and policy bodies to set AI policy benchmarks and manage the environmental impact of AI into the future.

Increased transparency from industry leaders, and standardised reporting practices driven by public administrations and policy bodies, are essential. These measures would significantly help in developing a clearer picture of the environmental impact of our data-driven world and in implementing responsible AI to achieve net zero emissions by 2050.
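
What a standardised report should contain remains an open question. As a purely illustrative sketch, assuming no existing schema, a minimal machine-readable record might break energy figures down by lifecycle stage and grid region, so that the carbon spillovers described above become visible:

```python
# A purely illustrative sketch of what a standardised AI energy report
# record might contain. No such schema currently exists; all field names
# are hypothetical, chosen to expose the lifecycle-stage and grid-region
# breakdown the article argues governments currently lack.
from dataclasses import dataclass

@dataclass
class AIEnergyReport:
    system_name: str       # model or service being reported on
    lifecycle_stage: str   # e.g. "data acquisition", "training", "inference", "storage"
    energy_kwh: float      # metered or modelled energy for this stage
    grid_region: str       # where the energy was drawn, to track carbon spillovers
    carbon_kg_co2e: float  # emissions at that region's carbon intensity
    is_estimated: bool     # True if modelled rather than metered

report = AIEnergyReport("example-llm", "training", 250_000.0, "UK", 50_000.0, True)
print(report)
```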