As industries harness its power to reduce waste and optimise energy use, AI’s own carbon footprint raises urgent questions about its long-term impact. Here, Sarah Burnett FBCS considers AI as both a catalyst for sustainability and a growing environmental concern.
The growing use of AI has brought us to a critical juncture in our history, posing one of the most compelling questions of our time.
AI offers great potential to accelerate global sustainability endeavours by improving energy efficiency across many industries including manufacturing, telecommunications and pharmaceuticals. On the one hand, when directly applied to optimise the use of heat, power and water within heavy industrial processes, AI delivers immediate and substantial efficiency gains. It also contributes indirectly but powerfully when applied to streamlining complex business operations, modernising processes and automating decision making.
These use cases are not just academic: a growing body of evidence demonstrates concrete improvements, such as major reductions in material waste, greenhouse gas emissions and electricity consumption.
On the other hand, the very engine that is driving these advances has an insatiable and increasing appetite for energy. The rapid growth in demand for AI, particularly the proliferation of the huge data centres that are needed to power it, has put the focus squarely on the technology’s own environmental footprint. The substantial power consumption associated with training and deploying large scale AI models makes it very challenging for us to assess the overall net effect of the technology on global energy use.
And here lies the dichotomy: is AI’s impact a net positive on the environment, or are the efficiency outcomes that it delivers simply masking a deeper energy paradox? The giants of the industry are adopting innovative solutions such as advanced liquid cooling, waste heat recovery and carbon-aware computing to lessen their environmental impact. Yet the true balance between the energy savings enabled by AI and its own burgeoning consumption remains a mystery.
The promise: AI as an engine for efficiency
The positive environmental impact of AI comes from two broad categories of application: direct and indirect. Direct applications involve using AI specifically and actively to manage and optimise physical systems, such as heat and water usage, in heavy industries like oil refineries and water treatment plants. Indirect applications focus on improving the efficiency of business processes, digital infrastructure and human productivity through augmentation with AI.
Direct applications in physical systems
The most visible benefits of AI in sustainability come from its direct integration into industrial plants and the underlying technology infrastructure. AI can analyse vast streams of sensor data and make adjustments in real time, achieving a level of control and optimisation that human operators or traditional automation systems cannot match. Examples include:
- In the pharmaceutical industry, companies are using AI to make many micro-adjustments to their production processes. Consequently, they are able to reduce batch failures and better optimise resource use. Some are moving from fixed schedule cleaning cycles to smarter approaches where algorithms analyse production schedules and manage cleaning processes accordingly.
- In the energy sector, AI is crucial for managing the modern ‘smart grid’. It helps balance the fluctuating supply of renewable sources like wind and solar with real-time demand. AI is used to predict energy consumption patterns, allowing utility companies to prevent blackouts and reduce reliance on carbon-intensive plants.
- In logistics and transportation, companies have used algorithms to optimise delivery routes for a long time. Today, machine learning models can draw on data including past delivery times, traffic conditions, driver behaviour and weather to work out the best routes and adjust them in real time (a simplified sketch of the idea follows this list).
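To make the routing example concrete, here is a minimal, hedged sketch of the underlying idea: estimate segment travel times from historical deliveries under given conditions, then compare candidate routes using those estimates. The data, segment names and fallback value are invented for illustration; this is not any vendor's actual system.

```python
# Illustrative sketch of data-driven route selection.
from collections import defaultdict
from statistics import mean

# Hypothetical historical records: (segment, hour_of_day, weather, minutes_taken)
history = [
    ("A-B", 8, "rain", 22), ("A-B", 8, "dry", 15), ("B-C", 8, "rain", 30),
    ("A-D", 8, "rain", 18), ("D-C", 8, "rain", 20), ("B-C", 8, "dry", 19),
]

# Learn a simple lookup of average travel time per (segment, hour, weather).
observations = defaultdict(list)
for segment, hour, weather, minutes in history:
    observations[(segment, hour, weather)].append(minutes)
model = {key: mean(vals) for key, vals in observations.items()}

def predict(route, hour, weather):
    """Estimated total minutes for a route under current conditions."""
    return sum(model.get((seg, hour, weather), 25) for seg in route)  # 25 = fallback guess

# Compare two candidate routes for an 8am delivery in the rain and pick the quicker one.
candidates = {"via B": ["A-B", "B-C"], "via D": ["A-D", "D-C"]}
best = min(candidates, key=lambda name: predict(candidates[name], 8, "rain"))
print(best, {name: predict(route, 8, "rain") for name, route in candidates.items()})
```

In practice the prediction model would be far richer and would be re-run continuously as traffic and weather data arrive, but the principle of choosing routes from learned travel times is the same.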
Indirect applications in business and digital processes
While harder to quantify in terms of direct energy savings, the indirect environmental benefits of AI are significant. By optimising some workflows and automating others, AI contributes to a more resource-efficient economy.
Process intelligence (PI) has become one of the biggest levers for office based operational efficiency in recent years. Powered by AI, PI tools gather data from software applications and employee actions on screens and keyboards in order to build virtual models of business processes. AI is used to identify patterns and join the dots to build a complete picture of how work gets done in the organisation. The insights allow companies to identify hidden process bottlenecks and other points of friction that hamper employee productivity, and to address those issues. Today, operations leaders can even ‘talk’ to their processes through generative AI; the PI data is fed into large or small language models which can then answer questions about the issues and how to overcome them.
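A minimal sketch of the core idea, under simplifying assumptions, is shown below: reconstruct how work flows from timestamped event data and surface the slowest hand-offs. Real PI tools capture this data automatically from applications and at far greater scale; the event log here is invented.

```python
# Toy process mining: find the slowest transitions in an event log.
from collections import defaultdict
from datetime import datetime
from statistics import mean

# Hypothetical event log: (case_id, activity, timestamp)
events = [
    ("case1", "received",  "2024-05-01 09:00"), ("case1", "validated", "2024-05-01 09:40"),
    ("case1", "approved",  "2024-05-01 12:10"),
    ("case2", "received",  "2024-05-01 10:00"), ("case2", "validated", "2024-05-01 10:25"),
    ("case2", "approved",  "2024-05-01 14:05"),
]

# Group events per case in time order, then measure how long each hand-off takes.
cases = defaultdict(list)
for case_id, activity, ts in events:
    cases[case_id].append((datetime.fromisoformat(ts), activity))

durations = defaultdict(list)
for steps in cases.values():
    steps.sort()
    for (t0, a0), (t1, a1) in zip(steps, steps[1:]):
        durations[(a0, a1)].append((t1 - t0).total_seconds() / 60)

# Rank transitions by average duration to reveal likely bottlenecks.
for (a0, a1), mins in sorted(durations.items(), key=lambda kv: -mean(kv[1])):
    print(f"{a0} -> {a1}: {mean(mins):.0f} min on average")
```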
A good example is that of a contact centre provider that uses PI to boost productivity. The work is ongoing, but when the company first deployed the technology it was able to improve its processes almost immediately, increasing productivity by 18% and reducing agent idle time by 8%. These were only the early outcomes; with the intelligence employed as part of a continuous improvement cycle, the business is likely to have gained much more since then.
While the primary goal was operational efficiency, the secondary benefit was reduced energy consumption from workstations left switched on but unused. Scaled across thousands of employees, these incremental savings become significant. The same principle applies across industries, where AI can identify bottlenecks and idle systems in supply chains, financial processes, or software development, leading to faster, smoother operations that consume fewer resources over time.
Another dimension to this is AI augmenting humans; perhaps an LLM can get someone started on framing an approach to a project, or it could accelerate the production of marketing material. Some estimates suggest that there are over one billion information or knowledge workers in the world — people who use information or knowledge to deliver services to their employers. If, thanks to AI augmentation, they each save 10 minutes of work every day, finish on time and switch off their machines, then that is 10 billion minutes of PC time saved across the globe, which could add up to substantial energy savings. For this theory to work, IT departments would have to update their always-on PC practices.
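As a back-of-the-envelope illustration of that claim, the short calculation below converts the saved minutes into energy. The worker count comes from the estimate quoted above; the average PC power draw is an illustrative assumption, not a measured figure.

```python
# Rough arithmetic only; the power draw is an assumed, illustrative value.
knowledge_workers = 1_000_000_000   # ~1 billion, per the estimate quoted above
minutes_saved_each = 10             # per worker, per day
avg_pc_power_watts = 50             # assumed average PC draw while switched on

total_minutes = knowledge_workers * minutes_saved_each          # 10 billion minutes
hours = total_minutes / 60                                      # ~167 million hours
energy_kwh = hours * avg_pc_power_watts / 1000                  # watt-hours -> kWh
print(f"{total_minutes:.1e} minutes ≈ {energy_kwh / 1e6:.1f} GWh per day")
```

Under those assumptions the saving is of the order of several gigawatt-hours per day, which is why the always-on PC practices mentioned above matter so much to the outcome.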
Dematerialisation is another indirect way that AI can help by accelerating the shift from physical to digital. For example, AI-powered design software reduces the need for physical prototypes. Instead of building multiple physical prototypes from materials like wood, metal or plastic to test a new car part or building design, designers can create and test countless iterations in a virtual environment and rotate the object around to view it from all angles.
Each of these shifts carries a significant environmental benefit by reducing the energy consumption and materials required for manufacturing and transportation.
The cost: AI's sobering energy and resource footprint
Despite its potential, the computational infrastructure underpinning AI is one of the world's fastest growing consumers of energy.
Firstly, AI runs in data centres: vast, industrial-scale facilities filled with servers, storage systems and networking equipment. According to the International Energy Agency (IEA), global data centre electricity consumption was an estimated 415 terawatt-hours (TWh) in 2024. The IEA’s projections for 2030 are more than double that figure, ranging from 945 TWh to over 1,000 TWh. That is comparable to the entire annual electricity consumption of a small but industrially advanced country. AI is a significant driver of this explosive growth.
Secondly, the energy is not just for computation but also for the cooling systems needed to prevent the densely packed servers from overheating. Many data centres use evaporative cooling towers that consume enormous quantities of fresh water. Some estimates suggest that a single large data centre can consume as much water per day as a small city. If the figures are accurate, that would mean a significant strain on local water resources, especially in arid regions.
Thirdly, training a large foundation model involves feeding it billions of data points on many specialised processors (GPUs or TPUs). The carbon footprint of training a single large model has been estimated to be equivalent to hundreds of transatlantic flights. However, the cumulative energy cost of inference — the process of using the trained model to answer queries or perform tasks — is arguably greater over the model's lifetime, though figures vary as to how much more energy a simple query to a sophisticated LLM consumes compared with a standard Google search. Some estimates put it at 10 times more, while others suggest just a fraction of that, thanks to LLM providers’ efforts to optimise energy consumption.
The environmental impact of AI also extends beyond electricity and water consumption to the production of specialised hardware such as the GPUs that are the workhorses of AI. Manufacturing them requires rare earth metals, substantial energy and highly purified water. The rapid pace of AI innovation is speeding up hardware refresh cycles too, resulting in more electronic waste.
Bridging the gap: the path to sustainable AI
Sustainable AI efforts focus on developing models that are less computationally expensive. Researchers are working on new model architectures, and smaller, highly optimised models trained for specific domains are gathering momentum as alternatives to monolithic LLMs. These ‘small language models’ (SLMs) can perform specialised tasks with a fraction of the energy footprint.
Other measures include capturing the heat generated by data centres to warm nearby homes, offices and greenhouses through district heating systems. Carbon-aware computing is another development which shifts non-urgent computational tasks to run when, or at locations where, renewable energy is abundant on the grid.
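The sketch below illustrates the carbon-aware idea at its simplest: defer a non-urgent batch job until the grid’s carbon intensity drops below a threshold. The threshold and the get_carbon_intensity() function are placeholders; a real deployment would query a grid operator’s or third-party carbon intensity feed and use far more sophisticated scheduling.

```python
# Minimal sketch of carbon-aware scheduling; values and data source are illustrative.
import time

CARBON_THRESHOLD = 150  # gCO2/kWh; an illustrative cut-off, not an industry standard

def get_carbon_intensity(region: str) -> float:
    """Placeholder: in practice this would query a grid carbon intensity service."""
    return 120.0  # pretend the grid is currently running largely on renewables

def run_when_green(job, region="GB", check_every_s=1800):
    # Poll the grid signal and hold the job back until a greener window arrives.
    while get_carbon_intensity(region) > CARBON_THRESHOLD:
        time.sleep(check_every_s)
    job()

run_when_green(lambda: print("running deferred batch job"))
```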
The imperative of comprehensive study and regulation
Although market forces and technological innovation are at work, they alone may not be enough to answer the question of AI’s net impact on the environment. Transparent reporting and smart regulation are essential to lead the industry. Macro-level studies are needed to show us how productivity gains weigh against AI’s full lifecycle costs, including hardware production, operational energy use and waste. This data is crucial for informing policy decisions around sustainable technological expansion.
Regulatory frameworks such as the EU AI Act include provisions requiring developers of general purpose AI systems to disclose their energy consumption and resource use. Such measures lead to accountability and create market pressure, allowing consumers and businesses to choose more sustainable AI providers.
While AI’s deployment in industry and infrastructure drives both significant efficiencies and growing energy needs, its net environmental impact remains unknown. Continuous breakthroughs in hardware, algorithmic efficiency and sustainable facility design are essential. However, just as critical is a transparent and global research effort to measure and understand the net effect of automation and digitalisation, ensuring that this powerful technology serves as a solution for our planet — not a new problem.