Technology, and increasingly AI, consumes ever-greater amounts of energy. F-TAG explores the ethical and technical challenges in building sustainable AI systems.

‘Incentives need to be in place for everyone, from big tech firms building data centres, to SMEs consuming a lot of compute resources.’

‘It isn’t usually cost-effective for a business to optimise code performance unless they actually require it to run quickly and use fewer resources.’

We have all heard about how artificial intelligence (AI) and machine learning (ML) have been a huge growth area in the last decade. As more and more data becomes available, and more and more compute is required to process it, the environmental impact of AI and ML will only increase unless something changes.

Data centres power the internet and our modern technical environment. They are currently estimated to use about 1% of the world’s energy production[1], which may not seem like a lot, but that number is only going to grow as the world becomes more and more technologically enabled.

Using clean power

To put this in carbon terms: training a natural-language processing model can take a data scientist days or weeks across an array of specialist hardware. One key piece of research[2] shows that training such a model can have a similar carbon output to a transatlantic flight.

As many such activities are conducted in the cloud, a key factor is the degree to which providers such as Microsoft, Amazon and Google use clean or renewable energy to actually power their data centres. The same research shows that Amazon and Microsoft use a smaller proportion of renewable energy than some nation states, for example Germany.

This is just the power consumption. It is also necessary to consider how the water used to cool data centres is disposed of, and the impact upon the natural environment of building the data centres in the first place.

Drive towards sustainable AI

One aspect of the power consumption of AI is the hardware chosen to run it. Machine learning, for example, requires much lower-precision arithmetic, and fewer of the features of a traditional computer processor.

More cost-efficient GPUs[3], which are designed to render video, use less power and can actually be more efficient for ML. Some companies run AI on application-specific integrated circuits (ASICs): processors designed specifically for an AI task.
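The precision point above can be illustrated with a small sketch. This is not how any particular accelerator works internally; it simply shows, using Python’s standard `struct` module and some made-up weight values, that storing the same numbers at 16-bit rather than 32-bit precision halves the memory footprint at the cost of some accuracy:

```python
import struct

# Hypothetical model weights; the values are illustrative only.
weights = [0.12345678, -1.87654321, 3.14159265, -0.00012345]

# Pack the same values at 32-bit ("f") and 16-bit ("e") precision.
fp32 = struct.pack(f"{len(weights)}f", *weights)
fp16 = struct.pack(f"{len(weights)}e", *weights)

print(len(fp32))  # 16 bytes
print(len(fp16))  # 8 bytes: half the memory to move and store

# Precision lost by quantising to 16 bits
recovered = struct.unpack(f"{len(weights)}e", fp16)
for w, r in zip(weights, recovered):
    print(f"{w:+.8f} -> {r:+.8f}")
```

Lower-precision arithmetic is often an acceptable trade in ML because model accuracy tends to be tolerant of small numerical errors, while the energy cost of moving and multiplying numbers scales with their width.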

So how can technology practitioners make their systems more sustainable? The UK Engineering Council’s Guidance on Sustainability provides many suggestions. Beyond those, practitioners can consider the following:

  • Companies building data centres can consider how to make them carbon-free. For example, one data centre was built next to a dam to consume hydroelectric power at source.
  • Data scientists and engineers may be tempted to retrain models very frequently to optimise performance - but is this really necessary? It might be better to focus on monitoring performance and only triggering retraining when certain thresholds are breached. On the other hand, the end-to-end sustainability benefits of frequent retraining may outweigh the cost. If regularly optimising a model reduces fuel burn for aircraft flight planning, that may be worth the carbon emissions!
  • Infrastructure and Operations teams can monitor the resource usage of systems and constantly optimise it. As cloud offerings are continually enhanced, there are often new and more efficient ways of delivering the same capabilities with fewer resources.
  • Transfer learning, an ML technique where an existing trained model is taken and modified slightly to meet a new use case without full retraining, might significantly reduce the carbon emissions of implementing systems with a high training overhead - for instance, natural-language processing.
  • Edge computing, where ML models run on Internet of Things devices or smartphones, can reduce the overhead of those models by avoiding transferring data over the internet to a data centre.
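The threshold-triggered retraining idea can be sketched in a few lines. Everything here is hypothetical - the threshold value, the accuracy figures, and the function name - but it shows the pattern: retrain only when measured performance degrades, not on a fixed schedule.

```python
# Illustrative sketch: retrain only when monitored accuracy falls below a
# threshold, rather than retraining every day regardless.
ACCURACY_THRESHOLD = 0.90  # hypothetical acceptance level

def should_retrain(recent_accuracy: float,
                   threshold: float = ACCURACY_THRESHOLD) -> bool:
    """Trigger retraining only when measured performance has degraded."""
    return recent_accuracy < threshold

# Hypothetical daily accuracy measurements from production monitoring
daily_accuracy = [0.95, 0.94, 0.93, 0.88, 0.96]
decisions = [should_retrain(a) for a in daily_accuracy]
print(decisions)  # retraining is triggered only on the day accuracy dipped
```

Under this policy, four of the five daily retraining runs - and their associated energy cost - are avoided, at the price of building and trusting the monitoring itself.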
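A toy sketch of the transfer-learning bullet, with all functions and data invented for illustration: an expensive "pretrained" model is reused unchanged as a feature extractor, and only one small new parameter is fitted for the new use case, so the bulk of the original training energy is never spent again.

```python
def pretrained_features(x: float) -> float:
    """Stands in for an expensive frozen model; it is never retrained here."""
    return 2.0 * x + 1.0

# Small, hypothetical task-specific dataset for the new use case
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.5, 4.5, 7.5, 10.5]

# Fit a single new "head" weight w by closed-form least squares,
# reusing the frozen features rather than retraining the whole model.
feats = [pretrained_features(x) for x in xs]
w = sum(f * y for f, y in zip(feats, ys)) / sum(f * f for f in feats)
print(round(w, 3))  # only this one parameter was "trained"
```

In a real system the frozen part would be a large pretrained network and the head a small trainable layer, but the sustainability logic is the same: the carbon-intensive training is done once and amortised across many downstream uses.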
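The edge-computing saving is easy to see with a back-of-envelope calculation. The figures below are entirely hypothetical, but they show the shape of the argument: running inference on the device means sending a small result over the network instead of the raw data.

```python
# Hypothetical figures for a sensor that classifies one sample per second.
RAW_SAMPLE_BYTES = 64_000   # e.g. one second of raw sensor audio
RESULT_BYTES = 16           # e.g. a class label plus a confidence score
SAMPLES_PER_DAY = 86_400    # one sample per second, all day

cloud_bytes = RAW_SAMPLE_BYTES * SAMPLES_PER_DAY  # ship raw data to a data centre
edge_bytes = RESULT_BYTES * SAMPLES_PER_DAY       # ship only the inference result

print(f"cloud: {cloud_bytes / 1e9:.1f} GB/day")
print(f"edge:  {edge_bytes / 1e6:.2f} MB/day")
```

The network transfer, and the data-centre processing behind it, shrinks by three orders of magnitude in this sketch - though, as the article notes later, the on-device model may be less accurate than its data-centre counterpart.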

So, in summary: technology uses a lot of energy, and an increasing share of it is used by AI. There are definitely steps that everyone involved can take to minimise the environmental impact. However, these require trade-offs and difficult decisions.

It isn’t usually cost-effective for a business to optimise code performance unless they actually require it to run quickly and use fewer resources. Using transfer learning and edge computing doesn’t necessarily result in a better system either - in fact, the results may be less accurate. It is far easier to simply retrain your model every day than to put thought and effort into determining whether you should.

Incentives need to be in place for everyone, from big tech firms building data centres, to SMEs consuming a lot of compute resources.

Adam Leon Smith FBCS CITP is Dragonfly’s Chief Technology Officer.

References

  1. Masanet, E., Shehabi, A., Lei, N., Smith, S., Koomey, J., 2020. Recalibrating global data center energy-use estimates.
  2. Strubell, E., Ganesh, A., McCallum, A., 2019. Energy and Policy Considerations for Deep Learning in NLP. https://arxiv.org/pdf/1906.02243v1.pdf
  3. Graphics Processing Units