Alex Bardell, Chair of the BCS Green IT Specialist Group, considers whether it’s time to rethink our approach to AI at a time of rising energy demands, increasing environmental concerns and growing interest in task-specific solutions.

The rise of large language models (LLMs) has been nothing short of revolutionary. These monolithic systems are designed to answer virtually any question, from legal precedent to pasta recipes. But when it comes to deploying them to solve a wide range of business problems, are we using a sledgehammer to crack a nut? As their complexity grows, so does their generality, raising the question of whether it's time to pivot toward more focused, efficient alternatives.

The case for small language models

Small language models (SLMs) offer a compelling alternative. Tailored to specific business needs, they require fewer data points, are easier to train and demand significantly less computing power. Their narrow focus means they excel at delivering relevant responses within a defined domain. A lawyer doesn’t need culinary advice, and a chef doesn’t need legal precedent. SLMs understand this distinction. However, their strength is also their limitation. SLMs are not fountains of universal knowledge; they can’t answer everything. But perhaps they don’t need to.

SLMs’ ability to focus solely on the task at hand potentially makes them a more frugal and targeted solution. Gartner predicts that by 2027, organisations will use small, task-specific AI models at least three times more than general-purpose LLMs. Sumit Agarwal, VP Analyst at Gartner, explains that ‘these smaller, task-specific models provide quicker responses and use less computational power, reducing operational and maintenance costs.’
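To illustrate the point, here is a minimal Python sketch of calling a small, task-specific model rather than a general-purpose LLM. It assumes the Hugging Face transformers library (with a PyTorch backend) is installed, and the small DistilBERT sentiment classifier shown is simply a convenient stand-in for a model fine-tuned on an organisation’s own domain, such as contract clauses or support tickets.

# Minimal sketch: a small, task-specific model in place of a general-purpose LLM.
# Assumes the Hugging Face `transformers` library and a PyTorch backend are installed.
from transformers import pipeline

# A ~66M-parameter DistilBERT classifier stands in for a domain-tuned SLM,
# e.g. one fine-tuned on a firm's own contract clauses or support tickets.
classifier = pipeline(
    "text-classification",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

# The model answers only the narrow question it was trained for, quickly and cheaply.
print(classifier("The supplier met every service-level target this quarter."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]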

Data centres and the sustainability dilemma

The environmental impact of AI is becoming harder to ignore. OpenAI’s commitment to investing billions in data centre capacity underscores the scale of the challenge. As we look for more environmentally friendly ways to run our buildings, transport and manufacturing to combat climate change, AI’s growing appetite for energy threatens to outpace our ability to supply it sustainably.

Increased data centre construction has already led to rising energy prices in parts of the USA. Water usage is another concern: some AI data centres consume vast quantities of it, prompting resistance from local authorities.

To address these challenges, we must rethink where and how we build data centres. Locating them near renewable energy sources could ease grid bottlenecks. Cooling systems using seawater or heat exchangers offer promising alternatives. The Netherlands offers one model: waste heat from data centres is repurposed to warm greenhouses and swimming pools.

Rethinking AI hardware

AI development has largely piggybacked on GPU architecture — technology originally designed for graphics, not intelligence. As demand grows, we must ask whether it’s time to develop AI-specific computing architectures. There are pros and cons to both approaches, and purpose-built silicon may prove less flexible than general-purpose GPUs, but dedicated AI-specific systems could offer better performance and efficiency as workloads scale.

Defining efficiency in AI

Establishing a baseline for AI efficiency is no small feat. In traditional software development, non-functional requirements (NFRs) help measure performance under various conditions. Could a similar framework be applied to AI?

If we adopt a DevOps-like process — consistent test harnesses, defined scenarios — we could begin to measure and compare resource usage across models. The challenge lies in the diversity of platforms and frameworks: comparing models built on different foundations is more like comparing apples to pears than apples to apples. There is continued debate, and much work still to be done, in balancing AI’s promise and the drive to innovate against vitally important climate considerations.
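As a rough illustration of what such a harness might look like, the Python sketch below runs each model through the same defined scenarios and reports comparable latency figures. The model callables and prompts are hypothetical stand-ins; in practice each callable would wrap a real SLM or LLM endpoint, and energy metering (for instance via hardware power counters or a tool such as CodeCarbon) would slot into the same loop.

# Minimal sketch of a consistent test harness for comparing models on fixed scenarios.
# Only wall-clock latency is recorded here; energy or memory metering would be added
# in the same loop. The model callables below are hypothetical stand-ins.
import time
from statistics import mean

SCENARIOS = [
    "Summarise this contract clause in one sentence.",
    "Classify this support ticket as billing, technical or other.",
    "Extract the renewal date from this agreement.",
]

def small_model(prompt: str) -> str:   # placeholder for a task-specific SLM
    return "slm answer"

def large_model(prompt: str) -> str:   # placeholder for a general-purpose LLM
    return "llm answer"

def benchmark(models: dict, scenarios: list, runs: int = 3) -> None:
    """Run every model against every scenario and report mean latency."""
    for name, model in models.items():
        timings = []
        for prompt in scenarios:
            for _ in range(runs):
                start = time.perf_counter()
                model(prompt)
                timings.append(time.perf_counter() - start)
        print(f"{name}: mean latency {mean(timings) * 1000:.3f} ms over {len(timings)} calls")

benchmark({"small-model": small_model, "large-model": large_model}, SCENARIOS)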

AI as a force for sustainability

Despite its resource demands, AI can be a powerful ally in sustainability. The BCS Green IT Specialist Group has long championed this duality. Case studies show promise: the Met Office uses AI for smart energy management and climate forecasting, while farmers leverage AI and IoT to manage soil and water usage more effectively.

Find out more about the BCS Green IT Specialist Group on their website.
