Martin Cooper MBCS slips on his winter coat and visits the Stellium data centre in Newcastle-upon-Tyne, a place famous for football, bridges and — it transpires — highly desirable levels of rain and cold.

There’s never been a better time to be a technology commentator. Right now, we have an excess of epic, epoch-defining, paradigm-shifting technologies to wrestle with. There’s AI, with its attendant ethical issues; cloud, with its constantly expanding list of technical possibilities; and quantum, with its ‘tell me that again, but slowly’ questions. These grand-scale enabling technologies offer huge scope for investigating what, why, when and — certainly in the case of quantum — how.

There is one class of enquiry you don’t see too often, though: the ‘where’ questions. And, for the first two technologies — AI and the cloud — the question of where they’re located is becoming critically important. Part of the answer is, of course, in data centres. Data centres provide the heart-like pumps that keep the cloud, the internet and AI alive, flourishing and growing at tremendous rates.

Where we build data centres is a complex issue. They can be as big as aircraft hangars, consume vast amounts of — ideally — sustainable electricity, need super-high-speed connections to the internet and generate waste heat that must be dissipated, hopefully without requiring the input of even more fresh energy. Finally, there’s another location requirement: security.

Regarding where data centres should be built, Stellium has a seemingly contrarian answer: not in Silicon Valley, Virginia’s Data Centre Alley, nor even the frigid Nordic regions. Instead, Stellium has made its home in Newcastle-upon-Tyne, on the site of a once-promised silicon chip foundry that never materialised.

The climate

There, not far from the sea, it has built the first phases of what is already one of the UK’s largest purpose built data centre campuses. The facility is designed for colocation, an approach to data centre building that sees organisations place their hardware and servers inside data centres owned and managed by a third party. The Stellium facility currently boasts four 1,066m² data halls that are gradually filling up.

The first reason Newcastle is an ideal home for a data centre is the weather, explains Jonathan Evans, Director of Total Data Centre Solutions. Newcastle isn’t famed for its balmy climate, which is a very good thing if you’re in the data centre business.

‘Last year — 2023 — in London, temperatures hit 32 degrees in June,’ he recalls. ‘At the same time, they didn’t go above 22 degrees in Newcastle.’ London’s record temperatures, Jonathan says, caused data centre operators based in the capital to experience major cooling issues. The Stellium facility, however, stayed well within design tolerances and, as such, required the input of very little extra energy for cooling. This is good for electricity usage and carbon outputs.

A question of power

Next on the list of answers to ‘why choose Newcastle?’ is a big one: electricity. The data centre has two connections to the National Grid’s 275kV network at Tynemouth. There are two rings because, if one fails, the other should still work. As we shall see, this duplication of hardware, called redundancy, is a constant theme of data centre design.
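
To see why that redundancy matters, here is a minimal back-of-the-envelope sketch. The 99.9% per-feed availability is an assumed, illustrative figure, not a Stellium or National Grid statistic, and it treats the two feeds as failing independently.

```python
# Illustrative only: the 99.9% per-feed availability is an assumed figure,
# not a published Stellium or National Grid statistic.
single_feed_availability = 0.999          # assumed uptime of one grid connection
p_single_down = 1 - single_feed_availability

# With two independent feeds, supply is lost only if both fail at once.
p_both_down = p_single_down ** 2
hours_per_year = 8760

print(f"Single feed downtime: {p_single_down * hours_per_year:.1f} hours/year")
print(f"Dual feed downtime:   {p_both_down * hours_per_year * 60:.1f} minutes/year")
# Roughly 8.8 hours a year versus about half a minute a year.
```

In reality the two feeds are not perfectly independent, but the principle of duplicating anything whose failure could take the site down runs through everything that follows.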

Returning to the high-power electrical links, each of these resilient, diverse links provides 40MW of electricity and is designed to scale up to 180MW.

‘We’ve never had an outage,’ explains Edward Bissell, Sales Director at Stellium Data Centres.

The availability of electrical supply is only half the story, though. ‘The supply is 100% renewable,’ says Edward. ‘We’re connected to the world’s largest offshore wind farm.’

Located 130km off the northeast coast of England, the Dogger Bank Wind Farm can power six million British homes, according to the CBRE North of England Data Centre Study. The wind farm also plays a considerable part in giving the Northeast a highly sustainable energy mix relative to the rest of the UK. Indeed, the Northeast has the lowest-carbon electricity in the UK, at 25g CO₂/kWh.

The UK has 517 data centres, and London accounts for 77.3% of this market. The capital, however, relies more heavily on gas for power generation, which pushes its electricity supply to a carbon intensity of 185g CO₂/kWh (National Grid figures).
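
To put those two intensity figures in perspective, here is a rough worked comparison. The constant 10MW IT load is an illustrative assumption; only the 25g and 185g CO₂/kWh figures come from the study quoted above.

```python
# Rough comparison of annual CO2 for the same IT load on the two grids.
# The 10MW constant load is an illustrative assumption.
it_load_mw = 10
hours_per_year = 8760
energy_kwh = it_load_mw * 1000 * hours_per_year       # 87,600,000 kWh/year

for region, grams_per_kwh in [("Northeast", 25), ("London", 185)]:
    tonnes_co2 = energy_kwh * grams_per_kwh / 1_000_000   # grams -> tonnes
    print(f"{region}: {tonnes_co2:,.0f} tonnes CO2 per year")
# Northeast: ~2,190 tonnes; London: ~16,206 tonnes for the same workload.
```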

Topping off the campus’s power system is a proposal to deploy a battery storage and solar system to further reduce the data centre’s carbon footprint. The company is teaming up with Total Data Centre Solutions (TDCS) for the project, which will deploy a 2.3MWh battery storage system and a rooftop solar photovoltaic array. Through the project, TDCS suggests that it may be able to reduce the data centre’s carbon footprint by around 125.9 tonnes of CO₂ per annum.

TDCS also suggests that the solar and battery project will reduce the data centre’s electricity consumption by around 380,000kWh.

Walking the halls

So, what’s it actually like inside a data centre? The first thing to know is that it’s highly secure. This means driving to it feels more like approaching a prison, albeit a luxury one. The facility has high, black twin fences and an undisclosed number of security cameras, monitored by an on-site team and a remote monitoring company. Approaching from the road involves stopping at a security hut and a set of gates, and navigating a chicane designed to stop ram raiders.

You’ll find bullet-resistant glass and steel-lined walls if you make it to the building. Finally, there’s an ID document check and an airport-style scanner booth into which you step. If your name’s not on the list, you’re not getting in, and once you are inside, there are no photographs.

Walking around inside the building makes you feel small. That’s because the building is huge, and the doors have all been modified to be 2,700mm high. It feels like you’re walking in the land of giants. Similarly, the lift is taller than usual and beefed up to carry 3,500kg — a standard office lift might be rated to around 1,000kg. Finally, there are no access ramps — everything is level.

Designing for efficiency

These physical design elements make it easier to move server racks around without tipping or angling them. They are part of the Open Compute Project (OCP) requirements.

OCP is an open source, open collaboration project focusing on all aspects of data centre design. It has its roots in Facebook and how, in 2009, the social media firm realised it needed to rethink its infrastructure accommodation to control costs and energy consumption.

A small team of Facebook engineers spent the next two years designing and building an energy efficient data centre from the ground up: software, servers, racks, power supplies and cooling.

This work became the acorn from which OCP grew. The project now focuses on finding efficiencies in and around data centres’ networking equipment, general-purpose and GPU servers, storage devices and appliances, and scalable rack designs.

Given the massive rise of AI, OCP Ready status is important because it certifies that a data centre can provide colocation services capable of supporting high performance computing (HPC), artificial intelligence and machine learning workloads.

Stepping into a hall

Each Stellium OCP Ready data centre hall is over 1,000 square metres, and inside each are lines of floor-to-ceiling cages. Inside these cages sits the colocation clients’ hardware, mounted in racks. The cages are locked and are generally only accessed by the hardware owner’s staff. The ceiling is lined with elegant bundles of power and networking cables, all bound together inside bright yellow, square-section fireproof conduit. Busbars (overhead power supplies) can deliver 32, 63 or 100 amps down to the humming computers.
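
As an aside, those current ratings translate into rack power budgets. The sketch below assumes typical UK voltages and a unity power factor rather than Stellium’s actual feed specifications, so treat the numbers as ballpark only.

```python
# Rough conversion of busbar current ratings into rack power budgets.
# The 230V single-phase and 400V three-phase figures and the unity power
# factor are assumed typical UK values, not Stellium specifications.
import math

single_phase_volts = 230
three_phase_volts = 400          # line-to-line
power_factor = 1.0

for amps in (32, 63, 100):
    single_kw = single_phase_volts * amps * power_factor / 1000
    three_kw = math.sqrt(3) * three_phase_volts * amps * power_factor / 1000
    print(f"{amps}A feed: ~{single_kw:.1f}kW single-phase, ~{three_kw:.1f}kW three-phase")

# 32A: ~7.4kW or ~22.2kW; 100A: ~23.0kW or ~69.3kW per feed - the sort of
# range needed as racks move from general-purpose servers to dense GPU kit.
```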

When you enter the room, it is momentarily black, with distantly blinking LEDs the only light source. Then, as the movement sensors detect you, the lights turn on, and you’re presented with a highly ordered collection of squares, rectangles, greys, blacks and whites. It’s all very angular, monochromatic, ordered and clean. The conduit’s yellow is the only real colour.

The way a data centre looks is, however, secondary. The way it sounds is dominant. Along with moving data about efficiently, the centre is a hymn to cooling. And this hymn drones, whooshes and hums to the point that earplugs are always on hand.

Along with being many shades of grey, data centres are also a love affair with symmetry and duplication. ‘Each hall has two lines of Computer Room Air Conditioning units, or CRAC coolers,’ explains Stellium’s Paul Richardson.

A line of ten coolers plus a spare sits on each side of the hall, outside in the support corridors. They push cold air into the room and under its hollow floor. Importantly, Paul says, there are no cables or obstructions under the floor — it is a clean, continuous, empty void. As a new cage is added inside the hall, floor tiles under the new racks can be removed and replaced with grilles.

Not all data centres place their CRAC units outside the halls, but here they are, for a specific reason. It’s not unheard of for large clients to lease a whole hall and, when they do, they sometimes bar the data centre owner’s staff from entering. Keeping the cooling infrastructure outside the hall means the CRAC units can be maintained while the client keeps their colocated kit private and accessible only to their own staff.

The CRAC units do a lot of work. The hall must be kept at a constant 24 degrees and no lower than 40% humidity. Controlling the humidity is essential because, as humidity drops, the risk of static electricity build-up increases. This means each CRAC unit has a chilled-water cooling coil to remove heat, a humidifier, a dehumidifier and an air filter.

The CRAC units pass the heat into a water ring. The ring encircles the data halls and eventually runs to giant chillers outside the building. Generally, the famously and predictably cool Newcastle breeze cools the warm water sufficiently for its return circuit. The chillers also contain electric fans and compressors for when more cooling is needed.
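
To get a feel for the physics behind that ‘giant car radiator’, here is a back-of-the-envelope sketch. The 1MW hall heat load and the temperature rises are assumptions for illustration, not Stellium design figures; only the physical constants are standard.

```python
# Back-of-the-envelope cooling sketch. The 1MW heat load and the temperature
# rises are illustrative assumptions; only the physical constants are fixed.
heat_load_w = 1_000_000        # assumed IT heat load for one hall (1MW)

# Air side: CRAC units push cold air under the floor and up through grilles.
air_density = 1.2              # kg/m^3, approx. at room conditions
air_cp = 1005                  # J/(kg*K), specific heat of air
air_delta_t = 12               # K, assumed rise from supply to return air
air_flow_m3_s = heat_load_w / (air_density * air_cp * air_delta_t)

# Water side: the CRAC coils dump the heat into the chilled-water ring.
water_cp = 4186                # J/(kg*K); 1 kg of water is about 1 litre
water_delta_t = 6              # K, assumed rise across the ring
water_flow_l_s = heat_load_w / (water_cp * water_delta_t)

print(f"Air flow needed:   ~{air_flow_m3_s:.0f} m^3/s")
print(f"Water flow needed: ~{water_flow_l_s:.0f} litres/s")
# Around 69 m^3/s of air, or roughly 40 litres of water per second, for a 1MW hall.
```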

‘It’s basically like a giant car radiator,’ says Paul, downplaying the scale and unbelievable neatness of the whole affair.

Along with loving grey and symmetry, data centre designers are also eternally pessimistic. Every failure imaginable is carefully catered for. Along with a spare pair of CRAC units and two power supplies, each hall has a battery-based Uninterruptible Power Supply (UPS) system.

‘Each hall has two UPSs. Each UPS has 512 lead acid batteries,’ Paul says.

In the event of a power outage, the batteries are, Paul explains, really just a buffer. They keep the data hall running while the data centre’s diesel generators start up and balance themselves. Each generator has a massive tank of biodiesel ready to be used. And, again, each generator and tank has a backup.

‘Each tank holds 40,000 litres of biodiesel and can last for two days,’ he says. This biodiesel is cleaned regularly.
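
Those quoted figures imply a rough burn rate and, with some loudly assumed numbers for fuel energy content and generator efficiency, a plausible electrical output. None of the assumed values below are Stellium specifications.

```python
# Rough sanity check on the generator figures quoted above.
# Fuel energy content and efficiency are assumed typical values, not Stellium specs.
tank_litres = 40_000
runtime_hours = 48                       # 'two days'

burn_rate_l_per_hour = tank_litres / runtime_hours       # ~833 litres/hour

biodiesel_kwh_per_litre = 9.2            # assumed thermal energy content
generator_efficiency = 0.40              # assumed diesel-genset efficiency

electrical_output_kw = burn_rate_l_per_hour * biodiesel_kwh_per_litre * generator_efficiency
print(f"Burn rate: ~{burn_rate_l_per_hour:.0f} litres/hour")
print(f"Implied electrical output: ~{electrical_output_kw / 1000:.1f} MW per generator")
# Roughly 833 litres/hour, suggesting a genset in the region of 3MW - plausible
# for a facility of this scale, which is all this sketch is meant to show.
```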

Connectivity

With electricity, redundancy and security covered, one more data centre essential must be discussed: a fast connection to the internet. And, if data centres create the internet’s heartbeat, subsea cables are the web’s arteries — the tubes and tunnels through which data moves at incredible speed. In the main, these data cables are, of course, fibre optic.

Unsurprisingly for an island nation, subsea cables come ashore all around the UK’s coast. The largest concentration of cables lands in the South West, at places like Brean, Highbridge and Bude, and right at the tip of Land’s End at Porthcurno and Whitesands Bay. In broad terms, most cables land in the South West because London, the home of most UK data centre capacity, is comparatively close.

The website www.submarinecablemap.com beautifully illustrates the location and landing places of deep sea cables, showing that the places we’ve mentioned connect the UK to the US, Europe and Africa.

However, the south and west of England don’t have a monopoly on intercontinental data connectivity. According to CBRE, ‘more subsea cables offer direct connectivity to the North of England. For example, the North Atlantic Loop cable system installed by Aqua Comms consists of two trans-Atlantic cables, AEC-1 and AEC-2, which connect from Newcastle — using the Stellium Data Centre Campus — to Northern Europe and throughout the UK.’

Other critical cables include:

  • North Sea Connect: this directly links Stellium Newcastle to Blaabjerg in Denmark
  • NO-UK: this cable connects the Stellium Data Centre to Norway
  • CeltixConnect 1: this provides a secure crossing of the Irish Sea from Dublin to Anglesey. CeltixConnect 2 runs between Dublin and Blackpool and has two landing points on the Isle of Man.

All this means, in essence, that Newcastle connects both east and west: east to Europe, and west to New York and out into North America. This centrality, and the fact that different communication networks effectively meet in Newcastle, saw Stellium launch a new internet exchange point (IXP) in 2021.

Based in the Stellium campus, NXL-IX (Newcastle Internet Exchange) is a junction point where the internet’s most extensive networks meet (see boxout).

Uniquely, these cables terminate at a landing station within the highly secure data centre compound. It’s more common for cables to land on remote beaches before continuing an onward journey to a data centre, often in trenches next to railway tracks.

If your business stores, processes and moves vast amounts of data, being close to an IXP is essential. Indeed, in the north of England — around the Manchester IXP — there’s a concentration of cloud service providers, media firms (BBC, ITV and Sky), public sector organisations and universities.

Beyond its wind and rain, there is a great deal to recommend Newcastle, particularly if you’re a cloud, AI or HPC company. The city has abundant green energy and access to fast internet infrastructure. The north of England also benefits from relatively low land costs. These, CBRE suggests, are advantages that London — the traditional home of data centres — is finding hard to match. Space in the capital is in short supply, and spare megawatts of electricity aren’t easy to come by. So, if you’re a hyperscaler — one of the internet’s most significant users — you might need to buy a raincoat.

What is an Internet Exchange?

An Internet Exchange (IX) is a crucial component of the global internet infrastructure that enables the exchange of internet traffic between different networks, such as Internet Service Providers (ISPs) and content delivery networks (CDNs). The primary purpose of an Internet Exchange is to enhance the efficiency and speed of data transmission by enabling direct interconnection between networks, thus reducing the need for data to travel across multiple intermediary networks. This is achieved through the physical interconnection of network infrastructure at a central location, known as a Point of Presence (PoP). Internet Exchanges are vital in minimising latency, increasing bandwidth capacity, and reducing operational costs for participating networks.
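
To make the latency argument concrete, here is an illustrative estimate of fibre propagation delay. The distances are rough great-circle approximations and the ‘via London’ path is a simplification, not a real route.

```python
# Illustrative propagation-delay estimate. Distances are rough approximations
# and the 'via London' path is a simplification, not an actual route.
speed_of_light_fibre_km_s = 200_000      # light travels at roughly 2/3 of c in fibre

routes_km = {
    "Newcastle -> Amsterdam, direct peering at a local IX": 600,
    "Newcastle -> Amsterdam, hauled via London first": 400 + 360,
}

for route, distance_km in routes_km.items():
    round_trip_ms = 2 * distance_km / speed_of_light_fibre_km_s * 1000
    print(f"{route}: ~{round_trip_ms:.1f} ms round trip")
# Direct: ~6ms; via London: ~7.6ms, before adding the queuing and handoff
# delays of every extra intermediary network on the longer path.
```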

In the United Kingdom, major Internet Exchanges include LINX (London Internet Exchange), one of the largest and oldest exchanges globally, and IXManchester, serving the northern part of the country. These exchanges contribute significantly to the robustness and efficiency of the UK's internet infrastructure.