Martin Cooper MBCS RITTech talks to leading telephony experts and explores what a fifth-generation mobile network will mean for consumers, businesses and for robots.

There’s a revolution coming. It’ll happen slowly - possibly over decades. But, if its instigators get their way, this new technology will be the backbone that enables advances such as smart cities, driverless cars, remote controlled operating theatres, automated farms and more besides.

It’ll be the glue that’ll make the internet of things a large-scale reality and it’ll become the spine of blockchain-based asset management systems. This technology is, of course, 5G or the fifth-generation mobile network.

Just how big will 5G be? ‘It’s the biggest thing in the communication technology scene right now,’ says Ian Keene, Research VP at Gartner.

‘I think 5G will be revolutionary,’ says Professor Andrew Nix - Head of the Communications Systems and Networks Research Group at the University of Bristol. ‘Certainly, in the circles I work in it’s been referred to as another industrial revolution, after the internet.’

5G isn’t, however, here yet - as a technology it is expected to be launched commercially in 2020. So far, the public has only seen glimpses of 5G. A trial of the technology was completed successfully during the 2018 Winter Olympics in South Korea. Closer to home, UK cities including Bristol and London are the sites of trials too.

The fact that 5G is still incubating hasn’t stopped hype from building around the technology. And it’s for this reason that ITNOW set about investigating 5G thoroughly. What is it, how does it work and what will it mean for businesses and consumers?

The core 5G promise

In some regards 5G sets about addressing the perceived inadequacies of 4G - our current mobile phone system. On paper, 4G should be brilliant: a theoretical maximum download speed of 75Mbit/s should be more than enough for most content delivery and industrial applications.

Only, as we all know, 4G isn’t always perfect. Jump on a train, turn the wrong corner or stand under the shadow of the wrong tree and you’ll be left with a broken connection. That fanciful figure of 75Mbit/s is very much a best-case scenario. All that said, when you’ve got access to a clear and strong 4G signal, the technology makes the mobile internet a reality for millions of users.

But what about 5G? What is its unique selling point? At its most reduced, 5G has, according to Howard Jones, Head of Communications at EE, one key promise: ‘The perception of infinite capacity.’

When the technology has matured, users should have access to huge volumes of data, delivered at great speed. This network should be immune to the blackspots and dropouts that bedevil 4G.

Critically, a 5G network should also operate with very low latency - a very short ping time. Users and devices shouldn’t have to wait long for that wide bandwidth and fast data transfer. This all adds up to that promised perception of infinite - and immediate - capacity.

It’s about machines and not video

As you read on we’ll delve into how 5G, as a technology, goes about achieving these core promises. Before we go any further though, it’s worth addressing a key misconception. Unlike 3G and 4G, 5G isn’t about delivering crisper high-definition video to your mobile phone. It’s about much more.

For starters, consider 5G’s potential bandwidth which can, in the right circumstances, run into hundreds of gigabits per second. That’s far more than is necessary to deliver high-definition video to a mobile phone.
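
To put that in perspective, here is a back-of-envelope sketch. Both figures are illustrative assumptions - a typical 4K streaming bitrate and a notional aggregate carrier capacity - rather than measured network values:

```python
# Back-of-envelope: how many 4K streams would fit in a 100 Gbit/s 5G carrier?
# Both figures are illustrative assumptions, not measured network values.
carrier_gbps = 100      # assumed aggregate 5G capacity, for illustration only
stream_mbps = 25        # assumed bitrate of a single 4K video stream

streams = (carrier_gbps * 1000) / stream_mbps
print(f"Roughly {streams:.0f} simultaneous 4K streams")   # -> roughly 4000
```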

Indeed, you have to ask yourself: is 4K video on a mobile phone really that necessary when you have such a small screen? Can you really appreciate the clarity? Would you, as a consumer, pay your mobile provider for a data pipe so big you’ll never really tax it? The answer is likely to be ‘no’ - according to Rick Chandler, chair of the Communications Management Association (a BCS specialist group).

When evaluating 5G’s potential to create change and disruption you also need to consider the adequacy of today’s best 4G phones. ‘I think most people would find it hard to argue that their £1,000 4G phone isn’t good enough for an awful lot of tasks,’ says Professor Nix.

Discounting coverage gripes, 4G is largely good enough for most of today’s common tasks: social media, video, news, navigation and distracting your children in a restaurant.

Turn your eye to the internet of things, AI and robotics, and suddenly the use-case for 5G makes sense. Bandwidth, throughput, predictability and low latency all need to exist in abundance if we’re to make these technologies industrial-scale realities.

In a 5G world it would be possible to manage factories remotely in real time. 5G is also a key ingredient in making driverless cars a reality. Without a dependable data connection, autonomous vehicles would be downright dangerous: if one arrived in a radio blackspot, chaos could ensue.

Think about shipping and imagine a world where containers are tracked in real-time as they traverse the globe. It’s hard to overstate 5G’s potential.

Gadget geeks who are eyeing 5G as the next must-have may well be disappointed - in the short-term at least.

So, what is 5G?

The first thing to understand about 5G is it’s not a single new technology in the way, say, a new Intel or AMD processor might be. It’s not a single product. Rather it is a collection of new technologies all drawn together under the umbrella of 5G.

‘5G is often described as a network of networks, and that really is the case,’ says Prof Andrew Nix. ‘The core network is being developed to be software programmable, flexible and more importantly, [the network] connects to multiple radios. It is a heterogeneous network.’

This means, he explains, that your device - be that a phone or a tablet - won’t use just one radio. Currently 2G, 3G and 4G operate on different frequencies across a range from 800MHz right up to 2,600MHz; indeed, today all cellular and Wi-Fi services operate at frequencies below 6GHz. 5G, by comparison, is introducing new millimetre wave (mmWave) bands at 26GHz and 60GHz.

‘The millimetre bands are interesting because of the bandwidth they create,’ says Professor Nix. Potentially, he says, the millimetre range could provide more data carrying capacity than the sum of the entire existing range of mobile radio frequencies. It is, he says, a huge amount of additional radio bandwidth.

There’s a catch though. Radio signals in the millimetre spectrum of frequencies don’t travel well and they don’t travel far. In practice they may only be able to manage 100 to 200 metres from their base stations. This means, as we shall see, that the nature of 5G radio infrastructure will be different from 4G’s. For example, 5G network cells will be much smaller than today’s and we’ll need more masts. But, happily, these masts will be much smaller too.
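
To get a rough feel for what that means on the ground, here is a minimal sketch assuming purely illustrative coverage radii - real cell planning depends on terrain, transmit power and antenna design:

```python
import math

# Rough cell-count comparison for covering a 10 km x 10 km urban area.
# The radii are illustrative assumptions: ~1.5 km for a 4G macro cell,
# ~150 m for a millimetre-wave 5G small cell.
AREA_KM2 = 10 * 10

def cells_needed(radius_km):
    # Treat each cell as an ideal circle and ignore overlap - a lower bound.
    return math.ceil(AREA_KM2 / (math.pi * radius_km ** 2))

print("4G macro cells needed:", cells_needed(1.5))    # ~15
print("5G small cells needed:", cells_needed(0.15))   # ~1,415
```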

But, returning to his positive analysis, Professor Nix says: ‘The bottom line is, the millimetre waves are what give you the headline numbers for people who like to talk about big data rates. I guess 10 gigabits a second might be an aggregate capacity that could be achieved on a radio carrier. I wouldn’t suggest that 10 gigabits is what a single user would need; I think most projections are that individual users’ rates probably range from between a megabit per second and a hundred megabits per second. But, if you’ve got very high data rates you can share that radio capacity between large numbers of users.’

To put all these numbers into some kind of context, Professor Nix says: ‘On a 4G network, if you’re trying to get a hundred megabits per second you’ll take up the entire capacity of the south [of England]. With 5G it would be possible to have potentially hundreds of people all operating at a hundred megabits per second.’
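
Those figures also make for a quick bit of arithmetic. Using the numbers quoted above - which are projections rather than guarantees - a single high-capacity carrier could serve on the order of a hundred heavy users at once:

```python
# Sharing an aggregate 5G radio carrier among users, using the figures quoted above.
carrier_gbps = 10        # illustrative aggregate capacity of one radio carrier
per_user_mbps = 100      # upper end of the projected individual user rate

concurrent_users = (carrier_gbps * 1000) // per_user_mbps
print(f"Roughly {concurrent_users} users at {per_user_mbps} Mbit/s each")  # ~100
```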

In areas where there’s a very high density of people using mobile phones this, in reality, might not translate into a higher peak data transfer rate on a 5G network. What it may create is a situation where a higher number of people will be able to access more data at a higher aggregate speed.

Today, even on 4G networks, when huge numbers of people try to access data, capacity falls away. 5G should address this problem of supply shrinking just as demand peaks.

Certainly, the new millimetre wavelengths are a key component in achieving the impression of infinite bandwidth, but they’re not the only component. When they become available, 5G mobile phones will also utilise 4G, 3G and maybe even GSM to ensure coverage even in the most rural areas. ‘It is this intelligent combination of all those radios managed as a single [radio] to the user that I think 5G is particularly trying to address,’ says Professor Nix.

This is a little more clever than it might appear. Current generation mobile phones generally scan for and pick the best available signal, be that 4G, 3G, GSM or, if it is available, Wi-Fi. ‘5G will intelligently monitor all of those available radio pipes and, rather than switching between them, many of those pipes will be available simultaneously... Data will be combined through multiple radio links in order to give you a much more reliable and seamless service.’
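
As a toy illustration of the difference between ‘pick the best radio’ and 5G-style aggregation, consider the sketch below. The throughput figures are invented purely for illustration:

```python
# Toy comparison: selecting one radio versus combining several.
# Throughput figures (Mbit/s) are invented for illustration only.
available_links = {"4G": 40, "Wi-Fi": 60, "mmWave 5G": 400}

best_single = max(available_links.values())     # today's approach: pick one link
aggregated = sum(available_links.values())      # 5G-style: use them together

print(f"Best single link: {best_single} Mbit/s")   # 400
print(f"Aggregated links: {aggregated} Mbit/s")    # 500
# Aggregation also buys resilience: if one link drops, the others carry on.
```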

Achieving low latency

Along with the drive towards ultra-reliability - no loss of network, no breaks in service - 5G also targets low latency. Today’s 4G or long-term evolution networks have a response time of 15-100 milliseconds when things are going well. If things are going badly it can be up to a second.

‘When it works, 4G is impressive,’ says Professor Nix. ‘The response time of the brain is 200 milliseconds, so it’s quicker than I can think. But it’s not quick enough for industrial automation or robotics. If you’ve got a robot cutting out large bits of metal, you need a latency better than 15 milliseconds in order to control it. If I’m driving in a car that is relying on connectivity to support the autonomous features or to potentially inform the driver, I need better latency than we have today.’

5G, Professor Nix says, will improve latency by a factor of 10 - down to maybe five milliseconds. For specialist applications, one millisecond is a possibility. That transforms what can be done in cities, automation and manufacturing. ‘By delivering [ultralow latency] we will open up new services, new revenue streams and opportunities. 5G is being designed more to connect machines than it is to connect people because we are envisaging potentially trillions of connected devices.’
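
To put those latencies in context for the driving example, here is a quick worked calculation of how far a vehicle travels before the network even responds. The motorway speed is an assumption; the latencies are the figures discussed above:

```python
# Distance a vehicle covers while waiting for the network to respond.
# 70 mph is an assumed motorway speed; latencies are the figures discussed above.
speed_m_per_s = 70 * 1609.34 / 3600        # 70 mph ~= 31.3 m/s

latencies_s = {
    "4G on a good day (15 ms)": 0.015,
    "4G on a bad day (1 s)":    1.0,
    "5G target (5 ms)":         0.005,
    "5G specialist (1 ms)":     0.001,
}

for label, latency in latencies_s.items():
    print(f"{label}: {speed_m_per_s * latency:.2f} m travelled before any response")
```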

As a closing observation, he says: ‘The idea of having sensor networks and having items and machines talking is a key part of 5G and it cannot be done at scale on any current network that we have today. Speed gets all the headlines, but I think it’s the reliability and the latency that open up new services, and new services open up monetisation opportunities that encourage people to invest the tens of billions, if not hundreds of billions, in the network rollout.’

Engineering ubiquity

As we’ve seen, millimetre wavelengths are both a blessing and a curse. On one hand they offer the potential for hugely wide bandwidth but, on the downside, signals of this type degrade quickly and so don’t travel very far. As a family of technologies, 5G is alert to this problem and has a solution - a solution that should allow it to maintain speed, low latency and dependability.

The solution is called multiple input, multiple output or MIMO. MIMO is an antenna technology for wireless communications in which multiple antennas are used at both the source (transmitter) and the destination (receiver). The antennas at each end of the communications circuit are combined to minimise errors and optimise data speed. MIMO itself isn’t a new idea - among other things it’s supported by the 802.11n Wi-Fi standard.
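
To give a feel for why piling on antennas helps, here is an idealised sketch of how channel capacity scales with the number of antennas. The bandwidth and signal-to-noise figures are assumptions, and real channels fall short of this textbook ideal:

```python
import math

# Idealised MIMO capacity sketch: with rich scattering and the power split
# evenly across spatial streams, capacity grows roughly with the smaller of
# the transmit and receive antenna counts. Bandwidth and SNR are assumptions.
def mimo_capacity_gbps(n_tx, n_rx, bandwidth_mhz=100, snr_db=20):
    snr = 10 ** (snr_db / 10)
    streams = min(n_tx, n_rx)
    bits_per_hz = streams * math.log2(1 + snr / streams)   # equal power per stream
    return bandwidth_mhz * 1e6 * bits_per_hz / 1e9

print(f"Single antenna (1x1):    {mimo_capacity_gbps(1, 1):.2f} Gbit/s")
print(f"Conventional MIMO (4x4): {mimo_capacity_gbps(4, 4):.2f} Gbit/s")
print(f"Massive MIMO (64x8):     {mimo_capacity_gbps(64, 8):.2f} Gbit/s")
```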

‘So, with millimetre wave, the big thing to overcome is the high losses and this is done using multiple antenna elements to form beams. Narrow beams that track you,’ says Professor Nix. ‘And I think the beam-tracking technology is very impressive. We’re just starting to trial it now. We’re very lucky in Bristol, that we have government funding to do 5G trialling, so we have an experimental 5G network set up in the centre of the city. Beam tracking has the potential to transmit gigabit-plus data rates into devices and into cars. We have a project with BT and Jaguar Land Rover looking at bi-directional multi-gigabit connectivity on motorways and on the street into vehicles.’

MIMO, or massive MIMO as it’s often referred to within a 5G context, has the potential to be massively transformative. A few years ago, Bristol University managed to set a world record for the greatest amount of data sent over a finite radio bandwidth. ‘We basically managed something like 20 times more data in a given amount of bandwidth than could be done with the best 4G network.

‘We had a base station with 128 antennas in it. Now, the good news is they’re relatively small antennas; they’re quite compact. But I was recently in China and visited a university over there where I saw a 1,024-element array the size of a billboard, which blew my mind. And the Chinese government is looking to deploy these in the major cities in China. I think about densification and the challenges of large numbers of people living in cities, and I guess I shouldn’t be surprised that China has declared that it will be the global leader of 5G.’

Taking your slice of the network

Thus far, we’ve talked about how 5G will provide us all - machines included - with that sense of unlimited bandwidth. But 5G also acknowledges that, in some circumstances, a promise isn’t enough. Some industrial sectors need a guarantee of enough bandwidth to get the job done. This idea of a guaranteed level of service forms the basis of network slicing - another core 5G technology. The concept, from a telco’s perspective, is simple: providing a customer with a dedicated slice of the network’s capacity.

‘It’s very attractive from an operator’s perspective because it enables you to monetise your network capability in a very different way,’ says EE’s Howard Jones. ‘It’s very attractive to industries too because they’ve never previously had the ability to have a guaranteed quality of service on a cellular network - a network with all of the other benefits that come with cellular, around security, identity, presence and provisioning. You’re effectively changing the capability of cellular from an almost best-effort technology to something that’s got a guaranteed quality of service attached to it.’
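
Conceptually, a slice boils down to a set of service guarantees the operator commits to honouring end to end. The sketch below is purely illustrative - the field names and values are hypothetical, not drawn from any 3GPP specification or operator API:

```python
from dataclasses import dataclass

# Hypothetical sketch of what a network-slice service agreement might capture.
# Field names and values are illustrative only.
@dataclass
class NetworkSlice:
    name: str
    guaranteed_downlink_mbps: int
    guaranteed_uplink_mbps: int
    max_latency_ms: float
    availability_pct: float       # e.g. 99.999 for critical services
    isolated: bool                # traffic kept apart from the public network

surgical_robotics = NetworkSlice(
    name="hospital-robotics",
    guaranteed_downlink_mbps=200,
    guaranteed_uplink_mbps=200,
    max_latency_ms=5.0,
    availability_pct=99.999,
    isolated=True,
)
print(surgical_robotics)
```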

Who might use such a technology? Healthcare could be one sector, particularly given the rise of home-based care, monitoring and recovery. Remote medicine and observation systems find it hard to tolerate network outages, and healthcare also demands the very highest levels of security and privacy. Within the hospital, 5G opens up new possibilities too. Jones says: ‘Surgical robots in the theatre may be nothing new, but 5G would enable them to be operated wirelessly, with very low lag, dependably and - as there are no wires - sterility could be maximised.’

The Emergency Services Network or ESN also provides a glimpse of what network slicing could enable. It’s a communications system designed for the three emergency services and other public safety users. It is designed to replace a flagging system called Airwave. ESN is built on 4G but is as close to a private slice as current technology will allow. Critically it allows ESN to be prioritised above other network traffic.

Don’t forget the fibre

5G has the potential to herald profound change and disruption. But, BCS’s Rick Chandler sounds a cautionary note. ‘If 5G arrived in the hands of customers, businesses and engineers in one fell swoop’, he says, ‘it could well be a revolution on a par with the lightbulb. But, it won’t arrive like that. It’ll take time to mature and, worryingly, there may be factors working to slow this maturation’.

‘3G has only just passed peak,’ Chandler says. ‘4G is well on the way up at the moment, mainly held back by the poor underlying back-haul technology - the normal fibres in the ground that are needed to support it. I think that’s going to be holding back 5G as well. I think that that’s where the main investment needs to be made at the moment... It is in the underlying back-haul just to get the coverage.’

All that airborne data - carried along by massive MIMO and millimetre wave technologies - eventually needs to get back to a server. ‘It needs to be hosted on something at some point and that’s [achieved through] using the fibre infrastructure underneath,’ says Chandler. ‘Some operators’ mobile networks use wireless for the high-frequency back-haul into the data centres, but that’s getting into areas of wireless, bandwidth and complexity that you don’t need. Most of it is hosted on fibre networks.’

And these fibre networks, he says, aren’t entirely up to the job of supporting 5G - indeed, the current infrastructure can’t always support 4G. And the cause? Back in the early nineties there was a craze for burying coaxial TV delivery cables in the ground. If the government of the day had mandated fibre instead of coax, we’d be in a better place today.

The situation, he reports, is improved now though. ‘The UK has a large number of what they call alt-nets, alternative network providers doing fibre in bucket-loads through their private operations. BT... acknowledged it’s keeping its fibre installation team going and concentrating on getting the fibres down. That’s what we need. The more data that goes through the [air], the more the fibre network has to support it.’

Pokemon, edge and VR

The recent craze for Pokemon Go gave a tantalising glimpse of the future, even if you’re not a devotee of the fictional creatures.

For the uninitiated: the game was played on mobile phones and overlaid its creatures on the real world using GPS and augmented reality. Prized Pokemon were seeded across real-world locations and players had to travel to them to interact. It was compelling stuff.

‘It’s also an early look at edge computing in a mobile environment,’ enthuses EE’s Howard Jones. ‘Edge,’ he says, ‘is all about pushing compute power towards the consumer.’

‘Most core networks,’ he explains, ‘see the compute power within the network - the processing power is within the network.’ The network does all the thinking and all the heavy lifting: it manages the customer’s subscription layer and identity, serves up content and manages its distribution around the network.

‘I think what you’ll see with edge within 5G, in general, is pushing all of that compute power away from the centre and out towards the edge of the network and closer to the customer. Away from the core network and into the cell site itself. And that’s about putting computing power and processing power into the base stations at site, ultimately. That’s the ultimate dream.’

This concept of edge computing is enabled by hot-topic technologies such as software defined networks (SDN) and network function virtualisation (NFV). ‘It’s about moving away from the limitations of hardware and to a very flexible and readily upgradable software capability. It gives you the opportunity to move your compute power closer to the customer, which gives them a better experience of content,’ says Jones.
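
One reason proximity matters so much is simple physics: even over fibre, light takes time to travel. Here is a rough sketch with illustrative distances, in which processing and queueing delays are ignored entirely:

```python
# Propagation delay alone, ignoring processing and queueing.
# Light travels through optical fibre at roughly 200,000 km/s.
SPEED_IN_FIBRE_KM_PER_MS = 200

def round_trip_ms(distance_km):
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

print(f"Cell-site edge node,   ~1 km away:  {round_trip_ms(1):.2f} ms")
print(f"Regional data centre, ~150 km away: {round_trip_ms(150):.2f} ms")
# A 1 ms latency budget is gone before the packet even arrives if the compute
# sits hundreds of kilometres away - hence the push towards the edge.
```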

Pokemon Go, Jones believes, is a great early example of what you can do if you take augmented reality and put it in a mobile, mobility-based environment. ‘It’s an early look at the sort of things edge computing will enable in a 5G environment. It will go to scales and experiences that we just can’t imagine yet. And I think that notion of “can you tell the difference between what the augmented reality experience is and what the reality experience is”… does it pass the Turing test… I think that’s going to start to become a real question that people ask. That’s because the capability will be so impressive. From a consumer perspective, that’s one of the key benefits of edge computing’s vision.’

Moving beyond today’s dumb, hardwired architectures has other advantages. Future networks may have inbuilt AI. They’ll be agile and able to learn. Today, if there’s a serious incident and it causes huge numbers of people to suddenly get on their phones and try to call in safe, networks can become overburdened and seize up.

A more intelligent network could stop video and voice calling and only allow texting. In other words, if demand spikes, the network will intelligently adjust so you still have some level of connectivity.

All the Gs and LTE too

The first call made using a mobile phone was placed in 1973 by Martin Cooper, a US-based Motorola executive. The first mobile call in the UK took place in 1985 and was made by Michael Harrison to Sir Ernest Harrison, the chairman of Vodafone.

Our journey to the brink of 5G started, unsurprisingly, with 1G. Introduced in the early 1980s, the 1G standard provided voice-only and was analogue.

By today’s standards, 1G phones are unrecognisable: bulky, heavy and cumbersome. Back in 1986, if you wanted to make a mobile call you’d need a car phone such as the Motorola 4500X - a leviathan whose large battery accounted for most of its 3.5kg weight.

Things took a big step forward in the early nineties with the arrival of 2G - the second generation of cellular technology. The 2G standard’s main advance over 1G was digital encryption - you could no longer eavesdrop on mobile phone conversations with a radio scanner. Beyond security, the new standard offered full duplex telephony too.

2G systems were also more efficient and this allowed for greater wireless penetration. Beyond digital voice, the new system heralded the arrival of the short message service or SMS. By mid-2016 over six billion texts were sent a day in the US alone (Forrester). The new 2G standard also offered simple data transmission with speeds of up to 64kbps.

2G was based on the Global System for Mobile Communications (GSM) protocols. GSM went on to become the dominant global standard for mobile communications.

The dominant handset of the day was the Nokia 3310. The GSM phone was launched on 1 September 2000 and went on to sell 126 million units across the world.

The step to 3G wasn’t immediate. Interim evolutions included 2.5G, which saw new, more efficient packet switching techniques. The interim standard saw data speeds increase to a maximum of 144kbps. At these speeds an MP3 song could take upwards of six minutes to download.

This standard is referred to as the General Packet Radio Service (GPRS). GPRS networks grew to become Enhanced Data Rates for GSM Evolution (EDGE). EDGE (not to be confused with edge computing) allowed for improved data transmission and was a cornerstone of the 2.75G technology - a key incremental advancement. Rudimentary web browsing became a cumbersome and slow possibility around this point in mobile telephony’s evolution.

The UK government auctioned 3G licences in 2000. Five companies - BT Cellnet, Orange, One2One, Vodafone and Hutchison Whampoa - spent a combined £22.5 billion buying the rights. 3G was initially marketed as a means of making face-to-face video calls but, in the event, few people seemed interested. 3G did, however, have an ace up its sleeve: fast data transfer speeds.

Like the 2G standard before it, 3G evolved and, over the years, most networks were capable of delivering download speeds of 7.2Mbit/s (https://bit.ly/2MkFvvc). This boils down to a 100MB download in around two minutes. This also meant that mobile web browsing, social media and email on the go were all possible.
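
For anyone who wants to check the arithmetic, the sums are straightforward (the 4G comparison uses the Ofcom average download speed quoted later in this piece):

```python
# Quick check of the 3G download time quoted above, with Ofcom's measured
# 4G average added for contrast.
def download_seconds(size_mb, speed_mbps):
    return size_mb * 8 / speed_mbps      # convert megabytes to megabits first

print(f"100 MB over 3G at 7.2 Mbit/s:   {download_seconds(100, 7.2):.0f} seconds")   # ~111 s
print(f"100 MB over 4G at 15.1 Mbit/s:  {download_seconds(100, 15.1):.0f} seconds")  # ~53 s
```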

The above quoted download speed of 7.2Mbit/s is, of course, theoretical. Lots of variables, including signal quality, network congestion and whether you’re moving about will all serve to potentially slow things down.

The fourth generation of mobile technology is, of course, 4G - sometimes referred to as LTE or Long-Term Evolution. In theory, 4G supports 75Mbit/s upload speeds and 300Mbit/s downloads. Theory is, of course, one thing; reality is quite another, and so Ofcom conducts its own research.

It found that ‘the average mobile broadband download speed on 4G (15.1Mbit/s) was more than twice as fast as 3G (6.1Mbit/s) across all the networks.’ The regulator also found that a basic web page took an average of 0.78 seconds to download on 4G, compared with 1.06 seconds on 3G.

The newer standard also drove down latency - the time it takes for data to make its way across the network - and low latency equates to a feeling of responsiveness. The average latency across all 4G networks was, Ofcom found, 55 milliseconds. This compared to 66.7ms on 3G.

Unlike mobile phone handsets, which have a very short shelf life and built-in obsolescence, mobile telephony generations don’t fade away so quickly. According to figures published by the University of Surrey, 2G was released in 1991 and saw its peak year in 2012.

3G was released in 2001 and will peak in 2022. 4G was introduced in 2007 and won’t peak until 2028. 5G saw its introduction in 2018 and isn’t predicted to peak until 2039.

All this means, according to Rick Chandler, that 5G’s arrival won’t feel like a leap forward. ‘It would if it all came at once,’ he said. Rather, he explained, the technology will build, develop and improve - just like all the other Gs before it.