The world is becoming ever more reliant on code, but how do we ensure software quality? A recent event hosted by the BCS Software Testing Specialist Group explored these issues and more.

Data is the new oil, and global telephony networks are the tunnels and pipelines through which this information is pumped.

The usage statistics for these increasingly mobile networks – their volume, velocity and variety – are astounding.

According to a report published by GSMA, in 2021 there were around 5.3bn mobile phone subscribers – that’s 67% of the world’s population. By 2025, it’s likely there will be 5.7bn mobile users, which amounts to 70% of people being connected. That’s just humans connecting and exchanging information, though.

When it comes to machines talking, it’s reckoned there were around 15.1bn connected IoT devices in 2021. Fast forward to 2025 and that figure could reach 23.3bn devices sending information, receiving instructions and generating data. All this, and more, shows the need for new technologies like 5G.

All of this has changed how businesses function, how they are shaped and how we work. It’s creating new jobs, new opportunities and new challenges for engineers.

Towards a software based future

Underpinning these networks is an unseen hand: software. The network operators, driven by the need to meet global demand for ubiquitous connectivity, low latency, high speed, huge data volumes, reliability, scalability and security, have embraced softwarisation. In short, we need to ensure we create high quality software.

Softwarisation of telephony sees many of a network’s key facets – design, implementation, function, management and monetisation – being driven by software, much of it cloud based.

It’s against this backdrop that the BCS Software Testing Specialist Group (SIGIST) hosted its conference. The London event was part of the much larger Birmingham-based UK5G Showcase. Proving its own point about the importance of networks and software, the two events were linked by a live video feed which enabled geographically dispersed BCS members and guest speakers to share opinions and ideas.

‘The pandemic made for prolific working from home. Schools went online… [there was huge] growth of the digital economy,’ states Dr Mike Short CBE, opening the event. ‘We need to live so much more of our lives online and this will depend on software. We need to think about softwarisation.’

Making a stark warning, Short adds: ‘If we don’t [think about softwarisation], we won’t get the future we want and, indeed, the future we need. As more and more data is generated, we have a duty to our customers. We need to make networks of networks more efficient. There’s no going back.’

Software’s place in a changing world

The pressures on our software based communications networks are thrown into stark relief, Short explains, when we consider global warming. On one hand, given these networks’ size and complexity, there is a huge need to engineer for energy efficiency and re-use.

IoT based networks themselves will also play an increasingly important role in helping countries make better use of the energy they generate, transmit and consume. From smart meters to smart electric car chargers and smart grids, these solutions all depend on mobile data.

Software, Short explains, is also becoming increasingly important in helping us achieve the World Wide Web that we want: a safe, fair and open web, which is protected from the proliferation of common toxins such as abuse and fake news. Software, he says, will also play a key role in regulating the web and in helping users turn their responsibilities into action.

Given software’s criticality in enabling global networks, the SIGIST event focused on exploring how software quality can be enhanced.

‘Why is writing software so hard?’ asks Martin Hogg, Director and Cybersecurity Consultant, Picasso Security.

To help answer this question, the event adopted a ‘triple helix’ approach where academic, industry and government speakers offered insights about topics such as controlling software supply chains, software development methodologies and the more human and cultural aspects of development.

The importance of cloud

Cloud, many of the speakers agree, is a critically important technology which works hand-in-hand with softwarisation.

Joost Noppen, Principal Researcher Software, BT, explores the challenges inherent in shifting a bricks-and-mortar business towards cloud-first thinking. Often, technical legacy holds organisations back.

He points to how many businesses still depend on old systems which are built in languages like Fortran. The experts who made them might be retired and the systems themselves can’t simply be lifted and shifted to the cloud. Old hardware, he explains, can also be very hard to virtualise and move to the cloud.

With these factors slowing progress, it can be even harder for forward thinking organisations to move towards a culture where workers feel empowered to make software. Noppen describes the ‘citizen developer’ – people who are equipped with the skills, and encouraged, to make software artefacts without first needing a three-year degree.

Testing and trusting AI

The need for robust and resilient software is also critically important when it comes to building and deploying machine learning and AI based systems, explains Adam Leon-Smith, the event’s organiser and chair of SIGIST.

‘We have a lot of challenges convincing people machine learning can be trusted,’ he states. ‘It is easy to make toy examples but it is much harder to [build systems] which scale across different problem spaces.’

In many ways, says Leon-Smith, AI is a double challenge for engineers. Firstly, the underpinning data sets need to be correct, well organised and free from biases which might lead to incorrect assumptions. And secondly, the software which enables these systems needs to be reliable.

Testing such systems, he explains, is a radically different challenge from testing traditional software. When testing a more conventional system, you can check whether the software produces predictable results. But how do you test an AI system designed to produce results you can’t predict?
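One widely used answer to this ‘no predictable output’ problem – not prescribed by the speaker, but a common technique in ML testing – is metamorphic testing: instead of asserting an exact result, you assert relationships that should hold between related inputs. The sketch below is illustrative only; predict_sentiment is a hypothetical stand-in for a real model’s scoring function.

# A minimal metamorphic-testing sketch (illustrative, not from the talk).
# When an exact 'expected output' can't be known in advance, we can still
# check that related inputs produce consistently related outputs.

def predict_sentiment(text: str) -> float:
    """Hypothetical stand-in for a trained model's scoring function.
    Returns a score in [0, 1]; a real system would call the model here."""
    positive = {"good", "great", "excellent", "love"}
    words = text.lower().split()
    return sum(w in positive for w in words) / max(len(words), 1)

def test_neutral_padding_is_stable():
    # Metamorphic relation: appending a neutral filler phrase should not
    # swing the score dramatically, even though we never assert its exact value.
    original = "the new release is great"
    padded = original + " according to the report"
    assert abs(predict_sentiment(original) - predict_sentiment(padded)) < 0.3

if __name__ == "__main__":
    test_neutral_padding_is_stable()
    print("Metamorphic check passed")

The point of the relation-based check is that it survives a change of model: nobody needs to know the ‘right’ score for any individual input, which is exactly the situation Leon-Smith describes.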

‘We need to move testing out of the model creators’ hands,’ he continues. ‘We need to move it out of data scientists’ hands and away from confirmation bias.’

The human factor

Achieving – or at least enhancing – the quality of the software we produce is clearly a multi-faceted challenge. Tom Geraghty, Transformation Lead at Red Hat Open Innovations Labs, points to a critical but sometimes overlooked factor: humans.

Geraghty makes a link between the quality of software and the cultural environment in which the engineers work. Specifically, leaders who create a sense of psychological safety often find their teams perform best and make the safest products.

The idea has its roots in the life and work of Grace Hopper (1906 – 1992). Hopper was a true computing pioneer – she was a computer scientist and also enjoyed a highly successful career in the US Navy, where she became a Rear Admiral. She was also an advocate of humanistic leadership – the idea that ‘you manage things; you lead people’.

The idea of psychological safety, Geraghty explains, is best understood by contrast: look at organisations where people in authority choose to manage people just as they manage any other resource – usually autocratically.

In these organisations, fear tends to be the pervading mood among workers. In such settings, engineers don’t feel safe putting their hands up when they spot a mistake, and certainly don’t admit to errors they realise they’ve made. Rather, they stay quiet.

Geraghty points to the Chernobyl nuclear disaster as being caused, in part, by engineers being too frightened to report operational problems with the Pripyat-based reactor.

Somewhat counterintuitively, this means that the best performing software teams often report more errors and issues than groups that aren’t performing so well.