The general public is beginning to wake up to the importance of trustworthiness in systems, writes Alastair Revell CEng CITP FBCS.

It is a slow awakening, but people are becoming increasingly aware of how their lives are affected adversely by poorly written software systems.

Recent high-profile events are entering the collective consciousness and a narrative is forming that software professionals need to heed. In the wake of Dieselgate, the public has seen how systems can be manipulated to undermine the public interest; with the A-level exam grading debacle, a generation of students learnt that much of their future may be determined by algorithms; and many saw how the NHS was brought to its knees by the WannaCry ransomware, which exploited security flaws in unpatched systems.

Is it acceptable for corporates to deceive the public using software? For young lives to be determined by algorithms that are not fit for purpose? Or for lives to be put at risk because not enough effort is put into removing defects?

The devastating consequences of insisting a software system is trustworthy when it manifestly is not are well illustrated by the British Post Office scandal, which has arguably ruined many innocent lives.

All these systems undermined trust in one way or another. How can consumers and buyers, or simply those affected by such systems, know whether they are trustworthy?

One approach is to demand that systems claiming to be trustworthy measure up to an external, auditable standard, where a trusted third party has assessed conformance against a public standard. This could assure relying parties, such as the general public, that these systems were safe, reliable, available, resilient and secure.

Some of the necessary underpinnings to this approach are being laid for the future.

In June 2014, a minister of state launched the British Standards Institution (BSI) BS PAS 754 on Software Trustworthiness in London. At its heart were the five important facets of trustworthiness: safety, reliability, availability, resilience and security. This was the culmination of almost a decade of work, largely sponsored by government, that can be traced back to a 2004/5 Cabinet Office study on the central sponsorship of information assurance, which identified a pervasive lack of secure software development practices as a matter for concern.

Much of the hard work in producing the publicly available specification (PAS) was undertaken by the Trustworthy Software Initiative (TSI), which was an emanation of government funded by the National Cyber Security Programme I. In 2016, the Trustworthy Software Foundation was formed as the TSI’s independent corporate successor to signpost and curate the material that the initiative had produced over the preceding decade (see tsfdn.org) as well as to encourage further development.

February 2018 saw the publication of the BS 10754-1:2018 standard on Systems Trustworthiness, which built on the initial work of PAS 754. This standard includes the comprehensive Trustworthiness System Framework (TSFr), which provides an implementation-agnostic and domain-independent way to reference the large body of existing knowledge on building trustworthy systems. This framework covers functional safety and information security as well as aspects of both systems and software engineering.

BSI Committee ICT/3 is currently working on the production of the next two parts of the BS 10754 series, which will cover assurance and application security cases. There also remains an aspiration to publish an International Organization for Standardization (ISO) standard on trustworthy systems in due course.

More recently, the Trustworthy Software Foundation has been developing the Trustworthy Systems Mark (TSM), which assesses conformance with the BS 10754 standard. The TSM programme is envisaged as a three-level scheme, with each level demonstrating increasing conformance. Level 1, which is currently in pilot, is a self-assessment scheme overseen by the Foundation and is similar in concept to the NCSC’s Cyber Essentials. This light-touch scheme assesses basic alignment with the principles that underpin trustworthy system development.

Level 2 conformity, once fully developed, will require a more thorough assessment and will involve external verification. A third level is also planned, which will give extensive assurance to relying parties.

These are small but fundamentally important steps towards assuring trustworthiness through a standards-based approach, one that deserves much wider attention and, indeed, adoption.

The time is now approaching when buyers could demand that organisations whose systems they depend on measure up to an external, auditable standard on trustworthiness. These organisations, in turn, will soon be able to demonstrate that the systems they use are safe, reliable, available, resilient and secure.

The public are arguably still processing how untrustworthy systems affect their lives. They will inevitably start asking whether it needs to be like this and whether there is a better way.

The argument for professionalism in IT

As the public begins to understand how dramatically the practice of software engineering can affect their lives, they will want assurances that it is being done properly. This has happened many times before in other disciplines, such as railway engineering, bridge building and aviation. It is not just life-threatening circumstances that drive the need for assurance; it arises wherever the risk is ‘expensive’ should it materialise.

There can be little doubt that our lives are becoming increasingly reliant on technology. We therefore need the tech in our lives to be reliable to avoid serious consequences should it fail. The fact that our safety may be at risk because of untrustworthy software embedded in systems such as autonomous vehicles is as serious to us as exploding boilers in steam trains were to the Victorians. The lack of availability of the digital services we rely on, however caused, has a serious impact on our lives. The immense damage that weak security or poor resilience can cause in our digital world is increasing dramatically.

While we know how to remediate most of the issues that lead to untrustworthy systems, the evidence strongly suggests that there is little compulsion to apply our learning. At some point, however, the public will decide the risks to them have become too great and declare that we have crossed the Rubicon. While enjoying the benefits of amazing tech, they will want to mitigate the risks they face by demanding assurances about its trustworthiness.

It is worth dwelling on the fact that the root cause of many cyber exploits lies in poorly designed software that has simply ignored security by design. A good example is the seemingly ever-present buffer overflow, first documented at least as early as the 1970s and well understood, yet many of the flaws regularly being patched today boil down to this basic design error.
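To make the flaw concrete, here is a minimal illustrative sketch in C (hypothetical code, not drawn from any real incident). The unsafe function copies caller-supplied input into a fixed-size stack buffer with no bounds check; the safer variant bounds the copy to the buffer's size, which is exactly the kind of discipline that security by design demands:

#include <stdio.h>
#include <string.h>

/* Unsafe: strcpy() keeps copying until it finds a NUL terminator,
   so any input longer than 15 characters writes past the end of
   buf -- the classic buffer overflow. */
void greet_unsafe(const char *input) {
    char buf[16];
    strcpy(buf, input);                      /* no bounds check */
    printf("Hello, %s\n", buf);
}

/* Safer: snprintf() writes at most sizeof(buf) bytes, including
   the terminating NUL, so oversized input is truncated rather
   than overflowing the stack. */
void greet_safe(const char *input) {
    char buf[16];
    snprintf(buf, sizeof(buf), "%s", input);
    printf("Hello, %s\n", buf);
}

int main(void) {
    greet_safe("a deliberately long string of more than sixteen characters");
    return 0;
}

Calling greet_unsafe() with that same long string corrupts the stack and, in the worst case, hands control of the program to an attacker; five decades of security patches have been chasing variations of exactly this pattern.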

Consider another engineering profession for a moment. The Tacoma Narrows Bridge, spanning a strait of Puget Sound in Washington State, collapsed in November 1940, primarily due to aeroelastic flutter that had not been properly taken into account in its design. (The collapse was captured on film by an amateur and the dramatic grainy footage is widely available on the internet.) Importantly, bridge engineers learnt a great deal from this epic engineering failure and have guarded against the phenomenon in their designs ever since. This is in sharp contrast to buffer overflow errors in software, which have persisted for well over 50 years and are just as well understood.

At some point in the near future, public tolerance for software failure will be exhausted and people will demand software they can trust.

Will you and your organisation be ahead of the curve?

About the author

EUR ING Alastair Revell CEng CITP FBCS FIAP is the BCS Vice-President for Community, Chairman of the Trustworthy Software Foundation, a Trustee of the Engineering Council and Director General of the Institution of Analysts and Programmers.