In recent weeks, the general public have seen hospitals turning away ambulances and airport terminals grinding to a halt because of confusing and murky ‘IT’ issues. The explanations for these events are increasingly unsatisfactory when viewed through the eyes of the general public.

As a charity that exists to ensure that IT is good for society, we don’t believe it is acceptable for ‘IT failure’ to be treated as a ‘force majeure’. Technology operates and fails, often in unpredictable ways, but the discipline of engineering is all about containing that unpredictability and dealing with it in predictable ways: from the failure of a concrete strut in a bridge to an engine component on an aircraft, a failure should not result in an uncontrolled outcome.

The fallacy is to think that technology is somehow inherently unmanageable. It is not. The X factor is, as usual, entirely about people. Most failures result from systems of people failing to manage risks around technology properly, failing to communicate effectively, failing to work together - and managing personal risk rather than systemic risk. An exercise in ensuring system-level integrity for the general public can descend into an exercise in prophylactically apportioning blame. When something goes wrong, everyone has a reason, properly documented, why it is someone else’s fault.

By contrast, organisations where technology is central to their business - the global players in technology, for example - show us that availability and discipline can be maintained to a spectacular degree, even against spectacular external threats. With the will, it can be done. The ‘physics’ of software and systems is harder to pin down than that of concrete or jet engines, particularly when it comes to system-wide failures, but conceptually the same engineering principles apply.

What’s different is the people-context in which decisions are made. The issue arises in organisations whose business fundamentally relies upon technology, but which don’t see technology as their business. As organisations have implemented ever more pervasive and complex technology solutions, there has been a lag in both our understanding and our systemic approach to ensuring good outcomes. For some reason we’ve not processed that IT-related risks are often as serious as, and even more pervasive than, other types of risk.

Generations of software and systems engineers have pointed this out, and asserted that it can be done properly, yet failure has not been eliminated in the way that engineers have largely eliminated deaths in aviation or civil engineering. One example, ‘Why are complex IT projects different?’, is now more than a decade old.

Over more than a generation, we have exhausted ourselves asserting that proper engineering practice is the answer, and we’ve realised we need a different approach. The basis of our Charter, the basis of the values of our members, is that they stand for what’s right for the public. Professionalism for us does not simply mean documenting that you did everything right and someone else failed to listen - or if it does, then it is a word that has lost its meaning. For us, professionalism means taking responsibility for the impact of what we do on real people.

No single professional can accomplish such a system-wide change, and it is hard even for a professional group to do so. It is harder still when a company’s board faces little or no accountability: they can say ‘IT failure’ and receive some level of sympathy.

What’s needed is a professional community that asserts its public focus and accountability - one that stands with the general public and declares that this is not acceptable. The general public need to know that we are on their side, and that we take responsibility for doing the right thing.

This is why we talk about ‘Making IT Good for Society’: not because it is a nice, fluffy story, but because the hard task set for us in the Royal Charter is to take responsibility for outcomes. As a profession, we fight for the public, not for the profession’s own interests. And we need the general public to recognise this and to demand such a response.

Much of what’s going on in the digital space is still evolving, and it often isn’t clear what is good for society. But where what’s good or bad is entirely clear and unambiguous, we need to give confidence that we can deliver. That is why we will continue to work across many sectors - health and care, cyber security, and beyond - for a status and level of expectation to be placed on individual IT professionals, alongside the means to meet that expectation, so that we truly meet the needs of the public.