I tend to fly a lot; it is an occupational hazard of living in New Zealand, a long way from anywhere. I’ve been through a few hairy moments: landing in raging storms and dense fog, emergency landings greeted by a posse of fire engines and ambulances, and aborted take-offs that left a trail of fuel across the runway.
However, nothing has instilled as much fear as the words I heard as we were preparing to take off for Sydney two weeks ago. The pilot, in an inappropriately calm and jovial voice, announced as we were proceeding along the runway: ‘We are just going to restart the computer system and then we’ll take off.’
What? No pause for testing? Maybe a casual check to see that everything did actually restart? A moment’s delay to check for further error messages? I envisaged: ‘Your navigation system has inexplicably closed - please reboot the system and try again.’ Or maybe a simple flat line and a ‘Game Over’ message appearing?
Anyway, it turned out that the pilot was just rebooting the entertainment system. But what if restarting the entertainment system had restarted the flight management system? What if the function that needed restarting in the entertainment system was related to a piece of reusable code from the navigation system or the fuel management system?
You might think I am overthinking the situation, but at least one aircraft manufacturer has in the past had the scary but economical idea of combining the flight management and entertainment systems into one code base.
My main concern, though, is that we often don’t encourage our software developers to consider the consequences of their coding beyond the time and cost to deliver, and we don’t present our software solutions to our sponsors and business owners in a way that lets them consider those consequences.
Even when we consider the consequences of delivering a particular software system, we cannot fully anticipate the implications of how that system will integrate and interoperate with the wider world of the internet of everything.
According to Reuters, in May 2014, ‘A common design problem in the US air traffic control system made it possible for a U-2 spy plane to spark a computer glitch that recently grounded or delayed hundreds of Los Angeles area flights.’
The Reuters article goes on to say, somewhat reassuringly: ‘In theory, the same vulnerability could have been used by an attacker in a deliberate shut-down, the experts said, though two people familiar with the incident said it would be difficult to replicate the exact conditions.’
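To illustrate the kind of defensive thinking that incident calls for, here is a purely hypothetical sketch in Python (the names, fields and thresholds are mine, and it bears no relation to the real air traffic control software) of a processor that quarantines a record it cannot make sense of rather than letting it take the whole service down:

from dataclasses import dataclass

MAX_PLAUSIBLE_ALTITUDE_FT = 100_000  # illustrative threshold only, not a real ATC limit


@dataclass
class FlightPlan:
    callsign: str
    cruise_altitude_ft: int


def process_flight_plans(raw_plans):
    """Accept well-formed plans; quarantine anything unexpected instead of failing outright."""
    accepted, quarantined = [], []
    for raw in raw_plans:
        try:
            plan = FlightPlan(str(raw["callsign"]), int(raw["cruise_altitude_ft"]))
            if not 0 < plan.cruise_altitude_ft <= MAX_PLAUSIBLE_ALTITUDE_FT:
                raise ValueError("altitude outside the expected range")
            accepted.append(plan)
        except (KeyError, TypeError, ValueError):
            quarantined.append(raw)  # flag for a human; keep serving every other flight
    return accepted, quarantined


if __name__ == "__main__":
    plans = [
        {"callsign": "NZ103", "cruise_altitude_ft": 35000},
        {"callsign": "MYSTERY1", "cruise_altitude_ft": None},  # the unexpected visitor
    ]
    ok, flagged = process_flight_plans(plans)
    print(f"accepted {len(ok)} plan(s), quarantined {len(flagged)}")

The design choice is mundane, but it is exactly the kind of consequence-thinking that rarely survives a conversation about time and cost to deliver.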
If we are not considering the IT implications of interoperability between systems, it is even less likely that we are considering the ethical use of IT systems. In the automation of tasks that were previously carried out manually, the ethical and human change elements have often been lost in the stampede to save cash and time.
Of course, my idea of what is ethical will most likely differ from your opinion.
Pop quiz
Here’s a little IT ethics pop quiz to work out where we all stand:
Question 1: Do you have the right to be forgotten on the internet?
Question 2: If you are an information services provider, is it ethical to remove records without notifying your users, so that your service no longer covers the same breadth of information?
Question 3: Is it ethical to filter client-to-client data as it passes through your application, screening for good news and bad news, and holding back bad news?
Question 4: Is it ethical to use location services to keep track of a member of your staff and his partner without their knowledge?
Question 5: If I automate the job of one of my staff members such that they are doing exactly the same job with the same outcomes, but it changes the experience of their job significantly, am I acting ethically if I don’t provide support / counselling?
Question 6: Is it ethical for a health services provider to update medical implants without the knowledge of the implant owner?
I’m sure you won’t be surprised to hear that all of the above questions represent real-life examples of events that have happened. In fact, questions 1 to 4 are reflections of recent news stories.
For question 5, consider the extreme case of the trained pilot who used to fly aircraft into war zones, drop bombs and return home, and who now sits in an air-conditioned office controlling a drone to carry out the same task of destruction. Though he is in a more comfortable and far safer work environment, he now gets to see his victims perish before his eyes.
For question 6, consider the software developers who came up with the idea of updating pacemakers remotely as heart patients passed the office of their physicians. All went well until the software was hacked, and perhaps the patients would have stood a better chance of survival if they had known that what they were experiencing was a software upgrade.
So if you answered no to any of the questions above, what can you do? Well, I believe there are several areas where we can contribute to better ethical outcomes.
Firstly, as IT professionals we have a duty to build a mindset of considering the wider consequences of the IT solutions our developers design, and of the contexts in which reusable code ends up being reused.
Did the software developer who wrote the code for automating the opening and closing of supermarket doors envisage that his / her code would be reused with fatal consequences in an X-ray machine? Probably not, but surely we could expect the re-user of the code to have considered the full working details of the code he / she had copied?
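To make that concrete, here is a purely hypothetical sketch in Python (the names, the door and the retry logic are all mine and come from no real system) of how an innocent-looking helper carries an assumption that only holds in its original context:

def actuate_with_retry(action, attempts=3):
    """Helper written for a supermarket door: if the motor command fails, just try again.

    Hidden assumption: repeating the action is harmless. That is true for re-sending
    a 'close door' pulse; it is emphatically not true for re-firing an X-ray exposure,
    where every retry delivers another dose.
    """
    for _ in range(attempts):
        if action():
            return True
    return False


if __name__ == "__main__":
    pulses = []

    def close_door():
        pulses.append("pulse")
        return len(pulses) >= 2  # succeeds on the second pulse; no harm done

    print("door closed:", actuate_with_retry(close_door))
    # A re-user who wires an X-ray exposure routine into the same helper silently
    # inherits the 'retries are harmless' assumption - exactly the consequence worth
    # pausing over before copying the code.

Nothing in the helper is wrong; it is the unexamined reuse that does the damage.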
Secondly, we need to contribute to the wider debate about how IT solutions are used, and how ethical decisions are made around IT-enabled concepts. That debate has long since bolted from the IT stable. We see senior politicians debating the use of IT solutions, such as drones, and judges making decisions relating to IT-enabled concepts such as privacy, often with inadequate and sometimes misguided briefing.
So finally, let’s get our professional bodies involved in leading the way to develop policy and opinion pieces before our politicians enact laws, or our judges pronounce life-changing judgments, that result in even greater ethical issues.
Governance of IT
Written in two halves, this valuable book is designed to bridge the gap between the governing body and CIOs / IT managers. It will help them to create a safe and robust governance framework for their organisation by applying the principles of ISO/IEC 38500, the Governance of IT standard, on directing, evaluating and monitoring IT activity.