What are your aims for the year?
My aim for the year is to start to bring about a situation where anyone engaged with IT considers the ethical dimension of what they’re doing. And by the term ‘engaged with IT’ I would include conceiving a system, specifying it, designing it, developing it, testing it, implementing it and using it. So, at all stages of the development and use cycle, before you begin to do anything, you just stop for a brief moment and ask yourself, ‘Is what I’m about to do ethical?’ It doesn’t take long and it’s not difficult but, of course, it’s not what we typically do now. In most cases the answer will be ‘yes’ or ‘ethics doesn’t really arise in this particular situation’, but in some cases it will need more consideration. And then I would hope that our members, and indeed other developers and users of IT not limited to our membership, would have the presence of mind, the intention and the tools just to ask themselves that question.
What should be the basis of their decision-making?
I think everybody knows what’s ethical and what’s not. There will be borderline cases, but they won’t be the majority. You know, we have an innate sense of what’s ethical. We can override it, we can get into the habit of overriding it, and then it gets weaker. But just asking yourself those questions - ‘Who is going to be affected by what I do? Are they going to be affected for the better or for the worse?’ - is important. And it’s not just a question of legality - there’s a distinction between the ethical and the legal, and we’ve seen many examples of people doing something which is legal but unethical. If it’s illegal, it’s probably unethical, but then you have the sanction of the law. If it’s unethical but legal, you don’t have the sanction of the law. So, we’re relying on somebody’s conscience and their awareness of the effect of their actions on their fellow men or women.
What do you think the BCS role is in making people more aware of ethics?
Our mission is to make IT good for society. We cannot do that unless what we do as an Institute and as a group of people is ethical. So, I would argue that ethics is actually at the core of everything we do. And certainly, it’s key to achieving our mission in every sphere of activity.
So, if somebody comes to the conclusion that something they’ve been asked to do is not ethical, how should they proceed?
It depends on the circumstances; there’s no one answer to that question. But they should, firstly, refuse to do the job. It also depends, of course, on who’s asking and how, and what role they’re playing - there are all sorts of variables in the situation. But you have to do what’s right. And sometimes there’s a cost associated with that. For instance, I co-wrote a book called From Principles to Profit - The Art of Moral Management. And we formulated this proposition: that if you act according to the highest ethical principles then in the long term, not necessarily in the short term, the result will be profit for the business.
Now, in order to test that proposition, we talked to friends and clients and business associates and we said: ‘Here is our idea - is that your experience?’ We had a good number of responses. And, in several cases, the respondent said: ‘Well, this is the situation. This is what I was asked to do. And I decided that it was unethical. And therefore, I refused.’ In some cases it cost them their job. They all said, ‘I’ve never regretted that decision for a moment.’
Might that be because they already have a personal level of confidence and authority in their jobs? I’m thinking of somebody who’s perhaps just starting out, who hasn’t yet got that professional confidence.
It can indeed be more difficult if you are new in a job but the challenge can be at any level. Let me give you one example. This was a friend and client who had been general manager of an overseas bank and one day his chief accountant came in and said: ‘We’ve just received this cheque for a million dollars and it’s in my name. It’s not really for me, it’s for this general. And the general’s account number is just one different from mine. So, don’t worry, it’s not for me.’ And my client said: ‘This stank. So, I called my head of compliance and discussed what I should do and I acted on his recommendation.’ The advice was: ‘You call the regulator, then you call the police, then you call head office. In that order.’ And the head office said: ‘Why didn’t you call us first?’ It was obvious - they would have stopped him acting. And, indeed, he lost his job shortly afterwards. So, he was in a senior role but the issues for him were the same as if he had been in a junior role.
Do you think the BCS code of conduct goes far enough with these issues? Is there more we should be doing there?
I think the answer to that is that the BCS code of conduct could well benefit from being revisited, and how we should do that is one of the things I want to discuss with colleagues. So, I think that’s a very good question. I think it doesn’t go far enough in this context. Changing it will be a significant project, and we will certainly need to consult the membership once we have new proposals. However, I don’t want to wait until that’s complete before we start developing these ideas about the implementation of ethical principles.
If you could just put yourself in the future for the moment, to the end of your year, what would you like to have achieved by that point?
I’m not sure exactly how far we’re going to get. This is a journey and we’re only just starting out on that journey in a sense. We’re not inventing ethics, there’s nothing new about it. But I’d like to see us well on the way to some thought leadership, particularly in the area of AI and ethics. There’s lots about that in the press at the moment. It’s a lively discussion but I think BCS has a major contribution to make.
Secondly, I’d like to see us producing an ethical code of conduct which might reach wider than our membership, so that anyone who wants to pick it up and refer to it can have it provide some help in making those kinds of decisions. I’d like to see some training in how to use such a code, and in any other tools we can devise to help, because sometimes the questions are quite tricky. You’ve raised some good questions which individuals, particularly those in relatively junior or relatively new roles in their organisations, may find challenging. So, I think we can do a lot to help. How much we can get done in 12 months, I’m not sure.
How large is the issue of unethical behaviour?
Well I would say there are good actors, there are people who do act ethically as a matter of principle and practice. And there are definitely bad actors, who are misusing the technology deliberately, consciously and actively. I’m not sure we’ll ever touch them. And then there are people in the middle who don’t really think about it. And I think it’s those who are most in need of assistance.
Do you think that BCS can do a little bit more with its view of professionalism, as ethics is something that’s much easier to engage in when you have the strength of a professional body behind you?
Yes, I think that’s entirely valid. You can’t be professional if you’re not ethical - the two go hand in glove. So, my answer to that is definitely yes.
Can you give us examples of things you’ve observed in the way IT is actually practised which you consider to be unethical? (Ed: This was before the Cambridge Analytica story broke!)
There are lots of examples around. Take a very conspicuous example: the Volkswagen emissions scandal. In America that was illegal; in this country, it wasn’t. It was equally unethical in both. It was definitely deliberate - they’ve admitted it. But when Volkswagen were asked why they weren’t compensating users in this country, they said it wasn’t illegal. More generally, major corporations are collecting a huge amount of information about us all the time, from our computers and, more particularly, from our mobile phones and tablets. And they use this to beam advertisements at us for things they think we’re going to want to buy.
Of itself, that’s not unethical, but clearly some entities are misusing that information and selling it to bad actors or for unethical purposes. There are obvious examples of bad - I use the word in a very general sense - bad stuff being carried on the major platforms. If we look at AI for a moment, one of the most significant problems in AI, particularly deep, narrow AI, is that the system needs to be trained, or train itself, with large data sets. Now if those large data sets have inherent bias because of the way they are collected, or collated, then that can result in an unethical outcome.
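As an illustrative aside, not part of the interview: a minimal sketch of how that can happen, using an invented toy loan-approval scenario. The group labels, numbers and the per-group approval-rate ‘model’ below are assumptions made purely for illustration, not a real system or any particular vendor’s method.

```python
# Minimal, hypothetical sketch: a model "trained" on a skewed data set
# reproduces that skew in its decisions. All data here is invented.
from collections import defaultdict

# Historical decisions, biased against group "B" by the way they were collected.
training_data = [
    ("A", 1), ("A", 1), ("A", 1), ("A", 0),   # group A: mostly approved
    ("B", 0), ("B", 0), ("B", 0), ("B", 1),   # group B: mostly rejected
]

# "Training": record the approval rate per group (a stand-in for a real model).
totals = defaultdict(int)
approvals = defaultdict(int)
for group, approved in training_data:
    totals[group] += 1
    approvals[group] += approved

def predict(group: str) -> bool:
    """Approve only if the historical approval rate for the group exceeds 50%."""
    return approvals[group] / totals[group] > 0.5

# Two otherwise identical applicants get different outcomes, solely because the
# bias baked into the training data has been learned and replayed by the model.
print("Group A approved:", predict("A"))  # True
print("Group B approved:", predict("B"))  # False
```

Nothing in the code ‘decides’ to discriminate; the skew in the collected data is simply learned and then replayed at scale, which is the ethical risk being described.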
Are ethics high enough on the agenda in the political sphere?
There are a number of government and parliamentary initiatives in relation to IT and data ethics and I applaud that. For instance, there is a new initiative to set up a cross-party Parliamentary Commission on Technology Ethics. We have already been interacting with Darren Jones MP who is leading it and intend to contribute actively to its work.
How can members get involved in your year’s activity?
I would encourage anyone who thinks this is a significant issue and who’d like to engage in the conversation to put their hand up. I don’t want to do this by myself, I can’t do this by myself. We had a meeting of the Society Board recently where this was the focus of attention. It was a very wide-ranging and useful conversation, which I think will have some really valuable outcomes in due course. I don’t want to prejudge, but we want to engage the membership. I’d be very happy to come and talk about these issues at any branch or specialist group which is interested. If people would like to engage with us, all power to their elbow and ours.
Any final thoughts?
It’s a bit of a qualification: we’ve been talking in quite general terms and that’s fine, but I think the major focus is probably on the ethics of AI, because that’s such a difficult area. Just to give you an example, there are people building AI systems who cannot explain how those systems work and how they reach their conclusions, because they themselves don’t understand. Now a number of organisations, such as IBM, have come out and said: ‘That’s wrong, we’re not going to build such a system and we don’t think such systems should be on the market.’ So, if you can’t explain it, if you can’t justify its conclusions and show how it reaches them, then it shouldn’t be there.
It’s essentially a black box.
If it’s a black box and unexplainable, that, in and of itself - even if it reaches the right conclusion - is, in my book, unethical.
The June issue of the BCS membership magazine ITNOW ran a feature on various ethical areas:
- Inspiring the next generation
- When ethics and IT collide
- The ethics of learning
- Sustainable ethical IT
- Digital marginalisation
- Ethics and AI
- Ethics vs morality
- Social mobility in IT