The March '24 Policy Jam posed this question: how can ethics and professionalism address the risks and challenges of emerging technologies?

The Post Office Horizon IT Scandal highlighted the need for independent standards of professionalism and ethics in technology application, development and deployment. How then can we avoid technology such as AI being at the centre of a similar miscarriage of justice?

Senior BCS members recently compiled a report: Living with AI and emerging technologies: Meeting ethical challenges through professional standards. Its authors were Prof Bernd Stahl FBCS, Professor of Critical Research in Technology at the University of Nottingham; Gillian Arnold FBCS, Immediate Past President of BCS and MD of the recruitment and diversity training firm Tectre; Adem Certel, Data Engineering Lead at Knight Frank Property; and Dr Neil Gordon, Computer Science Lecturer at the University of Hull. Three of the authors, along with Georgina Halford-Hall, Chief Executive of WhistleblowersUK, sat on the BCS Policy Jam panel.


The good and the bad of AI

Technologies like AI hold immense promise for societal advancement, revolutionising industries and reshaping norms. From personalised learning experiences in education to early disease detection in healthcare, the applications of AI are vast and transformative.

Adem, a data engineer and a member of the BCS Ethics Committee, gave an example of how AI has changed his sector: "A lot of what we do is join data sets together, get more information and analyse large data sets. It can be quite difficult, but AI can make it easier." However, he said there were concerns over transparency surrounding the collection of data with AI and how accurate its analysis was. For Adem, the easing of work pressures must be balanced with checks within those processes.

But emerging technologies risk exacerbating societal inequalities and ethical dilemmas without proper oversight and accountability mechanisms. Gillian, the most recent past President of the BCS, drew attention to the issue of bias in AI algorithms. She runs a recruitment business that specialises in finding jobs for women in tech: "It became apparent that women were not selected for interview because there was a piece of technology scanning through CVs to understand who the best candidates were.

"Because that technology looked at the output and then reworked its algorithms to decide who were the best candidates, and because it found that 80% of the candidates were male, it thought the best candidates were male."

Bernd said: "If you look at the tech industry's history, the computer science industry has been very good at evading accountability for a long time. A key aspect of dealing appropriately with technology in the future is ensuring that people who cause something are identified and that there are appropriate and proportionate consequences. The whistleblower debate is an aspect of accountability."

Georgina Halford-Hall, Chief Executive of WhistleblowersUK, is promoting the proposals in a Whistleblowing Bill to create an independent Office of the Whistleblower that protects every citizen.

Usually, the core complaints her organisation receives are from workers in areas like health care and the public sector, but that's changing, she said: "Literally in the last six months, we are seeing a huge upturn in the number of people who work in the technology sector coming to us."

"It's very difficult for people in tech, many of whom are working in alternative ways. Politicians need to get on top of this quickly because I think they don't appear to understand the risks that people face due to being in an unregulated working environment and not being protected by what I would say are very weak laws."

She said regulatory frameworks should encompass robust whistleblowing protections, safeguarding individuals who raise concerns about unethical practices or misuse of technology. She added that ethical leadership is key: "The Bill is about driving good, positive behaviour. It isn't about punishing people. Taking on Bernd's point about accountability, leadership is about accountability, and this Bill will hold people accountable. It will improve the tone at the top because the intention is to drive curiosity and for people to take responsibility for their sector and drive up those professional expectations."

Professionalism in IT

The panel strongly felt those who work in IT are core to building trust by upholding high standards. BCS, as a professional body, can award Chartered status to those skilled workers who satisfy strict criteria of competence. They then appear on a public register of Chartered IT professionals. Gillian emphasised the importance of this qualification: "I think that CITP could be a way we say to people: 'Look, this person is ethically sound and technically capable'. Recruiters would look for CITP on CVs. It would be a good way to say this person can be trusted."

Bernd said: "We need to be able to trust the people, [the IT workers], and that means there has to be some sort of professional recognition. I think it will happen for the same reason it happens in medicine, law and accountancy because people recognise that these professions need to be regulated. I think we are going down that road. Maybe it'll look a bit different, but I think professionalism is almost unavoidable at this point."

Adem noted it could be similar to environmental, social, and governance reporting, which sets the standards for the impact of the business on society and the environment and how transparent and accountable the organisation is.

Regulation and innovation

There's been a lively debate around regulation and AI, especially following last year's open letter from prominent AI and tech experts calling for a pause in advanced AI development. That call wasn't heeded and, if anything, companies have instead sped up their efforts to build more advanced AI systems. BCS advocates that the best approach isn't to pause but to help AI grow responsibly.

The EU and the UK have taken different approaches to regulation – with the EU opting for prescriptive legislation through the AI Act, while the UK has adopted a non-statutory principle-based framework.

However, Adem said the UK could learn from the EU: "The EU has all its AI guidelines and everything else out there. There's something there for people, industries, and businesses to look at. I think we need something similar."

Gillian agreed and added that regulatory frameworks already exist in, for instance, the financial sector: "We've managed to regulate a huge financial industry with the Financial Conduct Authority, and they're all disparate players, and they're all out there to make a profit, and we don't stop them doing that. We just make sure that everybody's protected. I think there could be a role for a similar kind of organisation that looks at the breadth of technology from whistleblowing and ethical software development to online harms."

Bernd added: "There is also standardisation and other types of rules. There's professionalism, and BCS is, after all, the professional body, and that's an important role. There's a whole array of options that we need to look at. The interesting part is - how do we get them to work together? How do we make sure that they synergise, and overall, does this lead us to have the uses and consequences of technology that are desirable and beneficial to all?"

Learning together

Questions from the audience revolved around the power of vested interests and the part that human nature plays in not wanting to admit when things go wrong. Georgina said: "I think we can learn from other industries. However much you criticise it, health services are trying to bring in and embrace a learn, not blame culture. The technology sector is probably in a better position to bring that in because you haven't had quite so many catastrophes that we know of - at least not yet."

As for whether a scandal similar to the Post Office could befall an AI system, the panel thought it was only a matter of time. Bernd said: "It's inevitable, and it may already have happened. The point is not so much whether there will be such a future scandal or when it will happen. The question is how we create possible consequences or reactions to it."

One of the points raised by the Head of Policy, Dan Aldridge, was that there should be a deliberate knowledge management approach to help others avoid replicating the same mistakes. This could improve the way AI works across governments and organisations, a point the panel agreed with.

Election asks

With an election on the horizon, the panel was asked what they wanted to see in the political parties' manifestos. For Georgina, it was getting the Whistleblowing Bill through Parliament and establishing an Office of the Whistleblower. For Adem, it was legislation, and for Gillian, it was important that the government move fast on legislation and on resourcing the regulators. For Bernd, it boiled down to this: "The key is creating accountability; it's about what Gillian and Georgina have also said - regulation and creating a regulator are all part of it. And that's why, if I had a thirty-second message for the political parties going into the next election, I would say: 'do whatever helps you establish accountability in a reasonable and appropriate manner.'"