Artificial intelligence was at the centre of public policy discussion and media commentary this quarter. BCS’ community was sought after to provide context, insight and analysis on critical AI issues, from regulation to public trust, as detailed below:

Our CEO Rashik Parmar was quoted at the launch of DSIT’s White Paper on AI Regulation, alongside organisations like DeepMind and Rolls-Royce – explaining how a pro-innovation approach could work if supported by registered professionals. We also released thought leadership on why pausing AI development would not work and worked with mainstream media to counter negative narratives focused on a ‘Terminator’ scenario.

In June, BCS released an open letter calling for government and industry to recognise AI as 'a transformational force for good, not an existential threat'.

Our statement has so far attracted over 1,300 signatures, including from leading thinkers such as Anne-Marie Imafidon, CEO of Stemettes, and the philosopher Professor Luciano Floridi of the University of Oxford.

In May, the BCS Fellows Technical Advisory Group (F-TAG) worked with key member groups, including the Law, AI, Software and Internet specialist groups, to create a proactive position paper arguing against a 'pause' on AI development.

This followed an open letter in March from the Future of Life Institute, signed by Elon Musk, calling for a six-month pause in the development of advanced artificial intelligence systems. The letter stated that such AI systems posed 'profound risks to society and humanity'.

BCS members warned such a strategy would not work and could play into the hands of rogue regimes and organisations that were unlikely to stop developing AI.

In addition, BCS said a go-slow would harm humanity by delaying advances in medical diagnosis, climate science, and productivity. Sky News covered our paper exclusively; reaction later appeared in the Daily Mirror.

BCS CEO Rashik Parmar was widely quoted in the media as saying IT professionals should have a licence to develop AI. He was commenting after the Competition and Markets Authority (CMA) launched a review into the AI market.

The review by the regulator came amid concerns that big tech companies such as Microsoft were becoming too dominant in this rapidly evolving field.

He said: "I would not want a surgeon to operate on me that didn't have the right kind of code of ethics, was competent, ethical... And yet we allow IT professionals to build and deploy complex technologies without the same level of professionalism. I think we need to have some level of professionalism that's certified in the right way."

In early June, Rashik was again widely quoted in the media – including the i newspaper, Daily Mirror, Daily Express, Sky Online and MSN, among others – saying that fears around artificial intelligence could partly be a legacy of films such as The Terminator. Rashik said: "There should be a healthy scepticism about big tech and how it uses AI, which is why regulation is key to winning public trust.

"But many of our ingrained fears and worries also come from movies, media and books, like the AI characterisations in Ex Machina, The Terminator, and even going back to Isaac Asimov's ideas which inspired the film I, Robot."

BCS organised two roundtable discussions for the government's Office for AI as part of a response to a government White Paper, drawing on the expertise of BCS members and policy contacts. The first was about the impact of the new technology on marginalised communities; it concluded that, while there could be benefits for groups such as ethnic minorities, neurodiverse people and LGBTQ+ people, there are also concerns around inbuilt bias.

The second roundtable was held jointly with the National Engineering Policy Centre (NEPC) on managing risk and safety in engineering, how the regulatory environment outlined in the UK Government's White Paper could work in practice, and where it might need amendments.

Its conclusions included concerns about deploying AI prematurely without human oversight.

There was also much public debate about what the mass adoption of AI large language models such as ChatGPT would mean for education – was it a force for good or not? BCS’ Education team, supported by its CAS network, recommended that AI should be part of teacher training courses to help staff understand how students use AI.

BCS placed a piece in The Times by its MD for Education and Public Benefit, Julia Adamson, making this case and arguing that AI should be a crucial part not just of teacher training but also of headteachers' professional development.

This statement was also covered widely in the media following Education Secretary Gillian Keegan's subsequent announcement, in June, that the government would call on education leaders and technology experts to come forward with ideas on how generative artificial intelligence could be used positively to support the education sector in the future.

Schools Week and FE Week also covered the setting up of a new education task force to advise the government, which includes the BCS CEO and has BCS Fellows Dame Wendy Hall DBE FRS FREng MAE FIET and Professor Dame Muffy Calder DBE OBE FREng FRSE FBCS at its helm.

Niel McLean, Head of Education at BCS, was also quoted in a separate article about the potential use of AI for collecting data about school pupils. He said: “There's an ethics of purpose – what you're actually using this to do? There's an ethics of processes – how is data handled? What's the confidentiality? How secure is it? There's a people side. You want the people doing it to be professional and feel they're accountable.”

In mid-April, earlier BCS research and the BCS CEO were again quoted by analysts discussing the government's proposals in the Online Safety Bill, which could break end-to-end encryption in messaging services. WhatsApp and Signal wrote an open letter to ministers asking them to reconsider the wording of the Bill, which would allow third parties to monitor messages on currently encrypted platforms. BCS research with members found overwhelming support for protecting encryption in the Bill. Coverage appeared in outlets from Computer Weekly and The Times through to The Register.

BCS spoke to the BBC about the decision by the owner of Twitter, Elon Musk, to limit the number of tweets people can see in a day.

Adam Leon Smith, chair of F-TAG, said it is "very odd to start rate-limiting the reading of a social network", as limiting users' scroll time "will affect advertising revenue".

Julia Adamson, MD for Education and Public Benefit at BCS, was also invited to appear at a House of Lords Select Committee hearing on the challenges and opportunities for computing education.

Julia's points around innovative solutions to teaching the subject, including bringing in industry and the need to broaden and balance the computing curriculum, are part of the formal record. They will also form part of our key manifesto 'asks' as we engage with the major political parties before the next general election. We followed up with formal written evidence, now published on the committee's pages.

Julia was awarded an MBE for Services to Education in this year’s King's Birthday Honours list. In our announcement, global IT entrepreneur Dame Stephanie Shirley, a Distinguished Fellow of BCS, paid tribute to Julia’s inspirational leadership and her passion for sharing the life-changing benefits of computing education.

The Bank of England approached BCS to hold a joint, BCS members-only webinar on proposals for a Central Bank Digital Currency (CBDC) – also known as the Digital Pound. Our members concluded that a CBDC should be technically possible and could deliver additional security, speed and accuracy. However, members felt there was some way to go to make the case that a digital pound was needed – as there are already many ways to pay digitally for goods and services – and that more detail was needed on how those who are digitally excluded would be able to access the service.

BCS took a leading role in a newly formed All-Party Parliamentary Group (APPG) considering the risks and potential of the Metaverse and Web3 (a concept which includes emerging technologies such as blockchain). We commissioned a new members' survey, receiving over 1,000 responses, which found that child safety and regulation were key challenges for this emerging technology. Our data informed the evidence-gathering session led by the BCS CEO and Baroness Uddin.