Building ethical AI means shaping systems that think with moral clarity, not just logic. This talk explores psychology guiding that future.
Speaker
Chris Ambler BSc(Hons) FBCS GMBPsS
Agenda
7:00pm - Event starts
8:00pm - Event ends
Synopsis
Artificial intelligence is no longer just a technical challenge; it is a psychological and moral one. As AI starts to make decisions that affect people, communities, and even entire societies, the real question we need to ask is 'How do we make sure these systems behave ethically, not just efficiently?'
In this talk, Chris Ambler draws on his book 'Psychology of AI Decision Making' to explore how insights from moral psychology, behavioural science, and classic psychological experiments can guide the development of ethical AI systems worldwide. From the trolley problem to Haidt’s elephant and rider, and from compliance behaviours to cultural differences in moral reasoning, Chris shows how the human mind, not the machine, is the key to understanding AI morality.
The session cuts through the noise and asks the questions that matter:
- What does it mean for an AI system to make a 'moral' choice?
- How do psychological biases and cultural norms shape that decision?
- Can we design global AI frameworks that respect different moral worlds?
- What happens if we don’t?
Blending real-world case studies with clear psychological models, Chris demonstrates how ethical AI cannot be engineered in isolation. It must be understood, tested, and governed through a lens of human decision-making.
This is a talk for anyone who wants to make sure AI doesn’t just work, but behaves ethically. It’s a call to build systems that carry moral weight, protect human values, and set a global standard.

About the speaker
Chris Ambler has 45 years of experience at the intersection of technology, psychology, and quality assurance. Holding honours degrees in both Computer Science and Psychology, he has led teams, built QA functions, and coached, mentored, and trained practitioners; he currently assesses software testing apprenticeships for BCS.
He is a Fellow of the BCS, The Chartered Institute for IT, and a Member of the British Psychological Society, with senior-level experience across defence, telecoms, finance, transport, digital entertainment, and video games, as well as advisory work for GLS on venture capital investments.
His current focus lies where his two disciplines converge: artificial intelligence. His book, Psychology of AI Decision Making, examines how moral psychology can shape ethical and trustworthy AI.
Chris also collaborates closely with BCS, BPS, and the TBI to advance safe, moral, and globally responsible AI systems. He is dedicated to helping shape AI that doesn’t just 'think' faster but computes ethically.
Our events are for adults aged 16 years and over.
This meeting is conducted in accordance with the BCS Code of Conduct for Meetings.
BCS is a membership organisation. If you enjoy this event, please consider joining BCS. You’ll be very welcome. You’ll receive access to many exclusive career development tools, an introduction to a thriving professional community, and the chance to help us Make IT Good For Society. Join BCS today
Please note: if you have any accessibility needs, please let us know via groups@bcs.uk, and we’ll work with you to make suitable arrangements.
This event is brought to you by: Hampshire branch
Supported by: Dorset branch, Kent branch and Sussex branch