The reasoning behind why ethics should be modelled in a computational way, followed by lightning talks.
Speaker
Dr Joe Collenette
Agenda
6:00pm Arrival and Networking
6:30pm Ethical Modelling and AI Talk
7:30pm Lightning Talks and Networking
8:00pm Close
Synopsis
This talk introduces some of the reasoning behind why ethics should be modelled in a computational way, using examples to highlight the key issues and how they are addressed by the introduction of ethics to an AI system.
The talk highlights some of the existing research in various areas of Ethical AI and what these areas mean for developers, from development to deployment, and for the wider public who will encounter these systems. In addition, it shows how interpreting law for use in AI systems presents challenges similar to those of incorporating ethical reasoning.
Examples include modelling the Highway Code for autonomous cars, incorrect information generated by generative AI, and the widely discussed trolley problem ethical dilemma.
The talk aims to show that consideration of ethics and law is a necessity for some applications of AI, whether they are based on machine learning or on other areas of AI.
Please bring along a short slide deck – or just chat – if you would like to present something during the lightning talks, although it’s not mandatory!
Tea, coffee and light refreshments will be served.
About the speaker
Dr Joe Collenette completed a PhD at the University of Liverpool on the topic of "Simulated Emotions and Mood as part of Decision Making in a Mobile Agent Society".
Following this, Joe was an Industry Research Fellow focusing on AI and Law, publishing "Explainable AI tools for legal reasoning about cases: A study on the European Court of Human Rights" in Artificial Intelligence. Joe was then employed as a Research Associate at the University of Manchester working on formalising the UK Highway Code, and prospective responsibility in multi-agent systems.
Joe has also collaborated with researchers worldwide on conflicts within autonomous traffic agents and on deception in LLM behaviour. Currently employed as a lecturer at the University of Chester, Joe works on safety within autonomous traffic agents and the interdisciplinary influences on their design.
Our events are for adults aged 16 years and over.
BCS is a membership organisation. If you enjoy this event, please consider joining BCS. You’ll be very welcome. You’ll receive access to many exclusive career development tools, an introduction to a thriving professional community, and the chance to help us Make IT Good For Society. Join BCS today.
Please note: if you have any accessibility needs, please let us know via groups@bcs.uk and we’ll work with you to make suitable arrangements.
For overseas delegates who wish to attend the event, please note that BCS does not issue invitation letters.
Image by Neeqolah-creative-works
This event is brought to you by: Chester and North Wales branch