AI moves fast. Ethics doesn’t. But can we afford to sideline principles for the sake of performance? Giles Lindsay FBCS FIAP FCMI, Vice Chair of the BCS Fellowship Committee, explores.
AI isn’t coming. It’s already here. Across every industry, teams are experimenting with automation, smart assistants and machine learning tools. Yet behind the hype, there’s a quieter struggle. Can we move quickly without compromising what matters? Can AI be both agile and accountable, to the people it affects and to the organisations that deploy it?
Urgency vs responsibility
AI is no longer just a tech issue — it's now a boardroom topic. Senior leaders are under pressure to implement AI across operations. They want speed, innovation and measurable returns. But at what cost?
In the push to act quickly, ethics can feel like a delay. That tension is growing. If agility means adapting fast, and ethics means taking care, do they pull in opposite directions? Or can they work together?
Many assume ethics will limit delivery. But what if it protects it instead?
The wrong starting point
When boards ask, ‘How do we stay ahead?’, they often begin with tools. That skips a step. The real foundation is trust.
Governance comes before growth. Strong ethical thinking is not bureaucracy; it's clarity. It helps delivery teams avoid costly mistakes. It lets leaders act with confidence.
A recent board discussion I attended made this clear. Senior executives were torn between ambition and risk. Some worried that introducing AI controls would block progress. Others asked, ‘What happens if we don’t?’
This isn’t about choosing between performance and principles. It’s about linking them so both succeed.
Speed without trust can backfire
You can launch a new tool in a week. You can’t rebuild trust as quickly.
We’ve seen the damage from AI that cuts corners. From biased recruitment software to unaccountable content moderation, the list is long. These aren’t technical faults; they are signs of weak oversight.
Rushing without checks creates risk. That includes legal risk, customer backlash and internal friction.
To move with pace, delivery teams need freedom. But that freedom must include boundaries. Governance isn’t a hurdle. It’s a guide.
Practical ways to embed ethics into delivery
Organisations often ask, ‘How do we build responsible AI without slowing down?’ Here are three practical steps:
- Build shared understanding: many teams want to do the right thing, but they don’t always know what that means in practice. Make time for short training sessions. Cover bias, transparency, and how to spot risk. Aim to build shared language and awareness.
- Introduce simple checks: create short prompts, not long forms. Ask: where is the data from? How does the model make decisions? What happens if it gives the wrong result? These checks can fit into daily work and agile cycles; a minimal sketch of one appears after this list.
- Make it part of team culture: ethical design isn’t just for specialists. It should show up in everyday behaviour. Leaders should speak about it. Teams should raise concerns without fear. Good practice should be celebrated.
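To make the ‘simple checks’ above concrete, here is a minimal sketch of how a team might turn the three prompts into a lightweight pre-release checklist. The class, field names and example answers are illustrative assumptions, not part of any prescribed framework or tool.

```python
from dataclasses import dataclass


@dataclass
class AIChangeCheck:
    """Prompts a team might answer before releasing an AI-backed change.

    The field names are illustrative; adapt them to your own delivery process.
    """
    data_provenance: str = ""       # Where is the data from?
    decision_explanation: str = ""  # How does the model make decisions?
    failure_plan: str = ""          # What happens if it gives the wrong result?

    def unanswered(self) -> list[str]:
        """Return the prompts that still need an answer before release."""
        prompts = {
            "Where is the data from?": self.data_provenance,
            "How does the model make decisions?": self.decision_explanation,
            "What happens if it gives the wrong result?": self.failure_plan,
        }
        return [question for question, answer in prompts.items() if not answer.strip()]


if __name__ == "__main__":
    check = AIChangeCheck(
        data_provenance="Anonymised support tickets, consent recorded",
        decision_explanation="Classifier with top features logged per prediction",
    )
    for question in check.unanswered():
        print(f"Unanswered before release: {question}")
```

The point is not the code itself: the same three questions could just as easily live in a pull request template or a team’s definition of done. What matters is that the prompts are short, visible and answered before release, not after an incident.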
Two examples, two outcomes
Take two similar organisations rolling out AI tools for customer support.
In the first, the team delivers quickly but overlooks a crucial issue: the model assigns lower ratings to queries from older users. Complaints follow, the feature is pulled, and trust dips.
In the second, the team builds in short checkpoints. They catch a similar issue early and fix it before release (a minimal sketch of such a check appears below). The launch is smooth and the outcome fair. Regulators take notice and praise the approach.
Neither team was trying to cause harm. The difference was how they worked.
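The article does not describe how the second team’s checkpoint worked, but one simple illustration, offered here purely as an assumption, is a group-wise comparison: average the model’s scores per age band and flag the release for review if the gap is wide. The function names, the ‘age_band’ and ‘priority_score’ fields, the 0.1 threshold and the synthetic sample are all hypothetical.

```python
from statistics import mean


def average_score_by_group(records: list[dict], group_key: str, score_key: str) -> dict[str, float]:
    """Average model score per group (e.g. per age band). An illustrative helper, not a full fairness audit."""
    groups: dict[str, list[float]] = {}
    for record in records:
        groups.setdefault(record[group_key], []).append(record[score_key])
    return {group: mean(scores) for group, scores in groups.items()}


def flag_for_review(averages: dict[str, float], max_gap: float = 0.1) -> bool:
    """Flag the release if the best- and worst-served groups differ by more than max_gap."""
    return (max(averages.values()) - min(averages.values())) > max_gap


if __name__ == "__main__":
    # Tiny synthetic sample, purely for illustration.
    sample = [
        {"age_band": "18-34", "priority_score": 0.82},
        {"age_band": "18-34", "priority_score": 0.78},
        {"age_band": "65+", "priority_score": 0.55},
        {"age_band": "65+", "priority_score": 0.60},
    ]
    averages = average_score_by_group(sample, "age_band", "priority_score")
    if flag_for_review(averages):
        print(f"Review before release: scores differ across age bands: {averages}")
```

A check this small will not catch every problem, but it fits inside an agile cycle and turns ‘is this fair?’ from an abstract worry into a question the team answers on every release.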
Governance is behaviour, not a PDF
A written policy helps, but it’s not enough. Agility depends on behaviour, not just structure.
Ask:
- Do teams know when to raise a concern?
- Are ethical issues logged, reviewed and shared?
- Do staff feel safe to ask hard questions?
Real agility comes when teams understand their limits and still deliver. That’s only possible with clarity, and clarity depends on governance that’s lived, not just written.
Your role as a tech professional
If you work in delivery, product, engineering or data, your voice matters. You may not run compliance, but you influence how tools are shaped.
What you can do:
- Ask questions early in the process
- Flag blind spots that others may miss
- Offer simple ideas to build confidence in new tools
You don’t need to have all the answers. Just showing that you care about outcomes helps build trust across the team. Others will follow your lead.
Even a short question in a stand-up or a note in a retro can shape direction. Influence doesn’t always come from hierarchy. It comes from curiosity and care. The more voices that raise ethical concerns, the stronger and more sustainable your delivery becomes.
Let’s not wait for regulation
Formal rules will catch up — but most organisations can’t afford to wait. Getting ahead of regulation isn’t about guesswork. It’s about showing that your organisation takes responsibility seriously.
This applies across sectors. Whether you’re in finance, healthcare, or education, trust and transparency are central. That’s especially true when using AI tools that influence decisions.
The cost of getting it wrong is rising. So is public awareness.
This is not about perfection
Responsible AI isn’t about getting every call right. It’s about knowing when to stop and ask, is this fair? Is this safe? Is this aligned with what we stand for?
Every tech leader will face those questions. Mistakes will happen. What matters is how we respond. A culture of learning beats a culture of blame. When teams reflect on what went wrong and why, ethical maturity improves alongside delivery performance.
When ethics is embedded, you move with purpose, not just speed.
Final thoughts: agility needs guardrails
The pressure to deliver with AI is real. So is the risk of skipping steps. It’s tempting to think ethics slows things down, but the opposite is true.
When teams have strong guidance, they move faster. When they share the same values, they spend less time second-guessing. When leaders model what responsible delivery looks like, it creates space for progress, not fear.
Ethics isn’t a delay. It’s how we protect our people, our users and our reputation. And it’s how we make sure the AI we build today helps, not harms, tomorrow.
Read the white paper AI Ethics and Governance for Organisational Agility by Giles Lindsay FIAP FBCS FCMI.