Martin Jewiss MBCS, Customer and User Experience Director at The Forge Partnership and Early Careers Advocate for the BCS Agile Methods SG, explains how organisations can dramatically improve the odds of success for their growth strategies and product innovation.
Only a third of well-formed growth strategies succeed. Product innovation success rates range from 5% to 17%. These are odds you can find at a roulette table. Yet research shows some organisations achieve 72% greater profitability, 41% faster revenue growth, and an 86% product innovation success rate. What do they have in common? They’re customer aligned, and they balance building things right with building the right things.
Building the thing right
Many organisations have shifted from traditional project management (skills, scope, budget, timescale) towards business agility. Complex projects with emerging requirements and changing external influences are often navigated using tools such as Scrum, Kanban, or Flight Levels for organisation-wide agility. Tactics such as aligning on common goals, prioritising by expected value, understanding what 'done' looks like, and identifying bottlenecks in value creation are now well established.
The shift came about because traditional approaches project an illusion of certainty. They assume we can define requirements completely upfront, that nothing will change, and that following the plan rigidly equals success. The illusion rarely survives the realities of changing requirements, unanticipated challenges and new insights gained during the project. Agile approaches acknowledge this complexity and provide mechanisms for responding.
But even with excellent execution, we can still end up with no valuable outcomes. Technical excellence plus adaptable project organisation does not equal business success. Strategies and products can still fail. Why is that?
Building the right thing
Customers are missing from this conversation so far. The Desirability, Viability, Feasibility (DVF) Framework is a cornerstone of IDEO's Design Thinking approach, developed in the 2000s. It helps evaluate potential and ongoing activities (such as strategy or product development) through three criteria: technical feasibility, economic viability and customer desirability. I like to add a fourth: cultural integrity (whether the potential activity aligns with or is authentic to the organisation’s internal culture).
The argument is straightforward: we dramatically improve our chances of creating valuable outcomes when we balance all three (or four) criteria. So how do we identify what our customers really need, especially when we have a limited understanding of them?
Discovery: sometimes overdone, often underdone
The typical approach is to begin with upfront discovery work. Contextual inquiry (observing customers in their environment), jobs-to-be-done analysis, customer journey mapping, user story maps, and innovation games can all be instrumental. They help us understand what customers need and want in an authentic context: customers don't want a better drill (the obvious solution for an organisation to build); they want a hole in the wall so they can hang a picture.
But traditional discovery has challenges. The outputs often gather dust, ignored on a delivery team’s office wall or forgotten in a digital archive, delivering no value whatsoever. They are only helpful when the lessons are communicated and adopted throughout the team. Ideally, everyone should understand and work in the service of customers.
We also see pushback to upfront discovery. Sometimes this is valid — too much effort for little return. Other times it comes from leadership attitudes like ‘we’ll focus on the business needs’ or ‘we know exactly what our customers want’ or simply ‘we just want to get started’. These attitudes originate from assumptions, often incorrect, that introduce a foundational risk upon which all subsequent effort is built.
The 40-70% rule is helpful here: aim to make decisions with somewhere between 40% and 70% of the available information. Any less understanding and we are relying too heavily on assumptions; any more and we are taking too long while the window of opportunity closes. Consider whether decisions are one-way or two-way doors. If a decision is quick and painless to reverse, running an experiment is likely quicker and cheaper than discovery. If it would be costly or difficult to reverse, invest more in understanding first.
Objectives and key results
Whether we can do upfront discovery or not, the key approach is to test as we go and measure ongoing value to the customer. Customer-centric objectives and key results (OKRs) provide the mechanism.
Objectives are qualitative and aspirational, and are used to align the organisation (or at least divisions and teams) on the goals of a project or initiative.
Key results are quantitative, measurable and indicate incremental progression towards objectives. I like the phrasing Jeff Gothelf and Josh Seiden use: ‘Who does what by how much?’ This grounds the key results in customer or user behaviour. We identify behaviours that indicate value has been obtained by customers, put in place actions aimed at increasing those behaviours and measure the actual outcomes.
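To make this tangible, here is a minimal, purely illustrative sketch of a 'who does what by how much' key result expressed as a small data structure. The segment, behaviour, and numbers are hypothetical assumptions, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class KeyResult:
    """A 'who does what by how much' key result (illustrative only)."""
    who: str          # the customer or user segment
    does_what: str    # the observable behaviour that signals value
    baseline: float   # where the behaviour is today
    target: float     # where we want it to be by the review date

# Hypothetical example: new customers completing their first order
kr = KeyResult(
    who="new customers",
    does_what="complete a first order within 7 days of sign-up",
    baseline=0.22,    # 22% today
    target=0.35,      # 35% by the end of the quarter
)

def progress(measured: float, kr: KeyResult) -> float:
    """Fraction of the way from baseline to target, clamped to 0..1."""
    span = kr.target - kr.baseline
    return max(0.0, min(1.0, (measured - kr.baseline) / span)) if span else 1.0

print(f"Progress: {progress(0.27, kr):.0%}")  # a measured 27% is roughly 38% of the way
```

The point of the structure is that every field describes customer behaviour, not team output: there is nowhere to record 'feature shipped'.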
The critical distinction is between measuring outcomes and measuring outputs. Launching a feature is an output; customers using that feature to solve their problem and then switching from a monthly to an annual subscription is an outcome. The former tells us we shipped something; the latter tells us we created value.
With each measurement cycle, we can decide whether to continue with what we are doing, adjust our approach, or learn more about the challenge. If we need to learn more, we may need to revise our objectives during our next review. We will often want to change our actions or approaches, whether the objectives remain the same or change. This creates a continuous feedback cycle: understand, decide, deliver/learn, measure, repeat.
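As a sketch of that decision step in code, the following hypothetical rule chooses between continuing, adjusting the approach, or going back to learn more. The thresholds and parameter names are assumptions for illustration, not a recommended policy.

```python
def decide(progress_to_target: float, confidence_in_problem: float) -> str:
    """Illustrative decision rule for one measurement cycle (thresholds are arbitrary).

    progress_to_target: fraction of the way from baseline to the key result target (0..1).
    confidence_in_problem: how well we believe we understand the customer's problem (0..1).
    """
    if progress_to_target >= 0.7:
        return "continue"       # the behaviour is moving as hoped
    if confidence_in_problem < 0.5:
        return "learn more"     # revisit discovery; the objective itself may need revising
    return "adjust approach"    # keep the objective, change the actions

# Hypothetical review: 15% of the way to target, but we trust our understanding of the problem
print(decide(progress_to_target=0.15, confidence_in_problem=0.8))  # -> "adjust approach"
```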
Regular, frequent measurement of outcomes (changes in customer behaviour) and the evaluation of our activities are essential. With clear strategic objectives, just enough deeper research into challenges and ongoing assessment of customer behaviours, we can navigate an efficient route to achieving our goals.
Proxy measurements, such as heatmaps and screen recordings, carry their own dangers. We interpret what we see based on assumptions, and those assumptions can be wrong; we are viewing customer behaviour through a partially obscured window.
Final thoughts
We still aim to balance customer needs with economic viability, technical feasibility and potentially cultural integrity. If we can get close to achieving that balance, we see an optimal flow of value creation throughout an organisation.
Returning to those success and failure statistics: if you are executing a growth strategy or developing a product based on unvalidated assumptions, how much of your own money would you invest? Remember, there is a 65-95% chance of failure. And how much would you invest in a customer-centric organisation hitting an 86% success rate for product innovation, or achieving 72% greater profitability from its growth strategies?
The choice between a bet and an investment often comes down to whether we build the right thing and build it right.