A study in project failure

Dr John McManus and Dr Trevor Wood-Harper

Research highlights that only one in eight information technology projects can be considered truly successful (failure being defined as those projects that do not meet the original time, cost and quality requirements).

Despite such failures, huge sums continue to be invested in information systems projects and written off. For example, the cost of project failure across the European Union was €142 billion in 2004.

The research looked at 214 information systems (IS) projects; at the same time, interviews were conducted with a selected number of project managers to follow up issues or clarify points of interest. The period of analysis covered 1998-2005; the breakdown of the projects examined across the European Union by sector is shown below.

Number of IS projects examined within European Union

Rank   Sector               No. of projects examined
1      Manufacturing        43
2      Retail               36
3      Financial services   33
4      Transport            27
5      Health               18
6      Education            17
7      Defence              13
8      Construction         12
9      Logistics             9
10     Agriculture           6
       Total                214

 

Project value in millions of Euros

Value range (€ millions)   Number of projects   Percentage (%)   Cumulative (%)
0 – 1                       51                   23.831           23.831
1 – 2                       20                    9.346           33.177
2 – 3                       11                    5.140           38.317
3 – 5                       33                   15.421           53.738
5 – 10                       4                    1.869           55.607
10 – 20                     87                   40.654           96.261
20 – 50                      6                    2.804           99.065
50 – 80                      2                    0.935          100.000
Totals                     214                  100.000          100.000
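
The percentage and cumulative columns above follow directly from the raw project counts. A minimal Python sketch of that arithmetic (the band labels and counts are taken from the table; differences in the final decimal place are just rounding):

    # Recompute the percentage and cumulative columns of the project-value
    # table from the raw project counts (N = 214).
    bands = ["0-1", "1-2", "2-3", "3-5", "5-10", "10-20", "20-50", "50-80"]
    counts = [51, 20, 11, 33, 4, 87, 6, 2]

    total = sum(counts)          # 214
    cumulative = 0.0
    for band, n in zip(bands, counts):
        pct = 100.0 * n / total  # share of all projects in this value band
        cumulative += pct
        print(f"{band:>6} | {n:>3} | {pct:7.3f}% | {cumulative:8.3f}%")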

 

At what stage in the project lifecycle are projects cancelled (or abandoned as failures)?

Prior research by the authors in 2002 identified that 7 out of 10 software projects undertaken in the UK adopted the waterfall method for software development and delivery. Results from the analysis of cases indicate that almost one in four of the projects examined were abandoned after the feasibility stage; of those projects completed, approximately one in three had schedule and/or budget overruns.

Project completions, cancellations and overruns

Waterfall lifecycle stage   Projects cancelled   Projects completed   Projects overrun (schedule and/or cost)
Feasibility                 None                 214                  None
Requirements analysis       3                    211                  None
Design                      28                   183                  32
Code                        15                   168                  57
Testing                     4                    164                  57
Implementation              1                    163                  69
Handover                    None                 163                  69
Percentages                 23.8%                76.2%
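
Reading the 'Projects completed' column as the number of projects still live after each stage, both it and the closing percentages can be derived from the per-stage cancellation counts. A small illustrative Python sketch (figures taken from the table above):

    # Derive the running 'projects completed' figures and the closing
    # percentages from the per-stage cancellation counts.
    stages = ["Feasibility", "Requirements analysis", "Design", "Code",
              "Testing", "Implementation", "Handover"]
    cancelled = [0, 3, 28, 15, 4, 1, 0]    # cancellations at each stage

    total = 214
    remaining = total
    for stage, c in zip(stages, cancelled):
        remaining -= c                     # projects still live after this stage
        print(f"{stage:<22} cancelled={c:<3} remaining={remaining}")

    print(f"Cancelled: {100 * sum(cancelled) / total:.1f}%")   # 23.8%
    print(f"Completed: {100 * remaining / total:.1f}%")        # 76.2%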

 

Of the initial 214 projects studied, 51 (23.8 per cent) were cancelled; a summary of the principal reasons why projects were cancelled is given below. Our earlier research elaborated on the symptoms of information systems project failure in three specific areas: frequent requests by users to change the system; insufficient communication between the different members of the team working on the project and the end users (stakeholders); and no clear requirements definitions. Whilst communication between team and end users was still perceived as an issue within some projects, the top three issues from this study were: business process alignment; requirements management; and overspends.

One notable causal factor in these abandonments was the lack of due diligence at the requirements phase; important factors here were the level of design skill available and poor management judgement in selecting software engineers with the right skill sets. Equally, the authors found some evidence of poor tool set selection, in that end users found it difficult to sign off design work: they could not relate the process and data model output to their reality and practical knowledge of the business processes.

Key reasons why projects get cancelled

Business reasons

  • Business strategy superseded;
  • Business processes change (poor alignment);
  • Poor requirements management;
  • Business benefits not clearly communicated or overstated;
  • Failure of parent company to deliver;
  • Governance issues within the contract;
  • Higher cost of capital;
  • Inability to provide investment capital;
  • Inappropriate disaster recovery;
  • Misuse of financial resources;
  • Overspends in excess of agreed budgets;
  • Poor project board composition;
  • Take-over of client firm;
  • Too big a project portfolio.

Management reasons

  • Ability to adapt to new resource combinations;
  • Differences between management and client;
  • Insufficient risk management;
  • Insufficient end-user management;
  • Insufficient domain knowledge;
  • Insufficient software metrics;
  • Insufficient training of users;
  • Inappropriate procedures and routines;
  • Lack of management judgement;
  • Lack of software development metrics;
  • Loss of key personnel;
  • Managing legacy replacement;
  • Poor vendor management;
  • Poor software productivity;
  • Poor communication between stakeholders;
  • Poor contract management;
  • Poor financial management;
  • Project management capability;
  • Poor delegation and decision making;
  • Unfilled promises to users and other stakeholders.

Technical reasons

  • Inappropriate architecture;
  • Insufficient reuse of existing technical objects;
  • Inappropriate testing tools;
  • Inappropriate coding language;
  • Inappropriate technical methodologies;
  • Lack of formal technical standards;
  • Lack of technical innovation (obsolescence);
  • Misstatement of technical risk;
  • Obsolescence of technology;
  • Poor interface specifications;
  • Poor quality code;
  • Poor systems testing;
  • Poor data migration;
  • Poor systems integration;
  • Poor configuration management;
  • Poor change management procedures;
  • Poor technical judgement.

What is the average schedule and budget overrun?

In examining the cases it was noted that the average duration of a project was just over 26 months (115 weeks) and the average budget was approximately €6 million (Table 5). In many instances, information on a project being over schedule and over budget will force senior management to act; however, the search for the underlying factors should begin elsewhere in the project's history.

The pattern that emerges from a synthesis of case data is complex and multifaceted. In a few of the cases examined the project commentary and history was ambiguous; however, once a decision had been made to support a project which was over schedule or over budget, the ends usually justified the means, irrespective of the viewpoints of individual project managers or stakeholders.

Cost and schedule overruns (N=69)

Projects from sample   Schedule overrun   Range                   Cost overrun
2 (2)                  11 weeks           Average budget + 10%    €600,000
11 (13)                29 weeks           Average budget + 25%    €1,500,000
19 (32)                46 weeks           Average budget + 40%    €2,400,000
25 (57)                80 weeks           Average budget + 70%    €4,200,000
12 (69)                103 weeks          Average budget + 90%    €5,400,000

Cumulative project totals are shown in brackets.
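
The Euro figures in the 'Cost overrun' column appear to be the overrun percentage applied to the average project budget of roughly €6 million quoted above. Assuming that is how they were derived, the following Python sketch reproduces them; it is an illustrative cross-check, not part of the original analysis:

    # Cross-check: cost overrun ≈ overrun percentage × average project budget.
    # The €6m average budget is quoted in the text; percentages come from the table.
    AVERAGE_BUDGET_EUR = 6_000_000

    for pct in (10, 25, 40, 70, 90):
        overrun_eur = AVERAGE_BUDGET_EUR * pct / 100
        print(f"Average budget + {pct}% -> EUR {overrun_eur:,.0f}")
        # Prints 600,000 / 1,500,000 / 2,400,000 / 4,200,000 / 5,400,000,
        # matching the 'Cost overrun' column of the table.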

What are the major causal factors contributing to project failure?

Judgements by project stakeholders about the relative success or failure of projects tend to be made early in the project's life cycle. On examination of the project stage reports it became apparent that many project managers plan for failure rather than success.

If we consider the inherent complexity of risk associated with software project delivery it is not too surprising that only a small number of projects are delivered to the original time, cost, and quality requirements.

Our evidence suggests that the culture within many organisations is often such that leadership, stakeholder and risk management issues are not factored into projects early on; in many instances they cannot formally be written down for political reasons, and they are rarely discussed openly at project board or steering group meetings, although they may be discussed at length behind closed doors.

Despite attempts to make software development and project delivery more rigorous, a considerable proportion of delivery effort results in systems that do not meet user expectations and are subsequently cancelled. In our view this can be attributed to the fact that very few organisations have the infrastructure, education, training, or management discipline to bring projects to successful completion.

One of the major weaknesses uncovered during the analysis was the total reliance placed on project and development methodologies. One explanation for the reliance on methodology is the absence of leadership within the delivery process. Processes alone are far from enough to cover the complexity and human aspects of many large projects subject to multiple stakeholders, resource and ethical constraints.

Although our understanding of the importance of project failure has increased, the underlying reasons remain an issue and a point of contention for practitioners and academics alike. Without doubt there is still a lot to learn from studying project failure.

Going back to the research undertaken, there is little evidence that the issues of project failure have been fully addressed within information systems project management. Based on this research, addressing project failure requires recognition of the influence multiple stakeholders have on projects, and a broad-based view of project leadership and stakeholder management.

Developing an alternative methodology for project management founded on leadership, stakeholder and risk management should lead to a better understanding of the management issues that may contribute to the successful delivery of information systems projects.

June 2008

Comments (21)

  • 1
    Shashi Kadapa wrote on 19th Mar 2010

    Sir, a very nice and informative article, and really surprising that in spite of these findings, projects still continue to be initiated without any risk management strategy.

    The Waterfall method is a very robust methodology and it is surprising how its misuse has led to increased project failures.

    The study was done during the years 1998 - 2005, when the IS developments methods were still not mature. Perhaps an updated research finding by the respected authors would show if the condition has worsened or reduced.


  • 2
    Andrew Briggs wrote on 20th Mar 2010

    This article reads as if no such alternative methodologies for software project management exist. I would be interested to see the statistics for the projects which are employing non-sequential development models.

    "The Waterfall method is a very robust methodology" - Maybe, but it's rarely suited to software development, and yet still seems to be the default choice for many projects. To me at least, it's not surprising that it's use often results in project failures.

    With regard to whether the situation is improving, this interview with the chairman of the Standish Group would certainly suggest that things are getting better:
    http://www.infoq.com/articles/Interview-Johnson-Standish-CHAOS


  • 3
    regis makoni wrote on 10th Jun 2010

    A jolly good article. I am from Zimbabwe.


  • 4
    Tat Keong wrote on 23rd Jun 2010

    This is for the authors of the article. I am looking into a more "user-oriented" approach to Project Management and processes within a project to reduce failure. Would like to discuss with you, if you are interested.


  • 5
    Barry Deane wrote on 2nd Jul 2010

    When managers and business owners are asked to invest in IT, a reflection is often heard that goes something like, "...the software is the 'enabling' tool only. If the processes that the software is to support aren't in place, the IT system won't do any better." This is to say that if we can't do the work effectively before we IT-enable it, we won't do any better with IT.

    Some will say that this is a simple and obvious reflection. However, this really is the point that remains unattended after many years of technical advance and IT-centric 'break or break through' approaches to aligning the organization with the 'enabling' technology.

    Could it be that there is a whole school of thought - a whole specialization - missing? Could it be that no matter how hard we try as IT technologists and process specialists, that our theorizing about what makes people tick, what makes organizations tick etc might be wide of the mark? Consider these
    quotes (from a longer passage):

    "The absence of a language, concepts and a general theory of administration (i.e. organization)* is a serious impediment to the efficiency of industry.....

    In the absence of a body of knowledge which can be taught, training has to rely on the
    uncertain process of 'learning by doing'. This is a 'hit or miss' approach, which is just as capable of leaving minds fogged by fantasy notions and unreal ideas as of instilling sound knowledge....

    What is the present state of explicit conceptual knowledge about administration in general? There is no general theory, and a great paucity of concepts and hypotheses about managerial processes.....

    The study of administrative methods is the study of people at work, their behavior, their relationships, the way work is split up between different roles, and the often unrecognized social institutions which companies have established and are using....I would ask you to consider the following: Are you quite sure that your notions about your own practices are consistent with the reality of what really takes place in your own company....?"

    Wilfred Brown 1960 'Exploration in Management'
    * My clarification

    Way back then (i.e. 1960) Brown (a long-time CEO writer, researcher and industry policy maker) was drawing our attention to the pressing need to understand the nature of human work and of people at work; of organizing, managing and leading; of task assignment, performance management; of structure and levels of work; of role definition and role relationships; of systems and processes.

    More particularly, his work and that of his collaborators was concerned with understanding and defining the total system of organization - a system which is many-faceted, with interlinked
    constructs, principles and practices; it could be said, a non-linear, 'socio-technical' system.

    Trying to understand why and how employees may work with or 'work around' a new ERP system is, first and foremost, informed by an understanding of the 'institutional conditions' of the workplace/organization that they find themselves in.

    More often, however, our starting point is to find fault with the resisting individuals - to have them removed; to measure them against a KPI; to 'make them' follow through a particular process by the way a screen is written; to classify them as non-compliant in a whole range of different ways. Sure,
    individuals must perform and behave to agreement. However, the starting point must be to understand the organization; its design and how we deploy it.

    With the growth of IT apps beyond personal computing, into the enabling of collective work, the IT industry has approached an inevitable organizational confrontation; i.e. trying to understand the 'people' issues using 'common sense' together with the readily available and 'proven' 'process' tools - many of which came from the Deming days of process analysis and improvement - and the many consultant-driven derivatives (e.g. Quality Circles, TQM, Six Sigma).

    Whilst the specific 'process-related' knowledge generated is undoubtedly solid, the strong tendency to add on ideas about how to execute (e.g. how to organize; how to lead) have got buried inside them significant and untested assumptions (and serious fallacies) about organization design and deployment - and the natural laws that we now know govern these matters.

    The IT approach to executing; to implementing new systems, quite often includes the advocacy of an alternate organization within the client company - which has inevitably poor consequences.

    Unfortunately, many of the untested (i.e. no science or evidence) organization/management/leadership ideas that have been generated in the process improvement/IT industries have been around for so long now that they have become 'received wisdom' and are tightly held, shared ideas. They are difficult to challenge - especially in the high-pressure world of major IT projects.

    Where these ideas (i.e. about the cause and effect of organizing and managing) are fallacious and shared (i.e. agreed to by a group of like-minded people) a maladaptive culture will be formed. Signs to watch for in a maladaptive culture are high rates of project failure; inability to forecast completion,
    scope, budget; inability to drive efficiency (i.e. because we don't have effectiveness); people continuing to speculate widely about the cause and effect of success and/or failure; unforeseen disasters; unforecastable contingencies/risks.

    Part of a recent debate in the IT industry has centered around 'people' and 'organization' having something to do with success/failure of IT projects (i.e. IT apps enabling collective work). As part of this debate, the acronym OCM has appeared (i.e. Organizational Change Management) with the suggestion that this should be the starting point for an IT system implementation. That's good; I agree. But what is OCM; really, what should it be? What is to be done as OCM; on what basis?

    My starting comment warns that if the processes aren't in place, supported, documented etc then, "...you are setting yourself up for failure". I agree, but what to do about it?

    Some very good work has been done - and by fully functioning real-life companies - that goes to the heart of these questions about organization, managing and leading. Not much of it can be found in academia (academics are simply not managers). However, there is a deeply researched, tested and solid (and practical) body of work available. It was started by Wilfred Brown (now deceased).

    Anybody who is interested in this (my sincere apologies to those who might have got this far without getting any value from my remarks), should start a web search on Wilfred Brown, Glacier Metals and go from there. In particular there exists an extensive annotated research bibliography of the core research together with records of related research. This can be found at globalro.org and downloaded free.


  • 6
    Stan Yanakiev wrote on 22nd Jan 2011

    "2002 identified that 7 out of 10 software projects undertaken in the UK adopted the waterfall method for software dev." - Is anybody still surprised that Waterfall does not work for Software Development?


  • 7
    Philip Schwarz wrote on 22nd Jan 2011

    Many projects are seen as failures because success is defined as on-time, on-budget and with most of the expected features.

    Rather than saying that a project is failed because it is late, or has cost overruns, Martin Fowler argues that it's the estimate that failed.

    See Fowler's "What is Failure": http://martinfowler.com/bliki/WhatIsFailure.html


  • 8
    Philip Schwarz wrote on 22nd Jan 2011

    Steve McConnell said:

    "We often speak of the software industry’s estimation problem as though it were a neutral estimation problem - that is, sometimes we overestimate, sometimes we underestimate, and we just can’t get our estimates right. But the software does not have a neutral estimation problem. The industry data shows clearly that the software industry has an underestimation problem. Before we can make our estimates more accurate, we need to start making the estimates bigger. That is the key challenge for many organizations."


  • 9
    Paul Cook wrote on 2nd Feb 2011

    A long overdue report and study, well done chaps.


  • 10
    Grant (PG) Rule wrote on 3rd Feb 2011

    Albert Einstein once said "The definition of insanity is doing the same thing over and over again and expecting different results".

    I know that has been said before, but I think it's worth repeating... because for nigh on 60 years organisations have tried to shoehorn the work of software product & process design into the project mold (to mix my metaphors). Many folk seem set on the idea that, if only the developers of software-intensive systems would be more disciplined, and apply well-defined project management methods more rigorously, then more projects would be counted as successes by their stakeholders.

    I've spent over 38 years in the software field, and the relative proportion of failed cf. successful projects has continued to give concern. The figures fluctuate, but the results are clear. There is very much waste, and many dissatisfied stakeholders.

    The project concept has become 'a hammer' for many software professionals... as in the proverb, "To a man with only a hammer, everything looks like a nail!"

    Steve McConnell is correct (of course). There is an inherent problem of underestimation. But that is merely one symptom, not a root condition.

    In 1982, Daniel D. McCracken and Michael A. Jackson wrote: "systems requirements cannot ever be stated fully in advance, not even in principle, because the user doesn't know them in advance, not even in principle". I've found this to be all too true. It is just one of the root conditions that lead to endemic underestimation (and other problems).

    Another root condition is that in many firms, a project manager, employed for their 'project management skills & expertise', is expected to produce a plan, define Product & Work Break-down Structures, and allocate tasks to 'the developers'. Just as if 'the developers' had not been employed for their skills and experience. Where is the entrepreneurial leadership & respect? the responsibility-based planning? the evidence-based decision-making? the exploration of solution options by skilled engineers? Hence, we end up in the situation where: a) crucial decisions are made early when knowledge of the problem and potential solution options are most sparse, leaving only opinion, preference & bias; b) a manager (i.e. non-engineer) dictates what tasks will be performed, and the sequence of their execution; c) developers are de-motivated and do not feel responsible either for the results of the assigned tasks, nor any related estimates; d) executives try to re-motivate workers by setting targets and offering bonuses for 'good people', in ignorance of research that suggests that extrinsic motivators are counter-productive for cognitive, creative work. Etc.

    Perhaps more people should try a different approach to the development of software systems and the related operational value streams?

    If you are interested, I suggest flow production and lean systems thinking as applied to the Integrated ICT Value Stream provides a far more effective solution option.


  • 11
    Peter Freeth wrote on 15th Apr 2011

    Earlier this year, we uncovered over £16 Million in lost revenue for a global engineering company that directly resulted from a lack of management control. The senior management team had conspired to hide project over-runs and budget excesses in their monthly reports, and their problems spiralled, month after month. When they could have dealt with their management problems directly, they chose to hide them instead. Project failure was the symptom, the problem was the management team's inability to control their staff.


  • 12
    Marten Eisma wrote on 10th May 2011

    What I wonder about is based upon the following statements:
    1. A lot of IT projects are just investments in new products/services. So why not compare them with investments? Then cancelling a project early in the lifecycle is a sign of good management and thus a success for waterfall.
    2. Marketing uses a norm that 50% of the money spent is wasted. This is far below what I now see for IT. So why call this a problem?
    3. I see a budget overrun of 15% is mentioned as a failure. I don't see why. If a new product/service is a success then we spend more money/time upon the same project (because this is easier than defining a new project). Has this been ruled out?


  • 13
    Mathias Holmgren wrote on 30th Nov 2011

    When is someone going to make a study of project success?

    Studying failure while surely providing opportunity for learning, is also a sure way of breeding fear, which will stifle several components that lead to software project success.

    IMO learning how to succeed requires learning from people who consistently succeed.

    This seems to me to be self-evident in all areas, except business. For instance, if you want to become the best in the world in tennis, you study the best tennis players.

    For some reason, in business the approach is to study all the bad tennis players and the mistakes they make and make a list and then think about how to fix each mistake.

    There is something fundamentally very, very wrong about that approach.


  • 14
    Peter Bowen wrote on 2nd Dec 2011

    I think we are being a bit hard on ourselves here: why would pulling the plug on a project that was not going to be successful be a failure when it was cancelled at the feasibility stage? Why is a project a failure if it goes over budget, when the budget was set before anyone knew what the solution would be? Surely a project should be hailed a success if it delivers value, provides good ROI and the project manager has managed expectations.
    I do however agree that the key to successful projects is leadership with leaders applying methodologies and techniques in the right way. But, given all the things that can cause delay the one thing that is certain is that there will always be failure.


  • 15
    MUCHESIA PROTUS wrote on 23rd Apr 2012

    It is a wonderful document, only that the people who implement these findings on the ground have vested interests in the projects. I will still advise that we put into practice what we have learned from such findings. I am a student at Bondo University (Kenya) undertaking a Master of Information Systems.


  • 16
    Nidhi wrote on 11th May 2012

    Very informative article; I too intend to do some work in this area. One needs to understand how to minimize wastage in terms of time, money and manpower.


  • 17
    Appeos wrote on 26th Sep 2012

    Interesting analysis and all too common.

    I'm researching ways to improve the success rate of complex IT development projects and I wonder if there is an updated version of this report.

    I'd be most grateful if anyone with insight into the issues raised above would make contact, so that I can better understand the issues and help to mitigate the risks.

    Steve Jones
    @Appeos
    appeos.com@gmail.com


  • 18
    iah wrote on 5th Feb 2013

    Great article. Sometimes project requirements are gathered by one technical team, the consultancy is provided by a different team, and later the project is handled by yet another team. So information is not communicated; some of it is missed, which leads to the end result being a different product.


  • 19
    Tom Welsh wrote on 28th May 2013

    Having worked on 12 IT projects in different countries, all funded by international agencies, I recall two partial successes and the rest failures with one such a failure that it has been financed at least three times since!


  • 20
    qadit wrote on 20th May 2014

    Loved this post and shared it with all my colleagues. Thank you so much!
    IT Security India


  • 21
    ikiwisi wrote on 17th Oct 2014

    I like the (accidental?) irony of comment #10:

    Albert Einstein once said "The definition of insanity is doing the same thing over and over again and expecting different results".

    I know that has been said before, but I think it's worth repeating...

