On 14 May, Dr Carolyn Mair, a psychologist at the London College of Fashion, which among other things runs courses on fashion projects, and Professor Martin Shepperd of Brunel University, a luminary in software engineering management, gave a presentation to PROMS-G in London examining the problem of cognitive bias in estimating the costs and benefits of potential projects. Bob Hughes reports.

Generally, the figures show that IT practitioners tend to overestimate the benefits of projects and underestimate their duration and costs. In the latter case the average over-run tends to be about 30 per cent.

This was an excellent follow-on to April's talk to the PROMS-G group in London by BCS author Mike Blackstaff, which warned of the pitfalls that can undermine the financial appraisal of proposed IT projects. Mike had emphasised the need for a fact-based, quantifiable foundation for the assessment of potential IT projects. However, he would be the first to say that judgements about project proposals can never be infallible, because they are based on predictions about an uncertain future.

To make a good prediction about a new project, you first need to know all the relevant facts of the case. Often you do not have time to find out these facts, or they are simply not accessible, so you have to make assumptions. You then apply these facts and assumptions to the future, where even things that are currently facts may no longer hold. No wonder there are problems.

A key lesson of the May talk was that all estimates need to be accompanied by an assessment of their degree of certainty. This may mean supplying a range of possible figures, with one identified as the most likely. The narrower the range, the more certain you are. Take travelling to work each day - this is my example, not the speakers'.

You may know that usually the drive takes about 40 minutes, but on some days (e.g. during school holidays) it can take as little as 30 minutes, and on other days (e.g. because of a lorry breakdown) an awful lot longer. But, in general, the more time you allow yourself for the journey, the more certain you are not to be late. If you have a crucial meeting at work, then you’ll set out from home in good time, even though it may be at the cost of a second cup of coffee.

In the world of IT projects, planners who are given an estimate in the form of a range can select a particular task duration as the target, taking into account the relative importance of that task not over-running - as in the sketch below.
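To make this concrete, here is a minimal sketch (my own, not the speakers') of one way a planner might turn a range estimate into a target. It models the range as a triangular distribution - a crude but common assumption for three-point estimates - simulates the task many times, and picks the duration that would be met a chosen percentage of the time. The function name and the figures, borrowed from the commuting example above, are invented for illustration.

```python
import random

def target_duration(optimistic, most_likely, pessimistic,
                    confidence=0.8, trials=100_000):
    """Pick a target from a three-point estimate: simulate the task
    with a triangular distribution and return the duration that
    would be met in `confidence` of the simulated runs."""
    samples = sorted(random.triangular(optimistic, pessimistic, most_likely)
                     for _ in range(trials))
    return samples[int(confidence * trials)]

# A routine task: finishing on time 80 per cent of the time may do.
print(target_duration(30, 40, 90, confidence=0.80))
# A task whose over-run would delay the whole project: allow more slack.
print(target_duration(30, 40, 90, confidence=0.95))
```

The point of the sketch is simply that the target is a choice, not a fact: the more costly an over-run would be, the further towards the pessimistic end of the range the planner should set it.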

Once we have acquired a skill, the evidence shows that we underestimate how long a task will take us, but overestimate how long it will take others. A survey in the USA asked 11,400 chief financial officers to make predictions about the financial market. A comparison of their predictions with actual outcomes showed that you would tend to be better off assuming the opposite of what they predicted (a 'negative correlation').
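For anyone unfamiliar with the term, a correlation coefficient puts a number on this: +1 means predictions and outcomes move together, 0 means no relationship, and a negative value means they tend to move in opposite directions. A minimal sketch, with invented figures rather than the survey's data:

```python
from statistics import correlation  # Python 3.10+

# Hypothetical predicted vs. actual market returns, in per cent.
# These figures are invented purely to illustrate the idea.
predicted = [5.0, 2.0, 8.0, -1.0, 4.0, 6.0]
actual = [-2.0, 3.0, -4.0, 5.0, 0.0, -3.0]

# A coefficient near -1 means outcomes tended to move opposite to
# the predictions, which is the pattern the survey found.
print(correlation(predicted, actual))
```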

A particularly striking example of bias is the anchoring effect, where a preliminary piece of information (or just a suggestion) is given too much weight. You could be asked at work to estimate how long a task will take, and someone happens to comment 'last time we did something like that it took a couple of months'. The circumstances of the previous example might have been completely different, and that 'couple of months' might in fact have been 12 weeks, but your thought processes will be overly influenced by that expectation. The anchoring effect was demonstrated very convincingly by a simple experiment with the audience.

Can we do anything about these biases? After all, the whole point is that we are probably not aware of them. It may be possible to become more aware of our biases through training. This might involve genuine critical evaluation of the performance of completed projects and, more crucially, an attempt to apply the lessons learnt to the planning of future projects.

If you are interested in these issues, the book by the Nobel Prize winner Daniel Kahneman, 'Thinking, Fast and Slow', was highly recommended. If you are short of time, the Wikipedia entry on 'cognitive bias' is pretty good.