Last week I was speaking at the Data Surgery run by the BCS Data Management special interest group and obviously took the chance to drop in on some of the other sessions.

The two that caught my eye (other, of course, than my own splendid contributions) were the sessions on big data and on how you put a value on data quality so that you can sell it internally, thereby getting your initiative funded.

The second clearly resonates with the work I have been doing recently with Experian. I was going to start with big data, but given that my initial thoughts on BD ran to nearly 1,500 words, and this is supposed to be a blog not a thesis, I will leave that until next time.

The sessions were structured as a short presentation followed by a lengthy Q&A discussion, rather than the more standard long presentation with a few questions at the end. Julian Schwarzenbach of DPadvantage (and the principal host behind the event) presented on a cost-benefit view of data quality.

He argued that we should look to put a price on the costs of poor data and the benefits of good data, and only when these are overwhelmingly positive in the metric that your organisation values (payback, internal rate of return, net present value etc.) should we go to our masters with a plan. All of which reminds me of a presentation I saw by Larry English quite a few years ago on the cost of scrap and rework and his Total Data Quality framework.
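For anyone who wants to see how those appraisal figures are arrived at, here is a minimal sketch of two of them, payback period and net present value. All of the cash flows below are hypothetical, invented purely to illustrate the mechanics; substitute your own estimates of data quality costs and benefits.

```python
# Sketch of two of the appraisal metrics mentioned above: payback period
# and net present value (NPV). All figures are hypothetical placeholders
# for a data quality initiative's estimated costs and benefits.

def npv(rate, cash_flows):
    """Net present value of yearly cash flows.

    cash_flows[0] is the up-front cost (negative); subsequent entries
    are the yearly net benefits.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_years(cash_flows):
    """Years until cumulative cash flow first turns positive (None if never)."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None

# Hypothetical initiative: £250k up front, £90k of net benefit per year for 5 years.
flows = [-250_000] + [90_000] * 5

print(f"NPV at a 10% discount rate: £{npv(0.10, flows):,.0f}")  # ~£91k positive
print(f"Payback period: {payback_years(flows)} years")          # 3 years
```

Internal rate of return is the discount rate at which that NPV falls to zero, which needs a root-finder rather than a one-liner, but the principle is the same: the initiative only gets funded if the numbers beat the organisation's hurdle.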

On the plus side of the account were items like exploiting new sales opportunities, decision making, external reputation, operational efficiency and risk reduction. And here I begin to see the problems. For instance, if we are to exploit new sales opportunities, the sales director may salivate, but he may also ask: what new opportunities? It would be a bold person who could put a hard figure on that value. Similarly with decision making, unless of course you know, and can prove, that bad decision making has occurred because of erroneous data.

Reputation and risk reduction are also difficult to put a monetary value on.

It is in the area of operational efficiency that, I suspect, in most organisations it is possible to show where the hard cash is to be found. To use Larry's metaphor, what is the cost of scrap and rework?

Unfortunately, the costs of large-scale process re-engineering for data quality are easier to quantify than the benefits. Although I agree that getting the folks who capture the data to be more accurate may not, of itself, cost much in terms of capital expenditure, the constant reiteration of the message from middle management to front-line staff, and the monitoring and feedback that go with it, are all very draining.

All of which suggests to me that this is, in the parlance of some of the early game theory, a maxi-max approach: go after the option with the biggest possible win. There are big wins to be had, but at a relatively large maximum cost.

Now I’m the data migration guy, so I (fortunately) don’t have to make this sale, but it seems to me that another possible approach is the mini-max approach: limit your maximum downside. Look for lower-risk activities (at least initially) that will bring in tangible rewards. In other words, in place of the big sell, look for tactical opportunities to make a tangible difference that helps build up your case for the big sell.
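To make the game theory borrowing concrete, here is a small sketch of how the two decision rules differ. The payoff figures for the "big sell" programme and the tactical route are entirely made up for illustration.

```python
# Illustrative sketch of the two decision rules referenced above.
# Each strategy maps to its possible outcomes (payoffs in £k);
# the figures are invented purely to show how the rules differ.

strategies = {
    "big sell (enterprise-wide DQ programme)": [-400, 50, 900],  # worst, middling, best case
    "tactical wins (e.g. a data migration)":   [-20, 60, 150],
}

# Maximax: pick the strategy with the best possible outcome (optimistic).
maximax = max(strategies, key=lambda s: max(strategies[s]))

# Minimax on losses (equivalently, maximin on payoffs): pick the strategy
# whose worst case is least bad (cautious).
minimax = max(strategies, key=lambda s: min(strategies[s]))

print(f"Maximax chooses: {maximax}")  # chases the £900k best case
print(f"Minimax chooses: {minimax}")  # caps the downside at -£20k
```

Maximax chases the biggest prize regardless of the downside; minimax settles for the strategy whose worst case hurts least, which here is the tactical one.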

Which brings us back to data migration. We need good quality data. We have the compelling event of a go-live with £n million of the company’s money invested. If you get the chance, jump on board with us, make that difference, then use that as a springboard for future data quality success.

And this is what I will be talking about next week in the Experian-sponsored webinar 'The Data Game Changer: The dimensions of a Data Migration'.

In it I will show you how a well-run data migration project gets you in front of the very people you most want to impress, and how to show value that will get you a hearing post-migration.

Please join the conversation.

Johny Morris

Follow me on Twitter: @johnymorris