Poor data can cost you money and get you sued, says Tony O’Brien MBCS. To prevent this, data governance needs to be aligned with overall corporate governance.

We are all aware of the old chestnut ‘Garbage In, Garbage Out’, but do the private and the public sectors pay any real attention to the quality of their data?

The costs and effects of poor data quality

Extensive literature has identified the high costs of poor data quality, recognising that firms may lose upwards of 10 per cent of revenues due to poor operational data, together with other serious consequential effects on tactical decision-making and strategy generation.

The results of a survey undertaken by The Data Warehouse Institute back in 2002 estimated that data quality problems cost US businesses $600 billion a year in postage, printing and staff overhead costs alone.

A report published jointly by Dun and Bradstreet and the Richard Ivey School of Business in 2006 forecast that critical data within at least 25 per cent of the Fortune 1000 companies would continue to be inaccurate and that ‘every business function will have direct costs associated with poor data quality’.

A survey conducted by the Economist Intelligence Unit in the same year reported that 72 per cent of respondents said their data was sometimes inconsistent across departments and that workers frequently made poor decisions because of inadequate data.

Larry English, in his latest book published in 2009, outlined a catalogue of corporate disasters emanating from poor quality business information, amounting to a cost of ‘one and a quarter trillion dollars’. A Gartner report stated that: ‘through 2011, 75 per cent of organisations will experience significantly reduced revenue growth potential and increased costs due to the failure to introduce data quality assurance’.

Even where organisations appreciate the importance of quality data, there appears to be a lack of any real progress. A 2007 survey conducted by BusinessWeek and Hewlett Packard highlighted the fact that many organisations will readily agree that their data is an important asset, but fail to take any action.

Indeed, a PricewaterhouseCoopers survey a year later found that 70 per cent of executives considered data to be an important asset, yet only 40 per cent felt they used it effectively. The quality of an organisation’s data not only has significant commercial consequences, but also serious implications for all enterprises as they respond to the huge number of regulatory requirements.

Failure to comply fully can result in serious financial damage to an organisation or even threaten its very existence, even where fraud, deception or other misdemeanours are absent. This not only relates to prominent laws such as Sarbanes-Oxley and Basel II, but to the myriad of government demands for data, whether they be from HM Revenue and Customs, the Office for National Statistics, or any other governmental agency whether UK-based or international.

All have potential penalties for late delivery or erroneous information. There needs to be an alignment between data governance (as described below) and overall corporate governance. For any unified governance, risk and compliance (GRC) strategy to be successful, there has to be confidence in the quality of the inherent data.

The management of data and data governance

Data is both an organisational resource and an enterprise-wide asset, as valuable as any physical, financial or personal asset and therefore must be managed appropriately.

We need to establish first and foremost guiding principles around the management of data. To be really effective we also need to identify and focus on those data sub-sets that hold real value and/or potential risk, rather than attempting to manage ALL of the data fields, many of which will be of low priority. This should be undertaken by establishing:

  • Ownership: Who has actual ‘ownership’ or ‘custody’ of the data on behalf of the organisation as a whole, and thereby has responsibility for its quality.
  • Responsibility: Those persons directly involved in any way with the entry, extraction or manipulation of any part of the data (as data suppliers, processors or consumers).
  • Management: Ensuring operational availability, security and business continuity (the IS department), whilst recognising that everyone in the organisation has a part to play.
  • Data policies: To be set by the organisation together with the ‘owners’ or ‘custodians’.
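The roles above can be captured in something as simple as a data-asset register covering the priority data sub-sets. The sketch below is purely illustrative: the field names and the example entry are assumptions, not a prescribed standard.

```python
from dataclasses import dataclass, field

@dataclass
class DataAsset:
    """One high-value data set and the people accountable for it."""
    name: str                 # the data sub-set worth managing
    owner: str                # accountable for quality on behalf of the business
    stewards: list = field(default_factory=list)  # those who enter/extract/manipulate it
    managed_by: str = "IS department"             # availability, security, continuity
    policies: list = field(default_factory=list)  # data policies agreed with the owner

# Focus on the sub-sets that carry real value or risk,
# not every field in the enterprise. Entries here are hypothetical.
register = [
    DataAsset(
        name="Customer master",
        owner="Head of Sales Operations",
        stewards=["Order entry team", "CRM administrators"],
        policies=["No duplicate customer records", "Postcode mandatory"],
    ),
]

for asset in register:
    print(f"{asset.name}: owned by {asset.owner}, managed by {asset.managed_by}")
```

Even a lightweight register like this makes ownership and custody explicit, which is the precondition for holding anyone to account for quality.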

Data governance

The policy of treating data as an enterprise-wide asset assists in establishing a data governance strategy.

The concept behind adopting a data governance approach is to enable an organisation to create an environment within which data is controlled and coordinated. As with most successful enterprise-wide initiatives, data governance requires a mandate, ideally in the form of sponsorship from a leading executive.

Without a strong mandate for change, a data governance policy and indeed a data quality initiative, cannot hope to be successful.

Data governance refers to the overall management of the data in an organisation, involving not only the security and risks associated with the data, but also determining who are the true owners and custodians of the enterprise’s data assets, procedures, policies and processes; establishing the approach towards data quality; and instilling a culture of data stewardship and quality throughout.

This is not just a data cleaning exercise but a culture change; the policies and initiatives need to be institutionalised so that they become part of the organisational fabric.

Taking action

Fundamental to a data quality improvement initiative are three conceptual elements namely: people, processes and data.

The inter-relationship between these three elements means that any attempt to improve the overall quality of data in an organisation must be centred on people, whether data suppliers, processors or information customers; on the processes that receive, handle, action and pass on data and information; and on the data itself, wherever it sits within the data cycle of input, process and output.

Data quality improvement is not just about fixing data or improving quality within a single business application or process, but also about taking a more expansive and forward-looking enterprise-wide approach.

This must involve addressing cultural issues, initiating both short and long term process and procedural improvements by a step-by-step, incremental approach, whilst ensuring that the data conforms to appropriate specifications or requirements. In this way any improvement initiative has an opportunity to be sustained.
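Checking that data conforms to appropriate specifications can start very simply. The sketch below is a minimal illustration; the field names and rules are invented for the example, and a real initiative would derive its rules from the specifications agreed with the data owners.

```python
# Minimal sketch: checking records against agreed specifications.
# Field names and rules are hypothetical, for illustration only.

def check_record(record, rules):
    """Return the names of the rules a record fails."""
    return [name for name, rule in rules.items() if not rule(record)]

rules = {
    "customer_id present": lambda r: bool(r.get("customer_id")),
    "postcode present":    lambda r: bool(r.get("postcode")),
    "credit_limit >= 0":   lambda r: r.get("credit_limit", 0) >= 0,
}

records = [
    {"customer_id": "C001", "postcode": "SW1A 1AA", "credit_limit": 5000},
    {"customer_id": "",     "postcode": "",         "credit_limit": -10},
]

for r in records:
    failures = check_record(r, rules)
    status = "OK" if not failures else "FAIL: " + ", ".join(failures)
    print(f"{r.get('customer_id') or '<missing id>'}: {status}")
```

The point is not the code but the discipline: specifications are written down once, applied consistently, and every failure is visible rather than buried in a downstream process.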

From a practical perspective an organisation can start ‘small’ and build upon quick wins and successes. Executive and senior management sponsorship is paramount; without this any improvement process will eventually fail. So you may need to build a business case.

Alongside this a project champion is required together with ‘internal champions’ to cascade the initiative(s) across the organisation. Take things slowly to ensure everyone is on board and identify who has the ownership/custody and responsibility as described above.

In this context it will be worth reiterating that data quality is NOT about IT but unequivocally a business issue.

Provide regular, visible measures to monitor progress around the data and processes that matter most in terms of cost and business risk, and at the same time build data quality targets into people’s objectives.
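A visible measure can be as basic as tracking the completeness of the fields that matter, reported regularly so trends are seen. This sketch assumes hypothetical field names; the measure itself (percentage of records populated) is one of the simplest data quality metrics.

```python
def completeness(records, field):
    """Percentage of records in which the given field is populated."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field))
    return 100.0 * filled / len(records)

# Hypothetical extract of customer records.
records = [
    {"customer_id": "C001", "postcode": "SW1A 1AA"},
    {"customer_id": "C002", "postcode": ""},
    {"customer_id": "C003", "postcode": "EC1A 1BB"},
]

# Report a visible measure for each field the business cares about.
for fld in ("customer_id", "postcode"):
    print(f"{fld}: {completeness(records, fld):.1f}% complete")
```

Published weekly or monthly against a target, even a figure this crude gives management something concrete to hold people to.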

Ascertain the root causes of issues and attempt to resolve problems at source. We are essentially looking at principles of quality management and the management of change.

Whilst this article cannot be more than a snapshot to renew awareness and initiate some actions, it will hopefully stimulate further discussion. As stated above, many organisations realise they have a problem with their data, but do not appear to be doing a great deal about it.

(A longer version of this article appeared in the January 2011 edition of the Chartered Secretary magazine).

May I also refer fellow members to the works of probably the three leading proponents of data quality over the last twenty years, namely:

  • Rich Wang at MIT
  • Larry English at Infoimpact
  • Tom Redman at Navesink

You will find a wealth of rich material within their writings.

In addition, two websites should provide you with strong practical advice: