Firstly, though, let me make it plain that I'm no procurement expert. There I defer to others. And I'm certainly no legal eagle. However, let's proceed on the basis that a good contract represents a win-win: the client wants their new system loaded with quality data at a reasonable price; the contractor wants a satisfied customer and to make a profit.
At some point the sharing of risk comes into the contract negotiations, and it is tempting for both sides to push back around the issue of risk. Obviously we expect the software to be usable, and responsibility for that sits with the supplier, but what about the data and, more specifically, the usability of the data?
A canny negotiator on the client side can shift responsibility for data quality onto an unwary supplier by making time a subject of the contract - in other words, by applying penalty clauses or liquidated damages clauses to the timely delivery of the full solution. However, as we have noted here before, data quality has to involve the client's business side as well as the supplier's technology. A supplier does not have the management clout, on their own, to get the knowledge out of the business and into the project - especially not to an arbitrary timeline that might be specified in a contract, no matter how business critical that timeline is to the supporting business case.
Quite quickly a responsibility gap opens up. On the one hand is the supplier, under the cosh of a severe threat to their profitability; on the other is the business, capable of slowing the project down without incurring any responsibility or penalty. I am not suggesting here that the business is acting in bad faith, which would be something else again, but that the normal conservatism of operational departments in the face of radical change can impede progress.
Most suppliers will protect themselves from being caught on the horns of this dilemma by, in one way or another, creating a demilitarised zone. On one side, the client is responsible for providing data in an agreed format that will load; on the other is the supplier's software, picking up, validating and loading. Any data that fails validation is errored straight back at the client.
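To make the shape of that ring fence concrete, here is a minimal sketch in Python of the handover it implies. The field names and validation rules are hypothetical, invented purely for illustration; the point is only that the supplier's responsibility begins at the agreed format, and anything failing validation goes straight back over the fence.

```python
# Hypothetical sketch of the "demilitarised zone" handover: the client
# delivers records in an agreed format; the supplier validates and loads;
# failures are errored straight back. Field names and rules are invented
# for illustration only.

AGREED_FIELDS = {"customer_id", "name", "start_date"}

def validate(record: dict) -> list[str]:
    """Return the reasons, if any, that a record breaches the agreed format."""
    errors = []
    missing = AGREED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if not str(record.get("customer_id", "")).strip():
        errors.append("customer_id is empty")
    return errors

def load_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a client batch into loadable records and rejects.

    Rejects carry their error reasons so they can be errored straight
    back at the client for correction on their side of the fence.
    """
    loaded, rejected = [], []
    for record in records:
        errors = validate(record)
        if errors:
            rejected.append({"record": record, "errors": errors})
        else:
            loaded.append(record)  # in reality: write to the target system
    return loaded, rejected
```

Note what the sketch leaves out: nothing here helps the client work out why their data fails, or who on their side should fix it. That silence is exactly the gap the rest of this piece is about.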
Now this arrangement has been fine for some time, but advances in the integration of Data Migration software are beginning to undermine the integrity of this ring fence. Supplier data landscaping software is used to reach out into legacy systems, and more agile ways of working are blurring the responsibilities of the two parties as the vortex of co-operative working whirls round so much more quickly than a traditional waterfall technique. Also, I think it is no longer responsible for suppliers who wish to be taken seriously in this marketplace to be unable to offer support to an inexperienced client in the full, end-to-end management of a Data Migration exercise.
So how to share risk in a controlled manner? Well, I have for some time been pushing the use of Data Quality Rules (DQR) as a mechanism for managing data quality issues in Data Migration projects. Because DQR were designed against a background of rapid application development (it was DSDM as opposed to Agile - but hey, that probably tells you how long I've been doing this) they adapt easily to modern iterative approaches. And because they are designed to create a set of measurable steps with assigned ownership, responsibility, whether client side or contractor side, is traceable. They also tie those in the business to the project and so bridge the responsibility gap.
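As an illustration of why that traceability falls out naturally, here is a hypothetical sketch of what a single DQR entry might capture. The field names are my own invention, not a standard schema; what matters is that every rule pairs a measurable test with a named owner on one side of the contract or the other.

```python
# Hypothetical sketch of a Data Quality Rule (DQR) entry. The fields are
# invented for illustration; the point is that each rule pairs a measurable
# check with a named owner, so responsibility is traceable to the client
# or the contractor side of the contract.

from dataclasses import dataclass
from enum import Enum

class Side(Enum):
    CLIENT = "client"
    CONTRACTOR = "contractor"

@dataclass
class DataQualityRule:
    dqr_id: str           # e.g. "DQR-042"
    description: str      # the quality issue in business terms
    measure: str          # the measurable test that proves resolution
    owner: str            # the named Data Stakeholder accountable
    side: Side            # which party carries the responsibility
    target_date: str      # an agreed, not arbitrary, resolution date
    resolved: bool = False

# Example: cleansing knowledge sits with the business, so this one
# is owned client side.
rule = DataQualityRule(
    dqr_id="DQR-042",
    description="Customer records with no valid postal address",
    measure="Fewer than 0.1% of customer records fail address validation",
    owner="Named contact in Customer Services",
    side=Side.CLIENT,
    target_date="End of iteration 3",
)
```

Because each rule is dated, measured and owned, a register of them gives both parties an auditable view of who is holding the project up, which is precisely what a contract needs to apportion risk fairly.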
By giving the contractor an equal responsibility within the DQR process, through a defined Data Stakeholder role, plus the accountability outlined above, we can write contracts that share the risks and rewards. Maybe stout fences make for good neighbours, as the poet said, but throwing bad data back and forth over one is not a recipe for a successful migration.