This had gone down less and less well as the message went back up the line to the nervous execs whose jobs were linked to the successful outcome of the bigger business case.
They favoured a more staggered approach.
Move over one of the bigger, more complicated customers first - one that will stretch all the features in the new system - then introduce the rest little by little as confidence grows.
This was causing all sorts of design challenges for the IT guys. What about linked systems? What about accidental double keying? Audit? Etc. etc.
I introduced them to the concept of synchronisation. With synchronisation, changes made in one system are replicated in the other. It comes in (at least) two flavours - forward synch, where changes made in the source system are replicated in the target, and bi-directional synch, where changes made at either end are replicated in the other.
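To make the two flavours concrete, here is a minimal sketch in Python against a toy key-value store. Everything here (the `Change` record, the function names, the last-writer-wins conflict rule) is hypothetical illustration - real replication products work at the transaction-log level, not on whole dictionaries.

```python
from dataclasses import dataclass

@dataclass
class Change:
    key: str        # record identifier
    value: str      # new value for the record
    timestamp: int  # when the change was made

def forward_sync(source_changes, target):
    """Forward synch: changes made in the source are replayed into the target."""
    for c in source_changes:
        target[c.key] = c.value
    return target

def bidirectional_sync(a_changes, b_changes, store_a, store_b):
    """Bi-directional synch: changes made at either end are replicated in the
    other. Conflicts (the same key changed at both ends) are resolved here by
    last-writer-wins on timestamp - one of several possible policies."""
    for c in sorted(a_changes + b_changes, key=lambda c: c.timestamp):
        store_a[c.key] = c.value
        store_b[c.key] = c.value
    return store_a, store_b
```

The conflict-resolution policy is the interesting design choice in bi-directional synch: last-writer-wins is simple, but some migrations need the old system, or the new one, to always take precedence for particular fields.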
Pause a moment and you can see that synchronisation makes a big difference to your thinking. Fallback becomes a cinch. Our friends introduced above can satisfy their bosses without pushing the limits of their design ingenuity. The users can work off the migrated data, but because it is reverse synched with all the old systems, all the existing integration and reporting will continue to work. We can migrate during system up time using background, low-priority threads. Erroneous keying into the "wrong" system will not matter because it will be replicated in the new one, and so on.
It is certainly a big step forward.
And then I read the press release from Golden Gate software - check it out at:
I'm sure it's a good thing that more and more companies are being drawn to the world of Data Migration. They all have something to offer. Golden Gate is best known for its data integration software that is the backbone of many highly available platforms (well it is to me anyway). They have been experts at data replication for years.
So it can only be one small step from data replication to data migration right?
Well sort of - but before we get to the caveats let's extol the virtues eh?
I have to say that this is based only on a reading of a Bloor Research paper available from the Golden Gate site at:
But anyone who knows more please add your comments. (By the way, you have to create a user ID to get access to their papers, but I consider that a fair quid pro quo because they have a lot of good stuff on offer.)
So what is it that they offer? Well, they're leveraging their transform and load capabilities plus their Veridata product, which performs in-line comparisons of source and target tables to find synchronisation differences. These can then be used to drive bi-directional synch.
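The idea of an in-line comparison can be sketched very simply. This is not the Veridata product or its API - just an illustration, under the assumption of two tables keyed on an `id` column, of the three kinds of out-of-synch condition such a comparison surfaces.

```python
def table_diff(source_rows, target_rows, key="id"):
    """Compare two tables keyed on `key` and report the out-of-synch rows:
    keys missing from the target, keys missing from the source, and keys
    present in both but whose rows differ."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    missing_in_target = sorted(k for k in src if k not in tgt)
    missing_in_source = sorted(k for k in tgt if k not in src)
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing_in_target, missing_in_source, mismatched
```

Each of the three result lists maps naturally onto a synch action: replicate forward, replicate back, or apply whatever conflict rule the migration has chosen.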
Seems like a big step forward in migration software? And for a subset of migrations it probably is. For data consolidation and simple upgrades to COTS packages this solution would be fine - especially where you have the software available on site and are already skilled in its use. However, for more complex environments - where there is greater data consolidation and a more selective approach to the data to be migrated, i.e. where you are involved in a migration that may be prolonged over a number of months, like the guys I was talking to at the top of this blog - there is something missing. Leaving aside the need for Landscape Analysis and Data Quality, which could presumably be bought in and bolted on, what is missing here is the intelligent controller.
In the architecture of a Data Migration solution the controller orchestrates the extracts and updates, including the selection criteria, reports on the state of data in transition, manages fallout and rollbacks, etc. There are different flavours of controller, but the functionality, if not necessarily the name, really has to be present so that people like the folks I was talking to earlier can pace their migration at the rate that suits them, watch the flow of activity, and deal with fallout.
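The controller's responsibilities described above can be sketched as a simple loop. This is a hypothetical illustration, not any vendor's product: `select` stands in for the selection criteria, `migrate` for the extract-and-load step, and the `state` map is the report on data in transition. Fallout is quarantined rather than aborting the whole run, so the migration can be paced batch by batch.

```python
def run_migration(records, select, migrate):
    """Minimal controller loop: apply the selection criteria, track each
    record's state in transition, and collect fallout for later handling."""
    state = {r["id"]: "pending" for r in records}   # report on data in transition
    fallout = []
    for record in (r for r in records if select(r)):
        state[record["id"]] = "in-flight"
        try:
            migrate(record)                          # the extract/update step
            state[record["id"]] = "migrated"
        except Exception as exc:
            state[record["id"]] = "fallout"          # quarantine, don't abort
            fallout.append((record["id"], str(exc)))
    return state, fallout
```

Records the selection criteria do not pick up simply stay "pending" - which is exactly the pacing lever the nervous execs at the top of this blog wanted: widen the selection as confidence grows.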
So a full solution from GG? Well probably not but an interesting take on a persistent problem. And if I find out any more I will let you know.
About the author
John Morris has over 20 years experience in IT as a programmer, business analyst, project manager and data architect. He has spent the last 10 years working exclusively on data migration and system integration projects. John is the author of Practical Data Migration.