I was reminded of the work of the late French post-structuralist philosopher Michel Foucault the other day when I met with the guys from Rever.

This was probably not their fault. Except they do have the slogan "Decode the past" in their presentation.

For those who are a little hazy on one of the great twentieth-century thinkers on semiotics and historiography: Foucault published a number of works in his lifetime, culminating in "The History of Sexuality", which came to the UK in translation in the mid-1980s. Foucault's project was to trace the history of concepts of deviance such as criminality, mental illness and sexuality from ancient times, through the Reformation and the Enlightenment, to his contemporary late-twentieth-century western world. The way he did this was to envision himself (and by extension the reader) as an archaeologist of ideas, digging back through history, looking at the way that, say, the definition of insanity was signified at various different times. Given his Marxist origins, these signifiers were as much in the public controls and uses of the individuals concerned as in the written word.

So what has all this no doubt interesting digression into obscure French philosophy got to do with Data Migration, you may well ask? And what has it to do with the poor guys from Rever, who only came over from Belgium to show us Brits some very interesting software?

Well, semiotics – the study of the relationship of symbols (including words) to meaning, as well as the study of the formal relationship of symbols to one another – is pretty much what we do in our day jobs when we are confronted with a large legacy database that has been running a complex major company for decades. As I have pointed out before, if a company has been running a large application for, say, twenty years, with a user population of, say, one thousand spread over a dozen countries, it will have 20,000 person-years of effort represented in its data. Not all of this effort will have been well directed and (to borrow from Foucault's guidance) much of it will have changed use and meaning over time and location.

Also, a lot of meaning will have been embedded in the actions of the people using the data, as opposed to the symbolic representation itself.

All of which leads us back to Rever who, based on more than 60 man-years of effort at Namur University, have a product that will excavate both the database and the human processes as they are instantiated in code, to produce a semantic understanding of the data and its use. Reviewing code is essential – especially in many client–server applications where the DBMS holds only a part of the referential integrity constraints, the rest being in the code.

The output is produced at a conceptual, a logical and a physical model level. This is significant. A physical model is needed to create mappings, but comparing two data sources – one maybe IDMS, the other perhaps COBOL flat files or Oracle tables – needs a level of abstraction a layer higher than the physical if we are to understand and compare them. And when we are talking to the business we need a layer higher still – the conceptual level.
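To make the comparison-at-the-logical-level point concrete, here is a minimal sketch of my own – not Rever's software or API, and every attribute name and type below is invented for the example. Two physical sources are first lifted into a common logical model (attribute name to abstract type), and only then compared; the differences are exactly the structural gap:

```python
# Hypothetical logical models recovered from two physical sources
# (say, one IDMS, one Oracle). All names and types are illustrative.
idms_logical = {
    "customer_id": "number",
    "name": "text",
    "date_of_birth": "date",
}
oracle_logical = {
    "customer_id": "number",
    "name": "text",
    "credit_limit": "number",
}

def structural_gap(a, b):
    """Compare two logical models: attributes present on only one side,
    plus shared attributes whose abstract types disagree."""
    only_a = sorted(set(a) - set(b))
    only_b = sorted(set(b) - set(a))
    mismatched = sorted(k for k in set(a) & set(b) if a[k] != b[k])
    return {"only_in_first": only_a,
            "only_in_second": only_b,
            "type_mismatch": mismatched}

print(structural_gap(idms_logical, oracle_logical))
# -> {'only_in_first': ['date_of_birth'],
#     'only_in_second': ['credit_limit'],
#     'type_mismatch': []}
```

The point of the sketch is that the comparison only becomes mechanical once both sources are expressed at the same level of abstraction; the hard work is the lift, not the diff.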

Using what they call "Modelisation" through a second model, a set of mappings can be produced automatically. Because Rever works at the logical model level, these mappings are free of the redundancies normally associated with transferring data between databases of different formats.
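A toy illustration of why a shared logical model makes mapping generation automatic – this is my own sketch, not Rever's interface, and the field names are invented. Once each physical field on either side is tied to a logical attribute, the source-to-target mappings fall out mechanically:

```python
# Hypothetical bindings of physical fields to logical attributes.
# Source: COBOL-style record fields; target: Oracle-style columns.
source_physical = {"CUST-NO": "customer_id", "CUST-NAME": "name"}
target_physical = {"CUSTOMER_ID": "customer_id", "FULL_NAME": "name"}

def derive_mappings(src, tgt):
    """Join two physical-to-logical bindings on the shared logical
    attribute to produce source-field -> target-column mappings."""
    by_logical = {attr: col for col, attr in tgt.items()}
    return {field: by_logical[attr]
            for field, attr in src.items() if attr in by_logical}

print(derive_mappings(source_physical, target_physical))
# -> {'CUST-NO': 'CUSTOMER_ID', 'CUST-NAME': 'FULL_NAME'}
```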

Indeed, Rever can create a new database schema from the logical model in a new format, so an IDMS database could be transformed into an Oracle database, but without the replication of structures that are only necessary for IDMS. It could then produce the mappings. So if this is the sort of re-platforming exercise you are interested in, Rever may be your answer.
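The re-platforming idea can be sketched in a few lines – again my own assumption of how such a step might look, not Rever's output, with an invented type mapping. Given a logical model, target DDL in the new physical format can be emitted directly:

```python
# Hypothetical mapping from abstract logical types to Oracle types.
SQL_TYPES = {"number": "NUMBER", "text": "VARCHAR2(255)", "date": "DATE"}

def to_oracle_ddl(table, logical_model):
    """Render a logical model (attribute name -> abstract type)
    as an Oracle CREATE TABLE statement."""
    cols = ",\n  ".join(
        f"{attr.upper()} {SQL_TYPES[kind]}"
        for attr, kind in logical_model.items()
    )
    return f"CREATE TABLE {table} (\n  {cols}\n);"

customer = {"customer_id": "number", "name": "text", "date_of_birth": "date"}
print(to_oracle_ddl("CUSTOMER", customer))
```

A real tool would of course carry over keys, constraints and the IDMS-specific structures worth keeping; the sketch only shows why generating the new schema from the logical layer, rather than from the old physical one, avoids dragging source-specific redundancy across.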

This piece of software certainly ranks in the disruptive category of new software offerings. If it delivers what it promises, it could help perform a task that I've been doing by hand for years – structural gap analysis using formal metadata methods. It will allow us to go deeper and further, far more quickly. I have still to be convinced that its owners fully understand the requirements of data migration programmes in the round, though, at least as it was presented to me. There is more needed for a migration controller than the ability to write out data mappings: there is scheduling, flow control, understanding of units of migration, fallout management, selection and so on. There is also field-level data quality analysis, which Rever accept is not part of their product offering.

There is also the issue of understanding how Rever might fit into a bigger programme in terms of the non-technical issues. Going back to Foucault, one of his key insights was to demonstrate that the way individuals acted (in the case of madness, incarcerating or not incarcerating people deemed to be insane) was itself a statement of their understanding of the nature of madness. In our more limited worlds, we need to link into the actions of our business colleagues to understand the meaning of the data to them. The semiotics is not confined to the symbols in the database and their formal use as captured in computer programs. The use of conceptual models is one step along the way to that understanding; what is also needed is a set of actions within our Data Migration methodology that reaches out to the actors in the real world of the business, inhabiting their view of what they do so that we can re-represent them appropriately in the new world we give them in their new systems.

This is why, when we go through our cycles of Data Quality Rules and Key Data Stakeholder Analysis, it isn't just to manage faults, or to be warm and fluffy about engaging with the business. It is fundamental to developing the depth of understanding in the programme that prevents the responsibility gap from developing.

I'm looking forward to (hopefully) further engagement with Rever; in the meantime they can be contacted at www.rever.eu

They are well worth a look if your interests are in data models – which, as we have seen in recent weeks, are coming more to the fore (and quite rightly) in data migration. I think what we need to understand is where, and with what other offerings, they would fit into a generic data migration life cycle. I'm thinking that they may find a place alongside other metadata-model-based products like Celona or Expressor, where Rever would create the models that these other products would consume. As my understanding develops I'll let you know.

Johny Morris

About the author

John Morris has over 20 years' experience in IT as a programmer, business analyst, project manager and data architect. He has spent the last 10 years working exclusively on data migration and system integration projects. John is the author of Practical Data Migration.