Have ENTOTA plugged a major hole in our migration tool sets? This week I’ve been talking to Clive Bellmore of ENTOTA about their ENTOTA Data Migration Portal product for SAP migrations and the lessons it could teach us about enterprise application implementation in general.

First a bit of background on Clive. This is a guy who really knows his data onions. He was Managing Consultant at Acta Technology when it was acquired by Business Objects, and went on to be head of data services at SAP itself when BO was in turn bought up by our friends from Walldorf. There he oversaw the extension of the SAP best practice solutions for data migration. However, as a serial offender in the data services space, he recognised a gap in the integration of best practice and software, and left to set up ENTOTA to create a solution to this problem.

All of which brings us on to the portal. But first let us take a step or two back. Those of you who are familiar with the Practical Data Migration methodology will be aware that within the Migration Strategy and Governance module we have a checklist, and on that checklist we ask about configuration management and change control.

This is because one of the more obvious features of an ill-structured migration project is data which previously loaded just fine failing to load now, because someone has changed something on the target without adequately communicating this to the migration team. This is one of the major consumers of project time and budget that the ENTOTA Data Migration Portal is aimed at fixing.
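To make the problem concrete, here is a minimal sketch (my own illustration, not anything from the ENTOTA product) of the kind of pre-load check that catches an uncommunicated target change. The schema captured at mapping sign-off is compared against the live target just before each load; the field names are invented for the example.

```python
# Hypothetical sketch: detect target-schema drift before loading.
# "expected" is the schema signed off with the mapping; "actual" is what
# the live target looks like now. All names here are illustrative only.

def schema_drift(expected: dict, actual: dict) -> dict:
    """Compare a signed-off target schema against the current one.
    Keys are column names, values are declared types."""
    return {
        "missing": sorted(set(expected) - set(actual)),      # dropped on target
        "unexpected": sorted(set(actual) - set(expected)),   # added on target
        "retyped": sorted(c for c in expected.keys() & actual.keys()
                          if expected[c] != actual[c]),      # type changed
    }

# Example: a column was renamed on the target without telling the migration team.
signed_off = {"MATNR": "CHAR(18)", "MAKTX": "CHAR(40)", "MEINS": "UNIT(3)"}
live_now   = {"MATNR": "CHAR(18)", "MAKTX_L": "CHAR(40)", "MEINS": "UNIT(3)"}

drift = schema_drift(signed_off, live_now)
if any(drift.values()):
    print("Load blocked - target changed since sign-off:", drift)
```

The point is not the code but the discipline: the signed-off design has to live somewhere machine-readable for a check like this to be possible at all, which is exactly what a shared portal provides.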

Secondly, most commentaries on data migration recommend a de-linking of target system design from data migration design - two work streams or projects within one programme of work is the normal recommendation. The reasons for this are twofold. Firstly, we should be designing for the future, not replicating the past.

A clear hand-off between the two streams aids this. Secondly, we lack the tools to easily link data usage within the target system design with the ETL activity within the data migration work stream. And this is a problem that the ENTOTA Data Migration Portal addresses.

A third issue is one of data design. Those of us who have worked on a number of data migrations will be familiar with the situation where the team designing the target do not specify the detail of some of the key entities.

For instance, I was working in a telecoms environment where it was left to the data migration team to populate the code tables that defined the build of individual products, even though this affected the business processes needed to create new instances of each product type. With thousands of interrelated products to define and build on the target, the fact that the task sat with the migration team, and so necessarily started late, caused a significant delay to the project.

Even where the responsibility for this detailed design is clear, there is still a problem of ensuring that the design matches the business need and that the migration matches the design.

This is what ENTOTA and Clive mean by data design and it is what the ENTOTA Data Migration Portal is all about.

So on to the ENTOTA Data Migration Portal itself. The portal incorporates SAP Data Services, which provides it with ETL, data quality and data profiling capabilities. The portal also instantiates pre-built models of SAP objects as the starting point for collaboration between SAP functional experts and client business domain experts, who work on both the use of SAP in a particular environment and the mapping of legacy to target at the same time.

Their SAP design is translated directly into the ETL and, it is claimed, can provide 80 to 90 per cent of the technical build. The remainder can be completed by SAP technical experts, also within the ENTOTA Data Migration Portal, using the embedded SAP Data Services.
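To see why design-to-ETL translation can plausibly yield that much of the build, consider this hedged sketch of the general idea (my own illustration, not ENTOTA's implementation): a declarative legacy-to-target mapping, of the kind functional and business experts would agree in a portal, drives a generic transformation engine directly. All field names and transforms are invented for the example.

```python
# Illustrative only: a declarative field-mapping spec plus a tiny generic
# engine. In this style, most of the "technical build" is the spec itself;
# hand-written code is needed only for the exceptions the spec cannot express.

FIELD_MAP = [
    # (legacy field, target field, transform)
    ("cust_no",   "KUNNR", lambda v: v.zfill(10)),     # pad to SAP-style key length
    ("cust_name", "NAME1", lambda v: v.strip().upper()),
    ("country",   "LAND1", lambda v: v[:2].upper()),
]

def transform_row(legacy_row: dict) -> dict:
    """Apply the agreed mapping spec to one legacy record."""
    return {target: fn(legacy_row[source]) for source, target, fn in FIELD_MAP}

print(transform_row({"cust_no": "4711", "cust_name": " Acme Ltd ", "country": "gb"}))
# {'KUNNR': '0000004711', 'NAME1': 'ACME LTD', 'LAND1': 'GB'}
```

Because the spec is data rather than code, it can also feed the release management and progress reporting described below: counting designed, mapped and built fields is just counting rows in the spec.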

It is easy to see how this solves the problem of breakdowns in the three-way communication between SAP functional designers, business domain experts and SAP technical experts. It also provides out-of-the-box release management and project reporting - how many objects and fields are designed and mapped, how many more to go, which release they are in and so on. It is claimed that this also greatly reduces reconciliation time - although I didn’t really follow this up in our conversation, so I’m not sure whether this operates at the individual record occurrence or at the field level.
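The record-level versus field-level distinction matters, so here is a small sketch of the difference using invented data (again my own illustration, not a description of the portal). Record-level reconciliation only confirms that a key arrived; field-level reconciliation checks every mapped value, which is far more work but catches silent corruption.

```python
# Illustrative data: two customer records migrated, one with a transposition
# error that record-level reconciliation would never see.
source = {"A1": {"name": "SMITH", "city": "LEEDS"},
          "A2": {"name": "JONES", "city": "YORK"}}
target = {"A1": {"name": "SMITH", "city": "LEEDS"},
          "A2": {"name": "JONES", "city": "YROK"}}   # slipped through the load

# Record level: did each source key arrive on the target?
record_level = sum(1 for k in source if k in target)

# Field level: does every mapped value on the target match the source?
field_level = sum(1 for k in source if k in target
                  for f in source[k] if source[k][f] == target[k].get(f))

print(f"records reconciled: {record_level}/{len(source)}")  # 2/2 - looks perfect
print(f"fields reconciled: {field_level}/4")                # 3/4 - catches the error
```

Which level a tool works at therefore makes a real difference to how much comfort a reconciliation figure actually gives you.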

The collaborative nature of the tool allows the team of business and technical specialists to see what the data will look like before it is moved, allowing a prototyping of the data design prior to the expense of actually building the solution. This is an approach not unlike that offered by the Pandora product I discussed recently, but in this case absolutely embedded in the SAP migration tool set.

My initial assessment of this portal concept is strongly favourable, with only the following caveat - how do we prevent backward-looking design? It will take strong project discipline to push for changing current practices to best take advantage of SAP’s strengths over the needless replication of current data structures.

This is of course more a training and procedural issue than a software one. One of the strengths of PDM is that it focuses on legacy decommissioning, and therefore on archive and data retention strategies, to get some realism into how much legacy data needs to be retained in the target. This also creates the discussion with data owners around issues of data lineage, windows of opportunity and so on. It will be interesting to see how far ENTOTA’s best practice recommendations replicate this.

Where then is the DMZ boundary in the use of this product in a standard migration setting? It will perform the profiling aspects of landscape analysis. It has a built-in migration model and all the features needed for gap analysis and mapping of the target - but not necessarily of the archive solution.

It handles migration design and execution in spades. It does not touch legacy decommissioning, but then again few software tools in this space do. It provides a great and natural business engagement forum that overcomes one of the main drawbacks of the two-work-streams model - the unfortunate end client getting visited twice by bunches of techie guys who seem to be asking the same questions. I can see it being ideal in an agile as much as a waterfall setting. It solves a lot of the problems around project management and configuration management in the fast-flowing environment that is a typical system implementation.

I would be interested to see how ENTOTA’s recommended best practice around the use of this tool closes off the softer issues that PDM caters for in its System Retirement Plans and especially in its Data Quality Rules process. I really think the portal concept is one that is applicable beyond the boundaries of the SAP universe.

A PDMv2 wrap around this product would of course provide the solution to any gaps - but perhaps I’ll return to that in a future blog once I have had a hands-on look at the ENTOTA Data Migration Portal and surrounding processes.

As usual, if any of the terms within this blog confound you - well you can always buy the book ‘Practical Data Migration - Second Edition’ available from this site or, for a taster, drop me a line at the email address below for a copy of the free A1 sized PDMv2 wall chart.

John Morris

About the author

John Morris has over 20 years’ experience in IT as a programmer, business analyst, project manager and data architect. He has spent the last 10 years working exclusively on data migration and system integration projects. John is the author of Practical Data Migration.