
Data migration: where to begin?

Key to the success of any data migration project is deciding which data to migrate and which to store.

Barry MacDougall, Service Delivery Director and Data Migration Expert, JMR Software.

Bloor Research puts the failure rate of data migration projects at 38%, while Gartner says more than 50% of data migration projects will exceed budget and/or result in some form of business disruption owing to flawed execution. Data migration is a minefield, but following a few simple rules can help businesses emerge triumphant on the other side.

The biggest failing, says Barry MacDougall, Service Delivery Director and Data Migration Expert at JMR Software, is that businesses want to migrate all of their data instead of only what is necessary, thereby confusing archiving with data migration.

This pitfall can be avoided if businesses adhere to the very first rule of any data migration project: decide which data should be migrated onto the new system and which data can be stored or archived. A well-executed mapping process has to be done at the very outset, explains MacDougall, if the project is to be a success.

He says there are basic criteria that dictate which data should be migrated. The first: will that data play a key role in core business processes going forward? It's important to sit down with subject matter experts and ask them what data is required to make the new system work. Another determinant: do you need to retain that data in order to comply with regulatory requirements such as POPI, King III and IV, or even GDPR?

However, he cautions that the more data you migrate, the more risk you introduce into your migration. "If you can limit the scope of data needing to be migrated, you are, by implication, reducing the risk in the migration process."

Plot and plan

When it comes to determining which data should reside where, MacDougall says the system and database architecture on the new target system will dictate where the various types of data are stored. "Typically, we'd examine the database structure on both the target and source systems, and map accordingly. Look at the business process you need to drive and do a mapping, although it's not always a one-to-one mapping. This dictates where the data gets stored."
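To illustrate what such a mapping exercise might produce, here is a minimal sketch of a source-to-target field mapping captured as reviewable data. The table, field and transformation names are hypothetical and stand in for whatever the real source and target schemas contain; they are not drawn from any specific system.

```python
# Hypothetical source-to-target field mapping, captured as data so it can be
# reviewed with subject matter experts before any migration code is written.
# All table, field and transformation names are illustrative only.

FIELD_MAPPINGS = [
    # (source table.field, target table.field, optional transformation)
    ("legacy_policy.pol_no",     "policy.policy_number",  None),
    ("legacy_policy.start_dt",   "policy.inception_date", "parse_ddmmyyyy"),
    ("legacy_client.surname",    "party.last_name",       "strip_and_titlecase"),
    # Not one-to-one: two source fields feed a single target field.
    ("legacy_client.tel_code",   "party.phone",           "combine_dial_code"),
    ("legacy_client.tel_number", "party.phone",           "combine_dial_code"),
]

def unmapped_target_fields(mappings, required_target_fields):
    """Flag target fields the business says it needs but no source field feeds."""
    mapped = {target for _, target, _ in mappings}
    return sorted(set(required_target_fields) - mapped)

if __name__ == "__main__":
    required = ["policy.policy_number", "policy.inception_date", "party.tax_reference"]
    print(unmapped_target_fields(FIELD_MAPPINGS, required))
    # -> ['party.tax_reference']  # a gap to raise with the subject matter experts
```

Keeping the mapping as plain data, rather than burying it in code, makes it something business experts can inspect and sign off before migration development begins.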

The business, as the custodian of the data, is ultimately responsible for deciding which data to migrate; IT is simply an enabler in coming up with a solution. The business must allocate experts who are authorised to make decisions on behalf of the business and IT.

MacDougall says it's important for business and IT to agree on measurable criteria for what constitutes a successful migration. However, as stated previously, the business should be wary of trying to migrate all of its data, as this will only complicate the process and increase both the duration and the cost of the project.

Asked what can go wrong with data migration, MacDougall says the temptation is too often to underestimate the complexity of the data mapping process and to leave data quality until the end of the implementation project.

He says: "The value of any application resides, to a large extent resides, within the quality of the data in it. Modelling and mapping the data can, in itself, cause issues with the data, which is why it's so important to do this upfront before migration development begins. You can introduce changes down the line, but they will need to be controlled and measurable."

He points out again that it isn't always necessary to migrate all of a business's data. "Not only does attempting to migrate all of your data increase both the risk and the cost of the migration project, it's simply not necessary. During the mapping process, business subject matter experts should critically examine the data and decide what's needed to carry business processes forward on the new system. You have to ask yourself, at a granular level, do you need that data? The business has to realise that it's not necessary to take every single iota of data into the new system."

He illustrates this point as follows: "If you consider an insurance policy, for example: there will be daily movement on the policy as rates fluctuate, and each of these unit movements is recorded as an individual transaction. On an older policy, you might only need the unit movements for the past seven years in order to comply with regulatory requirements, which means that you don't need to migrate all of the data around that policy at its lowest level of granularity.

"Businesses should also be cognisant of the fact that older data costs more to migrate, as well as increasing the risk element."

The only way to retain data that's not suitable for migration is to archive it, but MacDougall says that not all businesses have an archiving solution, which is why they try to include all of their data in a data migration project. "However, when it comes down to the key question of what data is required to drive your business processes going forward, data that ought to be archived won't qualify for migration. There's no point in cluttering the new system with data that ought to be archived. Yet data that isn't needed on the new target system can't just be disposed of, it needs to be stored somewhere. The business's archiving strategy has a clear impact on data migration, even though most businesses might not think so at the outset."

Migration essentials

One of the most important things when doing the migration, according to MacDougall, is to ensure that you trigger the business logic when storing the new data in the target system. "You can't bypass the business logic layers because the data that you store there might look fine, but as soon as you start trying to process it, applications won't work properly because the data might not comply with the necessary requirements. It's best to use an application programming interface (API) that will trigger the business logic and make sure all the data is accessible and available as and when needed. All of the data must go through a validation process via your business logic layer; you can't just copy data directly onto the database."
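A simplified illustration of the difference between loading through a business-logic layer and writing straight to the database is sketched below. The create_policy() function and its validation rules are hypothetical stand-ins for whatever API the target application actually exposes.

```python
import sqlite3

# --- Anti-pattern: copy records straight into the database, bypassing validation ---
def load_directly(conn: sqlite3.Connection, record: dict) -> None:
    conn.execute(
        "INSERT INTO policy (policy_number, inception_date) VALUES (?, ?)",
        (record["policy_number"], record["inception_date"]),
    )

# --- Preferred: push every record through the application's business-logic layer ---
class ValidationError(Exception):
    pass

def create_policy(record: dict) -> None:
    # Hypothetical business-logic API: the same rules the application applies
    # online are applied to migrated data before it is persisted.
    if not record.get("policy_number"):
        raise ValidationError("policy_number is mandatory")
    if record.get("inception_date") is None:
        raise ValidationError("inception_date is mandatory")
    # ...further business rules, then persistence via the application itself...

def load_via_api(records: list) -> list:
    """Return the records that failed validation so they can be corrected and replayed."""
    failures = []
    for record in records:
        try:
            create_policy(record)
        except ValidationError as exc:
            failures.append((record, str(exc)))
    return failures
```

The point of the second approach is that anything the application would reject online is rejected during migration too, instead of surfacing as processing failures after go-live.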

He's a firm advocate of automating as much of the migration as possible, with the least possible degree of human intervention, in order to ensure consistency and limit the potential for errors to be introduced into what is a very complex process. This applies to job submissions, sequencing and error checking alike.
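One way such automation might look in practice is a scripted run book that executes each migration step in a fixed sequence and stops on the first error. The step names and commands below are purely illustrative assumptions, not part of any specific toolset.

```python
import logging
import subprocess

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

# Hypothetical migration run book: each step is a command, executed in a fixed
# sequence with no manual intervention. Step names and scripts are illustrative.
MIGRATION_STEPS = [
    ("extract",   ["python", "extract_source.py"]),
    ("transform", ["python", "apply_mappings.py"]),
    ("load",      ["python", "load_via_api.py"]),
    ("reconcile", ["python", "reconcile_counts.py"]),
]

def run_migration(steps) -> bool:
    """Run each step in order; stop at the first failure so it can be investigated."""
    for name, command in steps:
        logging.info("starting step: %s", name)
        result = subprocess.run(command, capture_output=True, text=True)
        if result.returncode != 0:
            logging.error("step %s failed: %s", name, result.stderr.strip())
            return False
        logging.info("step %s completed", name)
    return True

if __name__ == "__main__":
    raise SystemExit(0 if run_migration(MIGRATION_STEPS) else 1)
```

Running the same scripted sequence in every rehearsal and in the live cutover is what delivers the consistency MacDougall describes.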

Finally, MacDougall emphasises the need to test at every stage of the process, as well as the importance of carrying out a full dry run, in which the full suite of batch jobs is run against the migrated data to highlight any deficiencies.
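A dry run of this kind could be wrapped in automated checks, for example a reconciliation of record counts plus a clean completion of the batch suite against the migrated data. The sketch below makes those assumptions; the helper functions are hypothetical placeholders for queries and batch triggers against the real systems.

```python
import unittest

# Hypothetical helpers: in a real dry run these would query the source and
# target systems and invoke the actual batch suite against the migrated data.
def count_source_policies() -> int:
    return 10_000

def count_migrated_policies() -> int:
    return 10_000

def run_nightly_batch() -> int:
    """Return the batch suite's exit code (0 = clean run)."""
    return 0

class DryRunChecks(unittest.TestCase):
    def test_record_counts_reconcile(self):
        # Eyeballing is not enough: counts on source and target must agree.
        self.assertEqual(count_source_policies(), count_migrated_policies())

    def test_batch_suite_runs_clean_against_migrated_data(self):
        # The full overnight batch must complete without errors on migrated data.
        self.assertEqual(run_nightly_batch(), 0)

if __name__ == "__main__":
    unittest.main()
```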

"Testing is not just about eyeballing the data and saying that it looks okay. You must be able to transact against that data both online and as a batch. You need a comprehensive dry run before you can say it's successful and go live. This may sound basic and rudimentary, and you wonder why people don't do it, but often people don't do batch testing.

"If you adhere to the basics of planning ahead, only taking what's needed, automating as much as possible and testing throughout, going live should be a mere formality," he concludes.
