
How to mitigate risks of large-scale data migration projects

Large-scale data migrations can be complex and invariably need thorough, precise planning and execution, guided by best practices, to ensure success.
By Windsor Gumede, Director, PBT Innovation at PBT Group
Johannesburg, 29 Mar 2021

A data migration is an outcome of evolution, innovation and/or renovation in an enterprise's IT landscape. Because data is one of the atomic components of systems and applications, its migration cannot be neglected when systems, applications or infrastructure are migrated.

Data migration is the process of moving data from one storage system to another. As simple as this may sound, large-scale data migrations can be very complex and need thorough, precise planning and execution to ensure success.

Therefore, before embarking on a migration journey, it is important to understand the reasons for migration from one system, application or platform to another. There are various reasons why migrations occur within an organisation. These can include:

  • Optimisation of platforms/systems for operational efficiency and the reduction of operational costs.
  • Digital transformation of an enterprise to become more competitive, which warrants the introduction of new platforms with cutting-edge functionality as part of the enterprise architecture.
  • A move to new infrastructure where the old infrastructure is no longer scalable.

No matter the reason for the data migration, it must always be geared towards providing significant business value and must align with business objectives in order to be supported by stakeholders outside of the IT department.

In my experience, business buy-in is critical to the initiation and execution of a data migration project − irrespective of whether the project is at the level of data, application, or infrastructure.

IT systems are meant to serve the business. Implementing a fundamental change in the organisation without considering business buy-in could be detrimental to the success of the project. And one of the key success factors of a data migration project is ensuring minimal to no interruptions to business operations.

Data migrations in the context of data and analytics platforms can be challenging and risky. The phrase “the only constant in life is change” comes to mind and aptly also applies to the IT world and particularly in this regard.

Over and above stakeholder buy-in, understanding the associated risks and determining the best practices for data migration become critical to getting this process right.


We know that data is a delicate asset that should be handled with care. Data migrations can be risky as they can result in the loss of data. If the scope is to migrate all data, incomplete data in the target system means the migration was not done correctly.

Data can also be corrupted or rendered unusable if migrated incorrectly. Corrupt data can create system or application irregularities and will certainly impact business operations, which means the affected data assets need to be repopulated in the correct way. The target system could also become unstable and cause unwanted downtime to business operations.

Data migration best practices

There are, however, a number of best practice guidelines that can be followed to help reduce the risk of failure when delivering a data migration project. These include:

Create a migration plan that breaks down the migration steps in a logical way. Decide on a logical grouping of data assets and plan to move them in a systematic order. One can also orchestrate the steps in order of importance while observing complexity. The migration plan should be clear and detailed enough for implementers to understand, and it must identify the resources responsible for the different components of the data migration. It is critical to ensure there is adequate skill in the dedicated team to perform the data migration.
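
As a rough illustration of sequencing data assets systematically, the sketch below uses Python's standard library to derive a migration order from asset dependencies. The asset names, dependencies and owners are hypothetical placeholders, not a prescribed plan.

    # A minimal sketch of sequencing data assets for a migration plan, assuming
    # each asset's dependencies are known up front; names are illustrative only.
    from graphlib import TopologicalSorter

    # Map each data asset to the assets it depends on (hypothetical groupings).
    dependencies = {
        "customer": set(),
        "account": {"customer"},
        "transaction": {"account"},
        "reporting_marts": {"transaction", "account"},
    }

    # Owners responsible for each migration component (illustrative placeholders).
    owners = {
        "customer": "data engineering",
        "account": "data engineering",
        "transaction": "platform team",
        "reporting_marts": "BI team",
    }

    # static_order() yields assets so that dependencies are always migrated first.
    plan = list(TopologicalSorter(dependencies).static_order())
    for step, asset in enumerate(plan, start=1):
        print(f"Step {step}: migrate '{asset}' (owner: {owners[asset]})")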

Understand the data. For any journey taken in life, one must first understand where they are before they can plan how to get to the desired destination. This applies to a data migration journey as well. Understanding the data, where it is located (schemas, tables, data files, etc) and what those data assets contain, is critical to successfully migrating them from the legacy system to the new system.
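
One way to build that understanding is to profile the source before migrating. The sketch below assumes an illustrative SQLite source database named legacy_system.db; the same inventory approach applies to any platform with a suitable driver.

    # A minimal sketch of profiling a source database ahead of a migration,
    # assuming an illustrative SQLite file; names are placeholders.
    import sqlite3

    conn = sqlite3.connect("legacy_system.db")  # hypothetical source database
    cur = conn.cursor()

    # List the tables in the source and record basic volumetrics per table.
    cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    tables = [row[0] for row in cur.fetchall()]

    inventory = {}
    for table in tables:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        row_count = cur.fetchone()[0]
        cur.execute(f"PRAGMA table_info({table})")  # column names and types
        columns = [(col[1], col[2]) for col in cur.fetchall()]
        inventory[table] = {"rows": row_count, "columns": columns}

    for table, profile in inventory.items():
        print(table, profile["rows"], "rows,", len(profile["columns"]), "columns")

    conn.close()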

Before taking further steps to perform the actual migration, take a backup of all the data in the source system. A system restore can then be performed if something goes wrong during the migration.
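
For platforms that expose a native backup facility, this step can be scripted. The sketch below assumes an illustrative SQLite source; for other databases the vendor's own dump or backup utility would be used instead.

    # A minimal sketch of taking a full backup of an illustrative SQLite source
    # before the migration starts, so the source can be restored if needed.
    import sqlite3

    source = sqlite3.connect("legacy_system.db")  # hypothetical source database
    backup = sqlite3.connect("legacy_backup.db")  # backup file to restore from

    with backup:
        source.backup(backup)  # copies every page of the source database

    source.close()
    backup.close()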

Create a prototype, with supporting test cases, that serves as a proof of concept showing data can be transferred to the new system without compromising data quality and integrity. This will ensure all connections from the legacy system to the target system are in working order. It will also allow one to measure connection speeds, which can be used to estimate the total migration time.
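
A prototype can be as small as copying one sample table and checking the result. The sketch below assumes illustrative SQLite source and target databases and a hypothetical sample table; it times the transfer so throughput can be extrapolated to the full migration.

    # A minimal sketch of a proof-of-concept run: copy a single sample table,
    # verify row counts match and time the transfer. Names are illustrative.
    import sqlite3
    import time

    source = sqlite3.connect("legacy_system.db")  # hypothetical source
    target = sqlite3.connect("new_system.db")     # hypothetical target
    sample_table = "customer"                     # illustrative sample asset

    # Recreate the sample table's structure in the target from the source DDL.
    ddl = source.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = ?",
        (sample_table,),
    ).fetchone()[0]
    target.execute(ddl)

    rows = source.execute(f"SELECT * FROM {sample_table}").fetchall()

    start = time.monotonic()
    if rows:
        placeholders = ", ".join("?" for _ in rows[0])
        with target:
            target.executemany(
                f"INSERT INTO {sample_table} VALUES ({placeholders})", rows
            )
    elapsed = time.monotonic() - start

    # Basic test case: row counts in source and target must match.
    source_count = source.execute(f"SELECT COUNT(*) FROM {sample_table}").fetchone()[0]
    target_count = target.execute(f"SELECT COUNT(*) FROM {sample_table}").fetchone()[0]
    assert source_count == target_count, "row counts differ after transfer"

    # Throughput from the prototype can be extrapolated to the full migration.
    print(f"{target_count} rows in {elapsed:.2f}s "
          f"({target_count / max(elapsed, 1e-9):.0f} rows/sec)")

    source.close()
    target.close()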

Use the right tools for the job. An automated migration is more efficient than a manual data migration. It is important, however, to select the correct tools so that the automated data migration alleviates the risk of data corruption and of compromising the integrity or quality of the data, as mentioned earlier.

Create test plans for the three phases of the data migration project. It is important to understand the state of the data prior to the migration, so testing before the start of the migration is necessary; this is called pre-migration testing. As data moves from the legacy system to the target system, it also needs to be tested. This testing ensures the migration is going as planned, and the team can decide to pause the migration if any inconsistencies or issues arise. This gives the migration team a chance to review the status of the migration and determine whether the issue is a showstopper or whether it can be fixed so the migration can proceed. The last phase is post-migration testing; this testing should occur once all the data has been migrated to the target system.
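
In practice, the same reconciliation checks can underpin all three test phases. The sketch below assumes illustrative SQLite source and target databases and a hypothetical list of tables; the fingerprint used is a simple in-process comparison aid, not a cryptographic checksum.

    # A minimal sketch of reconciliation checks for pre-, mid- and post-migration
    # testing: compare row counts and a simple content fingerprint per table.
    import sqlite3

    def table_fingerprint(conn, table):
        """Return (row_count, fingerprint) for a table; the fingerprint is an
        illustrative in-process comparison aid, not a cryptographic checksum."""
        row_count = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
        return row_count, hash(tuple(rows))

    def reconcile(source, target, tables):
        """Compare fingerprints table by table and report any mismatches."""
        return [
            table for table in tables
            if table_fingerprint(source, table) != table_fingerprint(target, table)
        ]

    source = sqlite3.connect("legacy_system.db")  # hypothetical source
    target = sqlite3.connect("new_system.db")     # hypothetical target

    mismatches = reconcile(source, target, ["customer", "account"])  # illustrative
    if mismatches:
        print("Pause and investigate before proceeding:", mismatches)
    else:
        print("All checked tables reconcile; the migration can continue.")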

If running both the legacy system and the new system is financially viable, it is recommended that a gradual transition takes place by running both systems in parallel for a short period so that operational issues can be identified prior to the switchover. The inclusion of business is key to this phase, as users would be able to report issues when trying to use the new system. If this approach does not suit the business's data migration strategy, consider moving certain modules into the new system and switching off only those migrated modules in the legacy system until all functionality has been migrated and tested adequately on the new system.

Lastly, an agile mindset can also assist in mitigating the risk of large-scale data migrations, as the team can compartmentalise the data assets to migrate based on business value and include business every step of the way. This in turn creates trust and buy-in from a business point of view.

With the right planning and business support, the risks associated with large-scale data migration can be effectively managed to ensure a streamlined data migration process. The above guidelines can be used to support this process and achieve data migration success.
