According to Gartner, 80% of data migration projects fail to meet expectations, running over time and budget.
Why do so many data migration projects end in failure?
Financial institutions, including banks, face rising consumer expectations, mergers and acquisitions, continued digital disruption, and intense competition from fintechs like never before. With the renewed focus on implementing newer systems and processes, banks are exposed to new data sources and diverse formats. Data migration in financial institutions most commonly involves transferring data from legacy systems to newer ones, which is often easier said than done.
It is one of the key activities for any system implementation or upgrade that entails significant amounts of risk. Migration is almost never straightforward, and there are very few situations in which you can really validate everything. Validation of migrated data is always an assessment and a balancing act between enterprise risks versus business priorities.
Here are some of the most common data migration scenarios:
- Transferring data from one system to another while changing the storage or database, or upgrading to a higher version of the application.
- Data Migration after a merger or acquisition (M&A)
To help you get started with your data migration validation we have put together some tips and actionable insights.
7 Keys to Get Data Migration Testing Right
1. Data Quality at Source: To optimize risk efforts banks must take charge of data quality at source ensuring its accuracy, completeness and relevance. Data inaccuracies that go unattended can not only cost banks irreversible loss of reputation but also put them at a regulatory risk. This process begins with defining and establishing appropriate data standards and processes for measuring, monitoring and reporting conformance to acceptable levels of data quality.
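As a minimal sketch of what source-side quality checks might look like, completeness, accuracy and validity rules can be expressed as simple predicates and reported before migration begins. The field names, formats and rules below are illustrative assumptions, not a prescribed standard:

```python
import re

# Illustrative quality rules; real rules come from the bank's data standards.
RULES = {
    "account_id": lambda v: bool(re.fullmatch(r"\d{10}", str(v))),   # validity
    "customer_name": lambda v: bool(str(v).strip()),                 # completeness
    "balance": lambda v: isinstance(v, (int, float)),                # type accuracy
}

def profile_quality(records):
    """Return the count of rule failures per field across all source records."""
    failures = {field: 0 for field in RULES}
    for rec in records:
        for field, rule in RULES.items():
            if field not in rec or not rule(rec[field]):
                failures[field] += 1
    return failures

source = [
    {"account_id": "1234567890", "customer_name": "A. Smith", "balance": 120.5},
    {"account_id": "12345", "customer_name": "", "balance": "n/a"},
]
print(profile_quality(source))  # {'account_id': 1, 'customer_name': 1, 'balance': 1}
```

A report like this, tracked over time, is one way to measure and monitor conformance to the agreed data quality levels.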
2. Functional Testing with Migrated Data (for a system upgrade): By a core principle of testing, an application must be tested based on how end users will consume it, not the way it was built. While functional testing helps validate that the new system meets business objectives and needs, it is conducted on simulated data in most cases. It's entirely possible to pass all the tests only to find that some key functionality is lost with migrated data, or doesn't work 'well enough'. So it's imperative to perform functional testing on migrated data at some level.
3. Automated Validation of Data on UI: It's hard to identify the data combinations that could distort the UI, because manually validating a sizable number of records is not feasible. This is where automated validation comes into play. Functional automation tools like UFT, Selenium and others can help you save time and improve coverage of verified data. Moreover, automating data migration validation yields a high ROI on each script, since it is executed multiple times. Implementing automated validation is a no-brainer here.
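One way to keep automated UI validation tractable is to pre-select the migrated records most likely to distort the UI, so a tool like Selenium only has to verify a focused sample. This sketch flags records whose field lengths exceed assumed display limits; the field names and limits are hypothetical:

```python
# Sketch: select migrated records most likely to distort the UI, so the
# automation tool only verifies a focused, high-risk sample.

def risky_records(records, limits):
    """Flag records whose field lengths exceed the UI's designed limits."""
    flagged = []
    for rec in records:
        reasons = [f for f, limit in limits.items()
                   if len(str(rec.get(f, ""))) > limit]
        if reasons:
            flagged.append((rec, reasons))
    return flagged

ui_limits = {"customer_name": 20, "address": 40}   # assumed display widths
migrated = [
    {"customer_name": "Jo", "address": "1 Main St"},
    {"customer_name": "A very long corporate customer name Ltd",
     "address": "2 Side St"},
]
for rec, reasons in risky_records(migrated, ui_limits):
    print(rec["customer_name"], "->", reasons)
```

The flagged records then become the input data set for the automated UI scripts, which is where the ROI of reusable scripts comes from.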
4. Exhaustive Data Profiling Right from the Start
- Identify the classes of data in scope
- Apply test design techniques
- Identify the combinations that cause problems
- Cover all record-set classes
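The profiling steps above can be sketched as follows: classify each record into equivalence classes, then check which class combinations the data set fails to cover. The classification rules (balance bands and account types) are illustrative assumptions:

```python
from itertools import product

def classify(rec):
    """Assign a record to an equivalence-class combination."""
    balance_class = "high" if rec["balance"] >= 10000 else "low"
    return (balance_class, rec["account_type"])

def coverage_gaps(records):
    """Return the class combinations that no record in the set covers."""
    all_classes = set(product(["high", "low"], ["savings", "checking"]))
    covered = {classify(r) for r in records}
    return all_classes - covered

sample = [
    {"balance": 15000, "account_type": "savings"},
    {"balance": 200, "account_type": "checking"},
]
print(sorted(coverage_gaps(sample)))
# [('high', 'checking'), ('low', 'savings')]
```

Gaps like these point at exactly the record-set classes and combinations that still need test data before migration.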
5. Balancing Enterprise Risk versus Business Priorities: From a business perspective, testing is all about achieving operational excellence, debunking the common misconception that all defects need to be fixed. Some may have workarounds or can be handled via new transformation scripts, while others can be ignored. When deciding what to fix, never lose sight of the impact on the business.
6. Migration Waves in Rapid Succession: Balancing the data volume in a wave, not just the number of customers, helps minimize risks and mitigate issues that may lead to customer dissatisfaction. A wave list may typically go through 4 to 5 iterations depending on the complexity of the parameters. Here are some of the criteria that help establish migration waves:
- Migration requirements / requests of integrating application partners like PeP+, Payplus/ Wires gateway, and ARP SMS.
- Timelines of other projects and their migration testing plans.
- Migration mechanism of interfacing applications
- Balancing customers of various groups like government, VIP customers, business customers etc.
- Location of customers to evenly distribute across geo-territories. In some cases global customers may have to be grouped together for language support.
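Balancing a wave by data volume rather than customer count, as described above, can be sketched with a simple greedy pass that assigns each customer to the currently lightest wave. The customer names and volumes are illustrative:

```python
def build_waves(customers, n_waves):
    """Greedily assign (name, volume) pairs so wave volumes stay balanced."""
    waves = [{"members": [], "volume": 0} for _ in range(n_waves)]
    # Largest customers first, so later small ones can even out the waves.
    for name, volume in sorted(customers, key=lambda c: -c[1]):
        lightest = min(waves, key=lambda w: w["volume"])
        lightest["members"].append(name)
        lightest["volume"] += volume
    return waves

customers = [("gov-1", 900), ("vip-1", 500), ("biz-1", 400), ("biz-2", 300)]
for i, w in enumerate(build_waves(customers, 2), 1):
    print(f"Wave {i}: {w['members']} volume={w['volume']}")
```

In practice the other criteria (customer group, geography, interfacing applications) become additional constraints on this assignment, which is why a wave list goes through several iterations.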
7. Verifying the Data Migration Mapping Document: Before you get started, make sure the data migration mapping document has been verified for accuracy and complete coverage. It's an integral piece of your migration process and helps you mitigate risk. Take a look at our checklist to make sure nothing derails your data migration plan.
Here are some more tips to help you validate your data migration program.
- Building Validations around Data Transformations: Before you begin to think of data migration you need to start identifying the type of transformations required and build test cases for each transformation. Quick Tip: It’s a good practice to automate test cases executed with different sets of data.
- Proof of Concepts (PoCs): The best way to get your migration up to speed is to perform PoC testing covering all entities and the different classes of transformation, rather than one entity covering all validations.
- Periodic Assessments: A key element of the data migration journey – from business case to actual implementation is risk assessment and mitigation. As a product goes through migration waves, risk assessments and reviews will have to be regular and current.
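The first tip above, building validations around each transformation, can be sketched as one small transformation function exercised against multiple data sets. The mapping of legacy status codes to target values is an assumed example, not an actual banking standard:

```python
# Assumed transformation rule: legacy single-letter statuses map to
# the target system's enumerated values.
STATUS_MAP = {"A": "ACTIVE", "D": "DORMANT", "C": "CLOSED"}

def transform_status(legacy_code):
    """Map a legacy status code to the target system's value, loudly failing on gaps."""
    try:
        return STATUS_MAP[legacy_code]
    except KeyError:
        raise ValueError(f"unmapped legacy status: {legacy_code!r}")

# The same test logic re-runs against many data sets, automation-style.
cases = [("A", "ACTIVE"), ("D", "DORMANT"), ("C", "CLOSED")]
for legacy, expected in cases:
    assert transform_status(legacy) == expected
print("all transformation cases passed")
```

Each transformation in the mapping document would get its own function and case table like this, so new data sets can be swapped in without rewriting the test.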
The Four Pronged Approach
Because data and user migration are far more complex than functionality or usability testing, they call for a four-pronged approach that lets you identify customer conversion risks up front in any situation where data is being moved from one or multiple products to another, or from one database to another. The Four-Pronged Approach helps strengthen the planning, analysis and implementation of data migration programs, leading to better risk reduction, data accuracy and preparedness. Here's a quick overview of each stage:
Stage 1 - Data Intelligence
- Product based identification of entities and data-sets
- Profiling by grouping, sub-grouping and clustering
- Checksum based reconciliation
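Checksum-based reconciliation can be sketched as hashing each normalized record on both sides and comparing only the digests, so full field-by-field comparison is reserved for the mismatches. The key and field names below are illustrative assumptions:

```python
import hashlib

def checksum(rec, fields):
    """Hash a record's selected fields into a fixed-size digest."""
    payload = "|".join(str(rec[f]) for f in fields)
    return hashlib.sha256(payload.encode()).hexdigest()

def reconcile(source, target, key, fields):
    """Return keys whose checksums differ (or are missing) between systems."""
    src = {r[key]: checksum(r, fields) for r in source}
    tgt = {r[key]: checksum(r, fields) for r in target}
    return sorted(k for k in src.keys() | tgt.keys() if src.get(k) != tgt.get(k))

source = [{"id": 1, "balance": 100}, {"id": 2, "balance": 250}]
target = [{"id": 1, "balance": 100}, {"id": 2, "balance": 205}]
print(reconcile(source, target, "id", ["balance"]))  # [2]
```

Because only digests are compared, the same approach scales to millions of records before the later stages drill into the flagged keys.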
Stage 2 - Machine Recon
- Automated data-loading and shifting
- Complex condition induced code for efficient comparison
- Recursive learning scripts for increasing result efficiency
Stage 3 - Supervised Feedback
- Supervised review of results to remove false-negatives
- Knowledge-based data sampling to identify anomalies
- Customer impact enabled defect analysis
Stage 4 - Experience based Testing
- Rich experience-based tests to calibrate data usage
- Simulating live customer interaction on migrated data
- Infusing confidence to migrate in the live environment
Go-Live Faster’s Data Readiness platform is paving the way to making financial institutions Data Ready. If you’d like to discuss further and learn how Go-Live Faster can help you get “Data Ready” and alleviate migration risks in any situation, please drop us an email at email@example.com. Take a look at our success stories to find out more about how we help our clients accelerate their online banking implementations.