Whether it’s a merger, acquisition, or new software platform, the project work stream that poses the greatest risk to a successful implementation is data conversion. Conversions from legacy systems to modern database-driven platforms fail at high rates for a variety of reasons, but mainly because source system data is rarely well documented, source data quality is aged and unreliable, and the complexity of the effort is underappreciated, all of which leads to unreasonable delivery timelines and expectations.
Adding to the frustration is the fact that almost every other work stream – interfaces, workflows, queues, functional configuration, reporting – requires a sufficient level of quality data not only to begin unit and user acceptance testing, but also to complete full integration and end-to-end testing. Project coordination and delivery become difficult when you have no confidence in when your data conversion will be ready. Without that confidence, your project is a tumbleweed blown back and forth by the wind.
Here are four easy-to-manage steps that improve the likelihood of data conversion success and minimize negative downstream impacts:
- Assign SMEs who understand your source data fields and can accurately document the following:
  - Data and fields needed to meet functional requirements.
  - Data and fields needed for operational and management reporting.
  - Data and fields needed for interfaces and integration points.
  - Data and fields needed for workflows and queuing.
  - Data and fields that do not need to be migrated to the new platform.
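The SME documentation above can be captured in a lightweight, machine-checkable form. Here is a minimal sketch in Python; the field names, purpose tags, and record shape are illustrative assumptions, not a prescribed standard:

```python
from dataclasses import dataclass

# Hypothetical field-inventory entry; names are illustrative, not from any real system.
@dataclass
class SourceField:
    name: str
    description: str
    purposes: set   # e.g. {"functional", "reporting", "interface", "workflow"}
    migrate: bool   # False for fields that will not move to the new platform

inventory = [
    SourceField("ACCT_STATUS", "Account status code", {"functional", "workflow"}, True),
    SourceField("LEGACY_BATCH_ID", "Nightly batch identifier", set(), False),
]

# Every field flagged for migration should have at least one documented purpose.
undocumented = [f.name for f in inventory if f.migrate and not f.purposes]
assert not undocumented, f"Migrating fields with no documented purpose: {undocumented}"
```

A check like this gives the SMEs a concrete deliverable and catches fields that were marked for migration without anyone documenting why.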
- Create a data mapping bible. This document should include the following:
  - Every source system data field to be converted to the new platform.
  - The configuration element, interface, report, or workflow associated with each data field.
  - The new system field mapping.
  - Once your mapping bible is ready, schedule a review with your line of business to confirm that the mapping is as expected and nothing of value has been omitted.
  - You are now ready to begin data conversion coding.
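As a sketch of what a mapping bible row might hold, here is a minimal Python layout. The column names and the two sample rows are illustrative assumptions, not a prescribed schema:

```python
# Hypothetical "mapping bible" rows; field names and notes are illustrative only.
mapping_bible = [
    {"source_field": "CUST_NM", "target_field": "customer_name",
     "used_by": ["reporting", "interfaces"], "notes": ""},
    {"source_field": "ACCT_OPEN_DT", "target_field": "account_opened_on",
     "used_by": ["functional config", "workflows"], "notes": "convert YYYYMMDD to ISO date"},
]

def unmapped_rows(rows):
    # Flag source fields that lack a target mapping so the
    # line-of-business review catches omissions early.
    return [r["source_field"] for r in rows if not r["target_field"]]
```

Running a simple completeness check like `unmapped_rows` before the line-of-business review turns the review into a confirmation exercise rather than a discovery exercise.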
- Do not migrate data you don’t need. If an account doesn’t have to be boarded onto the new system, leave it behind. Old data and closed accounts can be difficult and costly to board correctly because they almost always involve too many edge cases; typically, seven years of history is sufficient. Seek other alternatives for maintaining aged and closed accounts, or migrate those accounts post-production as a Phase 2 effort.
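The scoping rule above can be expressed as a small filter. This is a sketch under assumptions: a hypothetical account record with `status` and `last_activity` fields, and a retention window of roughly seven years:

```python
from datetime import date, timedelta

# Roughly seven years of history; the exact window is a business decision.
RETENTION = timedelta(days=7 * 365)

def in_migration_scope(account, as_of=None):
    # `account` is a hypothetical dict with "status" and "last_activity" keys.
    as_of = as_of or date.today()
    if account["status"] == "closed" and as_of - account["last_activity"] > RETENTION:
        return False  # archive instead, or board in a post-production Phase 2
    return True
```

Applying a filter like this up front shrinks the edge-case surface of the conversion before any code is written against the old accounts.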
- Implement a phased data conversion strategy. Start with a small subset of active accounts, approximately 20% of the total, that covers all aspects of your portfolio. The goal is to perfect the conversion code as quickly as possible: a smaller subset lets you identify and remediate defects faster through an increased number of conversion and testing cycles, and the resulting code and mapping updates will resolve issues across the entire conversion portfolio. When data quality reaches an acceptable level, continue the process by adding additional subsets of active accounts. Only once active accounts achieve pre-determined, acceptable data quality levels should you begin boarding closed accounts.
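The subset selection above can be sketched as stratified sampling: group accounts by portfolio segment and draw roughly 20% from each group, so every product type appears in the first conversion cycle. The `segment` key and account shape are illustrative assumptions:

```python
import random
from collections import defaultdict

def pilot_subset(accounts, fraction=0.2, seed=42):
    # Group accounts by a hypothetical "segment" key (product type, portfolio,
    # etc.), then sample ~fraction from each group so the pilot covers
    # every part of the portfolio. Seeded for a repeatable pilot population.
    rng = random.Random(seed)
    by_segment = defaultdict(list)
    for acct in accounts:
        by_segment[acct["segment"]].append(acct)
    subset = []
    for members in by_segment.values():
        k = max(1, round(len(members) * fraction))  # at least one per segment
        subset.extend(rng.sample(members, k))
    return subset
```

The `max(1, …)` guard matters: a rare product type with only a handful of accounts still gets at least one representative, which is exactly where conversion defects tend to hide.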
In most instances, testing for the other work streams can begin once data quality standards are achieved with the initial subset. As more accurate data becomes available, provide it to your test and development teams to enhance their testing scenarios.
In summary, detailed source system documentation and a phased data migration approach will improve both your data conversion outcome and your overall project experience.