Legacy modernization has been constantly evolving over the past three decades, and its drivers have changed with it. In the 1990s and early 2000s, one of the key drivers was cost, followed by the availability of skilled manpower. Mainframe technology was significantly more expensive than UNIX or Windows-based systems for small and medium workloads. The mainframe was somewhat like a bus that can carry a large number of people over a long distance, but it lacked the agility of a car that carries only a few people.
After 2010, the cost factor diminished to some extent as the costs of mainframe MIPS and storage came down significantly. However, by that time, many large financial institutions had become a spider's web of applications and technologies. During the 1970s and 1980s, mainframe environments accumulated a large number of CASE/reporting tools such as DeltaCobol, MARK4, Telon, Focus, Nomad, and many others, along with network (e.g., IDMS) and hierarchical (e.g., IMS) databases. As COBOL with DB2 (a relational database) became the dominant mainframe combination, many large organizations started modernization projects to rationalize and decommission this application landscape. Smaller mainframe shops continued their efforts to get off the mainframe through reengineering or migration.
It is important to distinguish between reengineering and migration. Migration aims to deliver a functional clone of the legacy application using modern technologies. A number of vendors have tools for translating legacy code to modern languages such as Java. In many cases, the architecture and structure of the generated code resembles that of the legacy application, and manual tuning is sometimes required for performance. However, the low costs and risks associated with migration tools make them an attractive option for non-strategic applications. Another migration approach is re-hosting, in which the mainframe workload is migrated to Linux/Intel-based mainframe emulation environments that offer platform and license savings. COBOL-based mainframe workloads have multiple options for re-hosting.
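To make this concrete, here is a purely illustrative sketch of what translator output can look like; the COBOL fragment (shown in comments) and the generated Java are hypothetical, not the output of any specific vendor tool. Note how the generated class preserves the working-storage fields and paragraph structure of the original, which is why such code often reads like "COBOL in Java" and may need manual restructuring and tuning:

```java
// Hypothetical output of a COBOL-to-Java translation tool.
// The original COBOL is preserved almost 1:1 in structure:
//
//   WORKING-STORAGE SECTION.
//   01 WS-BASE-PREMIUM   PIC 9(7)V99.
//   01 WS-DISCOUNT-RATE  PIC V99 VALUE .05.
//   01 WS-PREMIUM        PIC 9(7)V99.
//   ...
//   COMPUTE-PREMIUM.
//       COMPUTE WS-PREMIUM = WS-BASE-PREMIUM * (1 - WS-DISCOUNT-RATE).

import java.math.BigDecimal;

public class PolicyCalc {

    // Working-storage fields become class-level fields, keeping COBOL names.
    private BigDecimal wsBasePremium = BigDecimal.ZERO;
    private BigDecimal wsDiscountRate = new BigDecimal("0.05");
    private BigDecimal wsPremium = BigDecimal.ZERO;

    // Each COBOL paragraph becomes a method; control flow mirrors the
    // original PERFORM chains rather than an object-oriented design.
    public void computePremium() {
        wsPremium = wsBasePremium.multiply(BigDecimal.ONE.subtract(wsDiscountRate));
    }

    public void setWsBasePremium(BigDecimal v) { wsBasePremium = v; }
    public BigDecimal getWsPremium() { return wsPremium; }
}
```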
In reengineering, also called redevelopment, the legacy application is redeveloped using current technologies and architecture, with the legacy application serving as the functional baseline for application and data migration. This is normally the most expensive option and is recommended for strategic applications that are part of the organization's long-term future. Unlike the migration approach, major strategic functional enhancements can also be incorporated. Business requirements are derived through a combination of business user meetings and selective code mining for specific details and calculations (e.g., underwriting and tax algorithms).
Another approach that is common nowadays is to move from customized applications to a package. This is the prevailing trend in domains such as ERP, accounting, and insurance policy administration. Package vendors normally provide annual updates for compliance and statutory reporting. However, customization remains a major activity: modern packages require a high degree of configuration, and the equivalent parameters in the legacy applications are in many cases buried in the code and have to be extracted. Code mining tools can help in this regard.
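As a minimal sketch of the kind of extraction involved, the following Java scan surfaces hardcoded VALUE clauses in COBOL working storage, which often correspond to the configuration parameters a package expects. The directory layout, file extension, and regular expression are illustrative assumptions; real code mining tools are far more sophisticated:

```java
// Minimal sketch of parameter mining: scan COBOL sources for hardcoded
// VALUE clauses in WORKING-STORAGE, which are candidate configuration
// parameters for the target package. Paths and patterns are assumptions.

import java.io.IOException;
import java.nio.file.*;
import java.util.regex.*;
import java.util.stream.Stream;

public class ParameterMiner {

    // Matches e.g.:  01 WS-MAX-RISK-AMT  PIC 9(7)V99 VALUE 50000.00.
    private static final Pattern VALUE_CLAUSE = Pattern.compile(
        "^\\s*\\d+\\s+([A-Z0-9-]+)\\s+PIC\\s+\\S+\\s+VALUE\\s+(.+?)\\s*\\.\\s*$",
        Pattern.CASE_INSENSITIVE);

    public static void main(String[] args) throws IOException {
        Path sourceDir = Paths.get(args.length > 0 ? args[0] : "cobol-src");
        try (Stream<Path> files = Files.walk(sourceDir)) {
            files.filter(p -> p.toString().toLowerCase().endsWith(".cbl"))
                 .forEach(ParameterMiner::scan);
        }
    }

    private static void scan(Path file) {
        try {
            for (String line : Files.readAllLines(file)) {
                Matcher m = VALUE_CLAUSE.matcher(line);
                if (m.find()) {
                    // Candidate configuration parameter: name and literal value.
                    System.out.printf("%s: %s = %s%n",
                        file.getFileName(), m.group(1), m.group(2).trim());
                }
            }
        } catch (IOException e) {
            System.err.println("Could not read " + file + ": " + e.getMessage());
        }
    }
}
```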
Another approach is a surround strategy, which is comparatively low in risk and cost. Here, a wrapper is developed around the legacy applications using ESB or connector technologies. With today's connector technologies, it is possible to have Java-based front ends invoke COBOL business logic programs on the mainframe. Moving from green screens to a digital UI across multiple channels yields major business and user-training benefits.
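As a minimal sketch of this call path, the following Java client invokes a COBOL/CICS program through a REST facade of the kind exposed by z/OS Connect or an ESB. The host name, endpoint path, and JSON payload are illustrative assumptions, not a real API:

```java
// Minimal sketch of a surround-strategy call: a Java front end invokes
// unchanged COBOL business logic on the mainframe through a REST gateway.
// Host, path, and payload below are placeholders.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class QuoteClient {

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // JSON body that the gateway maps onto the COBOL program's
        // COMMAREA/copybook fields.
        String body = "{\"policyNumber\":\"POL123456\",\"effectiveDate\":\"2024-01-01\"}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://mainframe-gw.example.com/insurance/quote"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // The COBOL program runs unchanged on the mainframe; only the
        // presentation layer (green screen -> digital UI) is replaced.
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```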
The ten-thousand-pound gorilla present in all the above options (except the surround strategy) is data migration and testing. Historically, more than 60% of legacy modernization projects have failed or suffered significant overruns because of data migration. For this reason, we propose developing a data migration suite at the start of the project. The suite generates, from the legacy database, the development database used by the new target development team, so any data migration issues are identified and resolved right at the beginning rather than towards the end. Testing strategies and environments based on the principles of regression testing should also be defined at the start, together with the data migration strategy.
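As a minimal sketch of the kind of early reconciliation check such a suite might include, the following compares row counts per table between the legacy database and the generated development database. The JDBC URLs, credentials, and table names are placeholders, and a real suite would also reconcile checksums and field-level content:

```java
// Minimal sketch of an early data-migration reconciliation check:
// compare row counts per table between the legacy source and the
// generated development database. Connection details are placeholders.

import java.sql.*;
import java.util.List;

public class MigrationReconciler {

    private static final List<String> TABLES =
            List.of("POLICY", "CLAIM", "CUSTOMER");

    public static void main(String[] args) throws SQLException {
        try (Connection legacy = DriverManager.getConnection(
                     "jdbc:db2://legacy-host:50000/LEGDB", "user", "pass");
             Connection target = DriverManager.getConnection(
                     "jdbc:postgresql://dev-host:5432/devdb", "user", "pass")) {

            for (String table : TABLES) {
                long legacyCount = countRows(legacy, table);
                long targetCount = countRows(target, table);
                // Any mismatch is a migration defect to fix now, not at go-live.
                System.out.printf("%-10s legacy=%d target=%d %s%n",
                        table, legacyCount, targetCount,
                        legacyCount == targetCount ? "OK" : "MISMATCH");
            }
        }
    }

    private static long countRows(Connection conn, String table) throws SQLException {
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }
}
```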
The implementation strategy is also a major driver of project success. A phased implementation is lower risk; however, it requires extensive workshops to work through the impact on development, data migration, throwaway bridges and interfaces, the decommissioning calendar, and so on.
Given the multitude of options, an enterprise-level modernization assessment is absolutely vital before starting a modernization program. In a large enterprise-level modernization assessment we conducted for an insurance client, the potential outcome for each application was derived along with a high-level solution approach, implementation strategy, costs, timelines, risks, dependencies, and a business case.
Depending on the client's overall cloud strategy, cloud compatibility is also an important aspect of the assessment. Most of the solutions discussed above are cloud compatible; the only possible exception is migration to vendor-driven COTS (commercial off-the-shelf) packages, where cloud compatibility should be verified with the vendor.
Conclusion
Legacy modernization is a critical process that can help organizations overcome the limitations of outdated technology systems and better position themselves for success in today's rapidly evolving business landscape. There are multiple approaches, depending on business need, criticality, and budget. By following a structured approach that includes assessment, planning, design, development, testing, deployment, and maintenance, organizations can successfully modernize their legacy systems and achieve their business, agility, and strategic objectives.
Coforge has a dedicated Centre of Excellence staffed with experienced practitioners of legacy modernization, with offerings spanning the modernization approaches discussed above.
Coforge would welcome the opportunity to help you rediscover and transform your legacy. Please fill in the form at https://www.coforge.com/quick-connect so that the Coforge office for your region can contact you.
Pulok is part of the Enterprise Architecture group at Coforge, where he looks after the development of legacy modernization offerings, strategies, and tools for Coforge clients across the organization. He has over 30 years of experience working on legacy and newer platforms. Besides legacy platforms, Pulok also has experience working with Linux, C, and Java technologies. On the legacy side, he has worked on a wide range of languages, databases, and technologies on IBM z/OS, IBM TPF, iSeries, Tandem, and UNISYS platforms. Pulok is also responsible for evaluating and developing relationships with modernization tool vendors for global clients. He has worked in the insurance, banking, and travel domains.