Legacy application modernization may mean different things to different people. But whether that means updating practices surrounding mainframes, adopting Agile and DevOps practices, or updating to modern databases, legacy app modernization is necessary to keep pace with modern industry demands.

“Legacy modernization at large means that you take advantage of enhanced operational agility and accelerated development,” said Arnal Dayaratna, research director for software development at IDC.

In addition to its many different definitions, there are many different types of modernization efforts organizations go through. While some choose to rip and replace legacy systems and build new ones from the ground up, others choose fully automated digital migration solutions. The important thing to remember, however, is that organizations shouldn’t be replacing legacy apps just for the sake of replacing them, Dayaratna explained.

“Every legacy app doesn’t need to be modernized. There would need to be some kind of pain point or some kind of [issue] that modernization improves,” he said. “The best practice would be to focus energy [on] modernization efforts and initiatives [that] are contained and specific to certain goals and areas of improvement with respect to an application.” 

“There’s a lack of understanding from companies that you don’t have to rip and replace all of your applications. You don’t have to do a major overhaul with COBOL applications that have been running for years. As we say, working code is gold. And as you want to improve those applications or change them, you can do that right on the mainframe. You can do that with COBOL,” David Rizzo, VP of product engineering at Compuware, added.

According to recent research from the consulting firm Deloitte, the drivers for legacy app modernization include the high cost of maintaining legacy applications, systems and infrastructure, as well as a shortage of employees who are skilled in legacy languages such as COBOL and Natural.

Additional drivers include legacy applications whose functionality takes too long to update, and teams being prevented from working on other parts of the application while an update is underway, according to Dayaratna.

In order to address these issues, teams are trying to leverage Agile integration. However, the lack of knowledge about where to start, the steep learning curve, and the complexity end up being major challenges for organizations. In addition, there aren’t many tools yet that tackle modernization, Dayaratna added. 

“There is plenty of tooling to support developing new applications that are container-based or that are microservice-based… [but] there’s less tooling that is focused explicitly on transforming or modernizing legacy applications,” he said. 

Additionally, Dayaratna pointed out that it’s not easy to re-architect an application in a way that makes it more suitable for modern development when there is a legacy codebase and an application with, say, 10 million lines of code. That’s why, Dayaratna explained, most instances of legacy app modernization occur in an incremental fashion.

“Let’s take a health insurance company that has a legacy app that processes claims. The claims come in and they have the patient’s date of birth, social security number, insurance identification, and different diagnosis codes that need to be processed. What the company will do is modernize that part of the application. It’s rare that an organization will transform an application wholesale. Now over time, they may end up revamping it completely, but that will take some time. That’s a massive, gradual process,” Dayaratna said.
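The incremental path Dayaratna describes is often implemented as a strangler-fig pattern: only the modernized slice of functionality is routed to new code, and everything else continues down the legacy path. A minimal sketch of that routing idea (all names here are illustrative, not from the article):

```python
# Strangler-fig routing sketch: steps that have been modernized are
# handled by new code; every other step falls back to the legacy
# processor. All step and function names are hypothetical.

MODERNIZED_STEPS = {"validate_dob", "validate_member_id"}

def legacy_process(claim, step):
    # Stand-in for a call into the existing mainframe/COBOL path.
    return f"legacy:{step}:{claim['claim_id']}"

def modern_process(claim, step):
    # Stand-in for the re-architected service handling this step.
    return f"modern:{step}:{claim['claim_id']}"

def process_claim(claim, steps):
    """Run each processing step, routing modernized steps to new code."""
    results = []
    for step in steps:
        handler = modern_process if step in MODERNIZED_STEPS else legacy_process
        results.append(handler(claim, step))
    return results

claim = {"claim_id": "C-1001", "dob": "1970-01-01", "diagnosis": ["E11.9"]}
print(process_claim(claim, ["validate_dob", "adjudicate"]))
```

As more steps are modernized they are added to the routing set, and the legacy path shrinks gradually rather than being replaced wholesale.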

Mainframe modernization is essential
Although mainframes may seem like a blast from the past, they are still front and center when it comes to business operations and remain a constant in an age of vast technological transformations. 

Even as organizations move towards microservices and cloud-native applications, monolithic, legacy, and waterfall-based applications still power significant parts of the business, and modern CIOs don’t want to remove older systems that are working well, according to Dayaratna.

According to research from IBM, which makes one of the most widely used mainframe platforms (IBM Z), mainframes are used by 71% of Fortune 500 companies. In addition, mainframes handle 87% of all credit card transactions and 68% of the world’s IT workloads.

“The number one challenge that virtually every enterprise has faced, or is currently addressing, is the fact that over 80% of enterprise data sits on a mainframe, while modern endpoints (phones, tablets) are not part of the mainframe model. So the actual challenge is that digital transformation initiatives must ensure that enterprise data/services be made available to modern endpoints with the digital experience that the customer expects, but with security built-in to address CSO/regulatory mandates,” Bill Oakes, head of product marketing for API management and microservices at Broadcom, and David McNierney, product marketing leader at Broadcom, wrote in an email to SD Times. 

As organizations undergo a data migration project, there are many other challenges they will have to consider, including the amount of downtime required to complete the migration, as well as business risks due to technical compatibility issues, data corruption, application performance problems, data loss, and cost.

While there is still little tooling for managing and modernizing mainframes, the tools that are available are making significant strides. “One of the interesting things that’s happening now in the industry is that many vendors are bringing modern development practices to mainframes and enabling modern application development to be performed on mainframe-based apps, which was largely not possible before,” IDC’s Dayaratna said.

Historically, developers were not able to use a browser-based IDE to develop code for the mainframe or apply autocomplete and automated debugging capabilities. Now that the tooling is catching up to modern development methods, there will be more options for Agile development and DevOps on mainframe-based architectures, according to Dayaratna.

“What [bringing modern practices] allows you to do is [ensure that], like every other application and every other platform within the enterprise, the mainframe has the same abilities to be developed in an Agile way, using modern DevOps tools, using modern IDEs. When you do that and modernize your development environment for the mainframe, that allows you to work with the other platforms and allows for every platform across the enterprise to be looked at in the same way. It can then be integrated into the same deployment pipeline so that you can move code through whichever platform is being deployed to,” said Compuware’s Rizzo.

Legacy modernization is a challenge
“Modernization is all about scaling and being able to form the foundation for new data sources, real-time analytics and machine learning,” said Monte Zweben, CEO and co-founder of Splice Machine.

While some organizations decide to take drastic measures by moving their legacy applications to the cloud, this isn’t always optimal, Zweben said. Sometimes it results in some improvements, but at a great cost. 

“The app [in the cloud] is the same application that was running beforehand on premises. In short, it’s a little bit more Agile because you can bring it up and down pretty quickly and lower some operational costs. But in the end, the cost of that application can even go up in the cloud and the application hasn’t changed,” Zweben said. “We’re at the beginning of migration and modernization. [One] of my predictions for 2020 is that there’s going to be huge cloud migration and disillusionment. What I mean by that is that the first step [is] where everyone seems to leap [into] cloud migration. I think they’re going to be disappointed with what they get out of that.”

Compuware’s Rizzo advised organizations to integrate with a hybrid cloud solution where some of the application’s functionality that talks to the mainframe is running in the cloud. 

These enterprises are able to benefit from all the years of investment they’ve put into their mainframes, and they also get to utilize the platform in their data center, which is the most secure option and the best at handling high-volume transaction processing. This way they keep the best of both worlds while using modern tools, Rizzo explained.
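One common shape for the hybrid setup Rizzo describes is a cloud-hosted front end that keeps presentation logic in the cloud but delegates the actual transaction to a service exposed by the on-premises mainframe. A minimal sketch, with the gateway injected as a callable so the cloud component stays testable (all names and the request format are hypothetical):

```python
# Hybrid-cloud sketch: the cloud tier formats the request and lets the
# mainframe do the transaction processing. The record of truth never
# leaves the data center; only the result returns to the cloud tier.

def handle_balance_request(account_id, mainframe_gateway):
    """Cloud-side handler that delegates to an on-prem mainframe service."""
    raw = mainframe_gateway({"op": "BALANCE_INQ", "account": account_id})
    return {"account": account_id, "balance": raw["amount"], "source": "mainframe"}

def stub_gateway(request):
    # Stand-in for the real mainframe-facing gateway (e.g. an API layer
    # in front of CICS or IMS); returns a canned amount in cents.
    assert request["op"] == "BALANCE_INQ"
    return {"amount": 12550}

print(handle_balance_request("A-42", stub_gateway))
```

Injecting the gateway rather than hard-coding a network call is what lets the cloud portion be developed and tested with the same modern practices as any other service.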

“The good ones that are being progressive or being stewards of their platforms or their industry are looking at modernizing those applications as far as being able to integrate with the modern toolchain and being able to continue to develop them, and they’re looking at ways they can continue to keep those applications running as efficiently as they are and continue to support their customers,” Rizzo said. “So they’re looking at the future. And that’s where it’s key that they understand that they can keep them on the mainframe.”

Other organizations are taking the extreme approach of rewriting a legacy application completely on a new platform in order to incorporate new data sources, use machine learning models and scale, said Zweben.

“We think [that] is a huge mistake because with Splice Machine, you can keep the legacy application whole. We just replace the data platform underneath it, like the foundation, and avoid the rewrite of the application. This saves enormous amounts of time and money in that migration process and actually avoids real risk, because you have to quality assure much less when you’re just replacing the foundation rather than replacing the whole application,” said Zweben.

Managing the technology to work in sync has also been a struggle for organizations. 

“You had to bring operational data platforms to the table, either NoSQL systems or relational databases. You would have to bring [an] analytics engine. Typically they’re either data warehouses or Hadoop-based compute engines, [Spark]-based compute engines, and you bring machine learning algorithms into it. And this takes a lot of heavy lifting [and] distributed systems engineering,” Zweben said. “And that’s hard. This is really hard stuff to configure and make work and operate.”

By using automated migration tools, companies will be able to migrate to any cloud far more easily and open up new operating entities anywhere in the world on a dime, according to Zweben.

API management is also a core component of digital transformation across every industry.

“Digital initiatives based on APIs are all about providing scalable, reliable connectivity between data, people, apps and devices. To support this mission, experienced architects look for API management to help them solve the challenge of integrating systems, adapting services, orchestrating data and rapidly creating modern, enterprise-scale REST APIs from different sources,” Oakes and McNierney said. 
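A concrete example of the “adapting services” work Oakes and McNierney mention is translating a fixed-width legacy record, of the kind a COBOL copybook might define, into the JSON a REST API returns. A minimal sketch, with a hypothetical record layout:

```python
import json

# Hypothetical copybook-style layout: (field name, start, end) offsets
# into a fixed-width mainframe record.
LAYOUT = [("member_id", 0, 8), ("dob", 8, 16), ("diag_code", 16, 22)]

def record_to_json(record: str) -> str:
    """Slice a fixed-width record into named fields and emit JSON."""
    fields = {name: record[start:end].strip() for name, start, end in LAYOUT}
    return json.dumps(fields)

# An 8-char member ID, an 8-char date of birth, and a padded diagnosis code.
print(record_to_json("M1234567" "19700101" "E11.9 "))
```

In practice an API management layer does this kind of mapping (plus authentication and rate limiting) at the gateway, so the mainframe service itself stays untouched.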

Another problem that needs to be solved is that there are groups in organizations working in silos. 

Zweben gave the example of an organization that wants to implement AI in its applications: it will typically have an AI/ML specialist group working off to the side, when the key, he said, is instead to put AI people onto every team.

“Every enterprise is dealing with ongoing digital disruption and transformation in order to remain competitive in their market – regardless of the market,” Oakes and McNierney wrote. “Mainframe organizations understand the need to make the platform less of a silo and more like other platforms so they have modernization-in-place initiatives underway. Even career mainframers, who have built and maintained the mainframe-based systems of record that are the lifeblood of their organizations, understand this need.”