The first Maserati was introduced in 1926. The first Ferrari was introduced in 1947. The first Porsche was introduced in 1948. And my personal favorite, the first Land Rover, was also introduced in 1948.

What do all of these legendary cars have in common?

Each predates the mainframe and COBOL, yet no one calls these cars outdated. Why? Because they have continually evolved—embracing modern engineering, cutting-edge technology, and innovation while maintaining the efficiency, performance, reliability, and excellence they were built on. The same is true for the mainframe.

Yet, despite decades of continuous transformation, some critics still cling to the myth that mainframes are outdated, inefficient, and unable to integrate with modern IT systems. This couldn’t be further from the truth. IBM’s z16, introduced in 2022, was built for AI, and the z17, due to launch this year, is poised to handle new workloads with unparalleled security, scalability, and efficiency. COBOL, the backbone of critical applications, is as easy to use as any modern programming language when paired with the right tools.

The problem isn’t the mainframe—it’s how we’ve managed and transformed the applications running on it. Instead of walking away from the most reliable, secure, and high-performing computing platform in history, we should focus on how it’s evolving to support new workloads, AI-driven insights, and hybrid cloud integration.

A Rapidly Modernizing Space

The mainframe isn’t standing still. It’s taking on more mission-critical workloads than ever, supporting everything from AI-powered fraud detection to high-speed financial transactions. In fact, a whopping 72 percent of the world’s production IT workloads run on mainframes, while the platform accounts for just 8 percent of IT costs.

Mainframe transformation involves two things. First, development teams need to harness mainframes’ computing power, scale, and data storage capabilities. Second, they need those mainframe systems to align with the automation capabilities that their cousins in the cloud have adopted, making the mainframe software development life cycle more efficient, eliminating manual processes, and increasing the quality and velocity of legacy application delivery.

DevOps workflows alone won’t get us there, but tools are bridging the gap. 

When it comes to tools, shops need mainframe code to be managed just like cloud or distributed applications, enabling continuous integration/continuous delivery (CI/CD) pipelines, automated testing, and version control while maintaining compatibility with legacy environments.
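To make that concrete, here is a minimal sketch, written in Python purely for illustration, of what one stage of such a pipeline might look like. The compiler and test commands (cob-compile, cob-test) and the directory names are hypothetical placeholders rather than real tools; the point is simply that versioned COBOL source can flow through the same automated build-and-test gates as any distributed application.

```python
# Hypothetical pipeline stage for mainframe code managed like any other application.
# The "cob-compile" and "cob-test" commands below are illustrative placeholders,
# standing in for whatever build and test tooling a given shop actually uses.
import subprocess
import sys


def run(step: str, cmd: list[str]) -> None:
    """Run one pipeline step and stop the build on a non-zero exit code."""
    print(f"--- {step} ---")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"{step} failed; halting the pipeline.")


if __name__ == "__main__":
    # 1. Pull the latest application source from the shared Git repository.
    run("checkout", ["git", "pull", "--ff-only"])
    # 2. Compile the COBOL programs (placeholder compiler command).
    run("build", ["cob-compile", "src/"])
    # 3. Run automated unit tests before anything is promoted to a test environment.
    run("test", ["cob-test", "tests/"])
    print("All steps passed; ready to promote to the next environment.")
```

The same gate could just as easily live in whatever CI server an agency already runs; the language of the script matters far less than the discipline of treating mainframe code as versioned, tested, and automatically promoted.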

Culture and the developer experience also play an important role in mainframe transformation. If the day-to-day experience for engineers is subpar, efficiency gains are unlikely to follow. Removing manual bottlenecks, reducing or eliminating context switching, streamlining archaic development processes, and adopting an agile culture are all practical ways to improve the developer experience.

Fine-Tuning the Mainframe for Government Efficiency

Customers I talk to often describe three very different—but equally valid—paths for fine-tuning their mainframe strategy. Some government agencies choose a slow-and-steady approach, expanding their mainframe footprint over time as needs evolve. “Our workloads are growing as our population grows,” one CIO told me. “We’re not moving off the mainframe—we’re growing with it.” For these agencies, there’s a natural rhythm of growth that doesn’t require radical change, just thoughtful investment as usage expands.

Others are leaning into modernization by refactoring the code itself. With the help of generative AI-powered code assistants, customers are telling me they’re finally able to tackle decades-old applications with confidence. These tools explain unfamiliar code in plain language, document it automatically, and suggest best practices for making changes. For government teams with limited access to senior mainframe developers, this new level of code intelligence is helping bridge the skills gap and enabling faster, safer transformation of core applications.

And then there are the agencies doubling down—reinvesting in the mainframe by upgrading to the latest zSystems and embracing DevOps practices across the board. “If we can do it on the distributed side, we should be able to do it on the mainframe,” one agency leader told me. By staying current, these organizations reduce technical debt, support modern development tools, and ensure seamless integration into their enterprise-wide DevOps workflows.

Future-Proofing the Mainframe

The developers working with mainframes are also excited about their future. A 2024 Forrester report found that “among global infrastructure hardware decision-makers, 61% said that their firm uses a mainframe. Of those that use mainframes, 54% indicated that their organization would increase its use of a mainframe over the next two years.”

There’s also a wide ecosystem of vendors building tools to modernize the mainframe.

That is why you see more and more talk about artificial intelligence, graphical scanning, and mapping tools that parse, map, and refactor legacy and monolithic code bases into more manageable assets. AI also helps organizations onboard new developers quickly, getting them familiar with the code base and productive sooner. Developers can pinpoint necessary changes faster, reducing planning time and accelerating updates.

These trends are promising, and I have no doubt they will allow government services to harness the mainframe’s data storage and processing power while also adopting the agility that has been the hallmark of Silicon Valley.