DevOps and mainframes are like peanut butter and chocolate—each great on its own, but both better together. When organizations move to DevOps, they often struggle with integrating mainframe applications into the new process. Too often, they assume they'll need to go through all sorts of gyrations to incorporate the mainframe applications and data. But there's a better way.
It starts with cultural change. Organizations are often bifurcated into the team supporting traditional core applications on the mainframe and the team supporting everything else. The team for everything else—cloud, distributed applications, front-end user interfaces, the website—tends to be much more integrated into broader modernization efforts.
When an organization moves to DevOps, the effort is usually led by the non-mainframe team, with little involvement from the team managing the mainframe-based core business applications. Years of assumptions have led many to believe that the mainframe cannot participate in those processes.
I find that to be a very interesting cultural phenomenon. When I do guest lectures about unconscious bias and diversity in IT, I make the point that we have an unconscious bias when it comes to mainframes and COBOL. We usually don't consider or challenge our assumptions that those applications cannot participate in this new world and the only way forward is to eliminate them.
But Sudhanshu Dubey, application development associate at Accenture and a former intern with the Open Mainframe Project, has proved that it takes minimal effort to update those applications. In three months, Dubey took a monolithic COBOL banking application and refactored it to allow new user interfaces and integration into cloud services.
He used a process to expose the APIs and microservices. Although Dubey's project was not specific to DevOps, it did lay the foundation for DevOps, agile practices, and all the rapid innovation that the business desired.
As Dubey has shown, you can do this in four steps. These are the same four steps you would use to modernize an application written in any programming language. Here's the process.
1. Analyze the COBOL application
Core COBOL applications, written many years ago, were often created with now-outdated programming practices and are poorly documented. Sometimes these applications were developed before industry-standard programming practices were even defined.
In addition, developers had to work around historical hardware and software restrictions. The resulting code is often referred to as spaghetti code. And it's monolithic—a single application that handles multiple tasks.
In the case of the banking application Dubey modernized, one program updates all customer records. When customers transfer money in, transfer money out, or check their balance, every transaction goes through this one giant program. Because you have this huge monolith of code, you must update and manage the entire application, even if you're changing just one function.
In the world of microservices, all of these functions would be separate applications, allowing you to innovate a single function—money transfers, for example—without having to change or test the entire application. Having fewer, isolated changes to test makes DevOps processes easier and faster.
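The contrast can be sketched in a few lines of Python. This is purely illustrative—the banking application itself is COBOL, and the function and field names here are hypothetical stand-ins, not Dubey's actual code:

```python
# Monolithic style: one program dispatches every banking function,
# so any change to one branch means retesting the whole program.
def handle_request(action, account, amount=0):
    if action == "deposit":        # transfer money in
        account["balance"] += amount
    elif action == "withdraw":     # transfer money out
        account["balance"] -= amount
    # a balance inquiry (or any other action) falls through
    return account["balance"]

# Microservice style: each function is its own independently
# deployable unit, so changing deposits never touches withdrawals.
def deposit_service(account, amount):
    account["balance"] += amount
    return account["balance"]

def withdraw_service(account, amount):
    account["balance"] -= amount
    return account["balance"]

def balance_service(account):
    return account["balance"]
```

The behavior is identical in both styles; what changes is the unit of deployment and testing.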
The first step is to understand what's happening inside that monolithic application, and that's where "analyze your COBOL application" comes in. Today, software products are available to go through a COBOL application and help you map out where the variables are being used, what functions are happening, and where those interfaces are.
Once you have that map, you can move on to the next step.
2. Identify services candidates
In this step, you identify the area of code that you would like to turn into a microservice. The map will tell you what other functions in the application will be affected when you move the desired microservice code out of the monolith.
Once you've identified these service candidates and you've looked at the effect of moving that code out, you can build your plan. Then it comes down to isolating and exposing those services. And this is where programming is involved.
3. Isolate and expose services
With a giant monolith of code, you are relying on if-then-else statements (and sometimes goto statements) to navigate through to the specific function you need. When isolating and exposing services, you are taking the steps for specific functions out of the monolith so they can be called from the master program when needed.
The resulting programs are smaller, easier to understand, and easier to test and deploy in a DevOps environment.
The actual code needs to be separated and packaged into separate programs for deployment. That involves work, since you must slice the code for the service out of the monolith and then update the monolith to call the new service via new APIs. Although programming effort is involved, it is certainly less effort than rewriting the entire application. This is the point of this method: less work and less risk for the same result.
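A minimal sketch of that refactoring, again in illustrative Python rather than COBOL: the balance-inquiry code is sliced out into its own program behind a JSON contract, and the remaining monolith calls it through that contract instead of branching to inline code. The customer IDs, payload shape, and lookup table are all hypothetical:

```python
import json

# The code sliced out of the monolith, now a separate program
# exposed behind an API. The JSON contract is the new interface.
def balance_inquiry_service(request_json):
    request = json.loads(request_json)
    # In production this would read the customer record store;
    # a hypothetical lookup table stands in for it here.
    balances = {"C001": 250.00, "C002": 75.50}
    balance = balances.get(request["customer_id"], 0.0)
    return json.dumps({"customer_id": request["customer_id"],
                       "balance": balance})

# Inside the remaining monolith, the old inline if-then-else branch
# is replaced with a call to the service through its API.
def monolith_check_balance(customer_id):
    response = balance_inquiry_service(
        json.dumps({"customer_id": customer_id}))
    return json.loads(response)["balance"]
```

Once the contract exists, the service can be tested and deployed on its own schedule, which is exactly what a DevOps pipeline needs.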
4. Connect microservices to a new, modern UI
In Dubey's case, the last step was creating a new modern interface. He was able to implement a new front end more easily by accessing the separate services via APIs. Many people assume that COBOL code doesn't integrate well with new languages such as .NET or newer cloud services. But it does.
When that COBOL code is broken into microservices with documented APIs, it looks like any other application from the outside. An API is an API; it doesn't matter if the code behind it is COBOL or Java.
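That point can be made concrete with a small sketch, using hypothetical stub functions in place of real COBOL and Java back ends: the front end is written once against the API contract and cannot tell which language produced the response.

```python
# Two hypothetical back ends behind the same API contract.
def cobol_backed_api(customer_id):
    # Stand-in for a COBOL program exposed as a service.
    return {"customer_id": customer_id, "balance": 250.00}

def java_backed_api(customer_id):
    # Stand-in for a Java service with the identical contract.
    return {"customer_id": customer_id, "balance": 250.00}

def front_end_show_balance(api, customer_id):
    # The front end depends only on the contract, not the back end.
    result = api(customer_id)
    return f"Balance for {result['customer_id']}: {result['balance']:.2f}"
```

Swapping one back end for the other requires no change to the front end at all.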
Those same APIs help mainframe applications migrate to a DevOps process. For DevOps, you want an environment where new functions require updates to the smallest number of programs possible, which limits the amount of testing required and speeds deployment.
By breaking the monolith up into these microservices, you've reduced the amount of testing required for each change. By creating APIs, you've now allowed it to integrate more easily into modern test processes.
Reap the benefits
Modernizing core applications means you can take advantage of all the benefits of modern programming environments—DevOps, continuous integration and continuous delivery (CI/CD), and modern front ends. And you get all of that without having to rewrite your entire core infrastructure.
That's a big benefit. It allows organizations to deploy new functions without having to rewrite all of those monoliths in a new programming language, which can cost millions of dollars only to deliver the exact same customer experience in the end.
You wouldn't tear down your entire house just to build an exact replica with new, modern construction material. That's obviously a waste of money. Similarly, if the applications are still delivering value to the business, you shouldn’t be looking to rewrite them. If you need new features, you refactor and update the code as necessary, just as you would remodel a home when needed.
The moral of the story: You can remodel your code to better interface with new programming standards, new techniques, and new technologies without rebuilding it from the ground up.