Application Modernization on mainframes: What are we talking about?

David Rice
Published in Theropod
Oct 26, 2022

🖥 🖥 🖥 Application Modernization on z/OS can be broken down into different subcategories. Even then, it is rather a big piece to chew on. To break it down into more digestible pieces, we will share some of our experiences with you.

Let’s start with something obvious: programming languages and the compiler versions.

It is often said that COBOL and PL/I are legacy languages and should be replaced, but that needs some further investigation.

We have encountered situations where an organization rewrote a COBOL application in Java and placed it off-platform, only to find out that the cost dramatically increased. After examining the situation, this organization found that the off-loaded application was extremely chatty with Db2. Getting data from Db2 for z/OS from the distributed world is not offloaded to the specialty processors (zIIPs) and therefore caused a significantly higher bill. On top of a costly and risky project, the projected savings turned out to be negative, leaving this organization in serious financial trouble.

Let’s get to the point: we are not claiming that all the applications that run on IBM zSystems should stay there forever. Presentation layers and other workloads that aren’t involved in transactions are candidates to be examined for off-loading. That said, the transactional software written in COBOL or PL/I is so efficient that it would be very difficult to rewrite it in a language that outperforms the existing code. What can help to improve performance and security even further is the use of the latest compiler versions, something we don’t see every organization picking up. It might come as a surprise, but the COBOL and PL/I compilers are on a continuous delivery model to make sure that source code can benefit from the latest technologies available in the underlying operating system and hardware. Improved response times, lower costs, and security enhancements are among the benefits.
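As a small, hedged illustration: recent Enterprise COBOL releases let you tell the compiler which hardware generation to exploit through options such as ARCH and OPTIMIZE, for example on a CBL (PROCESS) statement at the top of the source. The program below is a trivial placeholder, and the option values are examples only; check them against your compiler release and the oldest machine the module must run on.

       CBL ARCH(14),OPTIMIZE(2)
      * ARCH(14) generates code that exploits IBM z16 instructions;
      * OPTIMIZE(2) requests the highest optimization level. Both
      * values are illustrative and depend on your compiler release.
       IDENTIFICATION DIVISION.
       PROGRAM-ID. ARCHDEMO.
       PROCEDURE DIVISION.
           DISPLAY 'Compiled with current compiler options'.
           GOBACK.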

Middleware updates are another area that can help modernize an application. Just like the compiler versions, many of the new hardware enhancements can benefit the middleware layer of your application. This may seem like an obvious point, but there are many applications that are still running old versions of Db2, CICS or IMS that aren’t leveraging the latest features. This again leads to higher costs and latencies.

Another piece of low-hanging fruit is to utilize the features of the specialty processors on the mainframe. For many years we have had Java workloads running on the IBM z Integrated Information Processors (zIIPs), making the mainframe by far the fastest Java runtime on the planet. What might be seen as a well-kept secret is that this is not limited to Java. As mentioned above, newer compilers on z/OS also make use of the specialty engines to increase runtime performance. In certain cases, it can be advantageous to refactor some of the code to better utilize these specialty processors. And they aren’t limited to language support: for some time now, Integrated Facility for Linux (IFL) processors have been available that can host Linux and z/OS Container Extensions (zCX), including Red Hat OpenShift. Having those accessible opens a lot of opportunities for more modern DevOps and end-user experiences.

With the latest machine, the IBM z16, an Artificial Intelligence (AI) co-processor is embedded in the processor, making it possible to bring AI inference into response-time-sensitive transactions. This can enable customers to leverage the results of AI inference to better control the outcome of transactions before they complete.

Data Virtualization, Machine Learning, and REST API-enabled CICS/IMS transactions are other domains that can really help move digital transformation projects forward. Let’s look at a real situation from our own work inside IBM. In one domain, we are running well over 600 applications on z/OS. Like all software, they need to be updated, and like many of our customers, we find that the original authors of the code are often no longer around to help out. To get better insight into our codebase, we use tools like IBM Application Discovery and Delivery Intelligence (ADDI) to examine the source code. As we start utilizing these tools, they are proving very helpful in pointing out areas of the code that might be eligible for REST APIs or that contain hard-coded data-access routines.

Recently we were involved in a situation at an organization where the business users were complaining about the performance and user interface of a web application. We sat down with the owners of the chain of applications that, in the end, landed on a COBOL/CICS application. It turned out that the inflexibility was caused by a 3270 screen-scraping application that was gathering data from a BMS map. Over time, the application owners wanted to add ‘new’ things like email addresses, but the source of the COBOL/CICS BMS map had been lost (side note: before starting any modernization effort, make sure you are using a modern source code management system like Git). The tricks they used to get email addresses and other information into the end-user experience were impressive, but did not perform well, and the word “spaghetti” crossed our minds many times.

With software scanning tools, we could identify the relevant piece of code in the COBOL application while skipping the BMS mapping. We were able to add fields to the COMMAREA and open up the application by means of a REST API. Performance improved significantly, but more importantly, the path from the user interface into the back-end application was so much neater, faster and, err… more modern!
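As a minimal sketch of the idea (field names here are hypothetical, not the actual application’s), the extended COMMAREA copybook looked something along these lines, with the new field simply appended and surfaced through the REST interface rather than through a BMS screen:

      * Hypothetical COMMAREA copybook, illustrative names only
       01  CUSTOMER-COMMAREA.
           05  CA-REQUEST-ID        PIC X(08).
           05  CA-CUSTOMER-NUMBER   PIC S9(9) COMP.
           05  CA-CUSTOMER-NAME     PIC X(40).
           05  CA-RETURN-CODE       PIC S9(4) COMP.
      * New field added for the REST-enabled path; the 3270/BMS map
      * is never touched
           05  CA-EMAIL-ADDRESS     PIC X(60).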

A second use case we bumped into was an organization that had, like many others, hardcoded the data access into their many COBOL modules. Whenever they wanted to change their Db2 schema (adding, deleting, or altering a table), they needed to go through those many COBOL modules, change the code, test it, and put it into production. Sometimes that was too risky and too time consuming. We created one additional data-access module in COBOL that acts as a proxy into the Db2 environment. Over time, the other COBOL modules could be altered to use the ‘new’ interface, which gave the organization the flexibility to change only that proxy code whenever they wanted to alter the schema. Additionally, the proxy was made available by means of REST APIs, giving yet another way of integrating with data on z/OS.
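Below is a stripped-down sketch of what such a proxy module can look like. Program, table, and column names are made up for illustration, and the real module handled many more operations plus proper error handling; the point is that callers pass a small request area and never embed SQL themselves, so a schema change only touches this one module.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. DAPROXY.
      * Hypothetical data-access proxy: all SQL lives here, so a
      * Db2 schema change is isolated to this single module.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       LINKAGE SECTION.
       01  DA-REQUEST.
           05  DA-FUNCTION          PIC X(04).
           05  DA-CUSTOMER-NUMBER   PIC S9(9) COMP.
           05  DA-EMAIL-ADDRESS     PIC X(60).
           05  DA-SQLCODE           PIC S9(9) COMP.
       PROCEDURE DIVISION USING DA-REQUEST.
           EVALUATE DA-FUNCTION
             WHEN 'READ'
      *        Illustrative query; table and column names are made up
               EXEC SQL
                 SELECT EMAIL_ADDRESS
                   INTO :DA-EMAIL-ADDRESS
                   FROM CUSTOMER
                  WHERE CUSTOMER_NO = :DA-CUSTOMER-NUMBER
               END-EXEC
               MOVE SQLCODE TO DA-SQLCODE
             WHEN OTHER
               MOVE -1 TO DA-SQLCODE
           END-EVALUATE
           GOBACK.

Over time, each existing module can then be changed to CALL the proxy with its request area instead of issuing its own SQL, and the same request area maps naturally onto the REST API mentioned above.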

These are very simple examples, and the real hard work lies in the more complex applications. Yet a good way to start opening up the platform is to pick up a couple of simple projects like these and then build out from there.

Authors:
Frank van der Wal — IBM ZTO Technical Leader, IBM zSystems, Northern European Market
David Rice — IBM CIO Office Lead Developer Advocate for z/OS
