How to Empower Application Teams With Automated Db2 Testing

Over the last several blogs in this series, we have discussed Broadcom Mainframe Software's open, shift-left approach: giving developers self-service capabilities for Db2 schema changes to remove some of the burden from the mainframe Db2 DBA, and managing DDL as source code.

We are now going to discuss data and schema provisioning in test, along with automated testing tools. The importance of automation cannot be overstated: it is one of the primary drivers of shifting left to reduce costs, and it is key to implementing CI/CD in support of DevOps. Beyond automation for orchestration, we need test automation, and for test automation to assure product quality, we need realistic data to be available.

In addition to schema provisioning, you can use our plugin to script the migration of data and schemas from one Db2 for z/OS subsystem to another using these dbm-db2 Zowe CLI plugin commands (a short sketch follows the list):

  • zowe dbm prepare migration. This command prepares a scenario for migrating a list of objects from one Db2 subsystem to another. It also generates the DDL as it will appear on the target after execution.
  • zowe dbm execute migration-script. This command executes the migration script to actually migrate Db2 objects, both schema and data, from one Db2 subsystem to another. The schema can be changed during the migration.
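
As a rough sketch, these two commands might be chained in a script like the one below. The option and file names shown are assumptions for illustration, not documented plugin syntax; run the commands with --help on your system for the exact invocation.

    #!/bin/sh
    # Illustrative sketch only: option and file names are assumptions,
    # not documented dbm-db2 plugin syntax. Verify with --help first.

    # Prepare a migration scenario for the listed objects. This also
    # generates the DDL as it will appear on the target after execution,
    # so it can be reviewed before anything is run.
    zowe dbm prepare migration \
      --source-subsystem DSN1 \
      --target-subsystem DSN2 \
      --object-list objects.txt

    # Execute the generated migration script to move schema and data.
    zowe dbm execute migration-script migration.txt \
      --target-subsystem DSN2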

You can make these commands part of the automated pre- and post-processing scripts used during development, so developers can verify that their changes work against realistic data, and again in a more robust, integrated application test environment, before the changes are pushed to production. You can register to participate in the dbm-db2 Zowe CLI plugin validation here and download the pre-GA code to try at your site today!

Now you can migrate Db2 data and schemas to your test environment, but what tools can you use to automate the testing itself? Many are already available in your organization's distributed DevOps environments.

You also need to be able to manipulate data on the mainframe to ensure you can test in that environment. Test4z is a good fit for this job: it lets you call REST APIs to submit a batch job and to perform data manipulation on z/OS datasets, with operations such as compare, search, create, snapshot, and update. You can include these calls in your automation framework's test scripts, for example, submitting a batch job to run a test and then checking the resultant dataset output.
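
To illustrate the pattern (the host, endpoint paths, and payloads below are hypothetical placeholders, not Test4z's documented API; the GitHub script linked below shows the real calls), a test step might look like this:

    #!/bin/sh
    # Hypothetical sketch of the Test4z pattern: submit a batch job via
    # REST, then compare the output dataset against a stored snapshot.
    # Host, endpoints, and payloads are placeholders, not the real API.

    T4Z="https://test4z.example.com/api"   # placeholder host

    # Submit the batch job that exercises the application under test.
    curl -s -X POST "$T4Z/jobs" \
      -H "Content-Type: application/json" \
      -d '{"jcl": "HLQ.TEST.JCL(RUNTEST)"}'

    # Compare the resulting output dataset with a baseline snapshot and
    # fail this test step if they differ.
    curl -sf -X POST "$T4Z/datasets/compare" \
      -H "Content-Type: application/json" \
      -d '{"dataset": "HLQ.TEST.OUTPUT", "snapshot": "baseline-001"}'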

For an example of how to implement automated testing with Db2, take a look at this Test4z Db2 test script on GitHub. You can engage mainframe experts initially to set up the batch jobs that need to run on the mainframe to assure application quality. Once set up, the application team can have these batch jobs triggered via Test4z and their automation frameworks as code changes move through the DevOps pipeline. In this way you empower the application teams, ensure the correct testing is performed, and free up the mainframe experts' time.

An additional approach to consider is to add automated tests leveraging Endevor processors, either as a first step or as a complement to the open-first approach to mainframe test automation that Test4z provides, as described here.

You can also use Broadcom's Mainframe Application Tuner (MAT). An automated application pipeline typically starts with a developer finishing changes, committing code, and then performing a build. Automation then deploys the executables to the test environment, and automated testing is run. You can add a benchmarking check at this point to ensure that you don't introduce a performance issue.

The MAT Detect CLI helps you script the performance check into your pipeline so you can notify your team when there is an issue, for example by sending an email alert.
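
As a minimal sketch of such a gate (the MAT Detect invocation shown is a stand-in, since the exact CLI syntax depends on your installed plugin version; consult its help output for the real commands):

    #!/bin/sh
    # Sketch of a pipeline performance gate. The zowe invocation below is
    # a stand-in, not documented MAT Detect CLI syntax; check --help.

    JOB=PAYROLL1                    # job under test (example name)
    TEAM=dev-team@example.com       # placeholder address

    # Ask MAT Detect to evaluate the test job against its baseline.
    if ! zowe mat-detect check --jobname "$JOB" > mat-report.txt; then
      # Regression detected: alert the team and fail the pipeline stage.
      mail -s "Performance regression in $JOB" "$TEAM" < mat-report.txt
      exit 1
    fi
    echo "No performance regression detected for $JOB"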

When the development team knows that the latest commit has introduced an issue, they can use the MAT Analyze plugin to diagnose the issue and determine where in the application code the problem lies.

Broadcom MAT also includes an open-source plugin that helps you easily add this benchmarking stage to your Jenkins pipeline. It installs or refreshes the components needed for automation, creates email reports and notifications, and lets you start getting value from early performance benchmarking with a few clicks.

So now you know, at a high level, how to automate the provisioning of schemas to enable developers while maintaining DBA control, manage DDL as source code, migrate data as well as schemas for test, integrate mainframe data manipulation into automated tests, and use Broadcom MAT in your pipeline to detect performance issues before you reach production.

In our next blog, we will discuss how to elevate Db2 monitoring to SRE tooling in a DevOps environment, so that we know the code we push to production does not cause any performance degradation, or, when it does, we know about it as soon as possible.

You can learn more about our Broadcom Db2 for z/OS DevOps story here.

Please reach out to our Db2 for z/OS experts using Db2-Experts.pdl@broadcom.com with any questions you may have specific to Db2 DevOps.

Hope you are well and staying safe!


Published in Modern Mainframe

Mainframe modernization for the next generation of application developers using Visual Studio Code (VS Code), Git, and DevOps automation — automated build, test, deployment of COBOL applications & more. Features articles on Code4z, Test4z, Team Build and Endevor Bridge for Git.

Written by John Benbow, Sr Manager, Product Owners, Data Management Value Stream @broadcom