Building Mainframe Metal C and Testing with Jest and Zowe CLI

Dan Kelosky · Published in Zowe · Jan 25, 2019
Metal C (with inline assembly) and sample test report

Here I’ll show an example metal C project and how you might incorporate a non-z/OS testing framework for metal C code. Presumably, you could apply this approach to pure assembler, COBOL, or PL/1.

Before sharing testing details, I’ll describe the build process that I’m using for metal C. If you prefer, skip straight to “Testing Use Case” below.

Starting Point

In a previous post and one even further back, I described a “Hello World” IBM assembler project. Within the project, I created npm scripts to wrap Zowe CLI:

  1. $ npm run allocate → allocate z/OS data sets
  2. $ npm run upload → deploy source code to the z/OS data sets (from Git/GitHub)
  3. $ npm run jcl → build and execute code (using templated JCL)
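For a sense of what these wrappers do, here is a minimal sketch of an allocate-style script. The script path, data-set names, and the choice of the zowe zos-files create data-set-c command are my own illustration, not necessarily how the project wires it up:

```ts
// scripts/allocate.ts: hypothetical wrapper behind `npm run allocate`
import { execSync } from "child_process";

// Assumed data-set names; substitute your own high-level qualifier
const dataSets = [
  "IBMUSER.METALC.CPGM",    // C source
  "IBMUSER.METALC.ASMPGM",  // generated assembler
  "IBMUSER.METALC.LOADLIB", // link-edited load modules
];

for (const dsn of dataSets) {
  // Zowe CLI does the actual allocation; a real script would pick
  // appropriate data-set types and DCB attributes per data set
  execSync(`zowe zos-files create data-set-c "${dsn}"`, { stdio: "inherit" });
}
```

The upload and jcl scripts can follow the same pattern, shelling out to zowe zos-files upload and zowe zos-jobs submit commands.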

All build steps and customization for the project are driven by a JSON config file (shown below).

This same setup will be the base for my metal C project.

Project Improvements

I updated the project config for metal C, and I made a few other changes to the base project:

  • handlebars replaced mustache for extra capabilities
  • the JSON config is now a TypeScript JSON config (for comments, trailing commas, and imports)
  • $ npm run upload now accepts arguments to upload a subset of source files; for example, $ npm run upload -- cpgm/template.c chdr uploads only template.c and the chdr files (see the sketch after this list)
  • a deploy script was added to FTP load modules (via JCL) to other systems
  • a convert script was added to create C structs from assembler DSECTs
  • nodemon was added to upload source to z/OS on an editor save
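As a rough illustration of the upload arguments mentioned above, the script can simply forward whatever follows -- to its upload logic. A sketch (the file layout, data-set names, and member mapping are assumptions, not the project's actual code):

```ts
// scripts/upload.ts: hypothetical, invoked via `npm run upload -- [files...]`
import { execSync } from "child_process";
import { basename } from "path";

// npm passes everything after `--` through to the script in process.argv;
// default to the full source list when no arguments are given
const files = process.argv.length > 2
  ? process.argv.slice(2)
  : ["cpgm/ams.c", "cpgm/amstest.c"];

for (const file of files) {
  // Map a local file like cpgm/ams.c to a PDS member like IBMUSER.METALC.CPGM(AMS);
  // the real script also expands directory arguments such as `chdr`
  const member = basename(file).split(".")[0].toUpperCase();
  execSync(
    `zowe zos-files upload file-to-data-set "${file}" "IBMUSER.METALC.CPGM(${member})"`,
    { stdio: "inherit" }
  );
}
```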

Here’s the new developer experience: editing metal C on my PC, making a change, and uploading to z/OS with a background task, $ npm run watch:

Making a change and saving a C file automatically runs `npm run upload` to transfer my source to z/OS data sets

In the future, I could extend the “watch” process to build and test on save.

Build Configuration

Build dependencies and options are under the control of a JSON config. For example, ams.c contains code to do I/O in metal C. To compile, assemble, and link this program, I add it to my JSON config (as AMS, without the file extension):
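Conceptually, the config looks something like this (the field names and data-set names here are illustrative, not the project's actual schema):

```ts
// config.ts: illustrative sketch only; the real project defines its own schema
export default {
  // data sets created by `npm run allocate` (hypothetical names)
  dataSets: {
    cpgm: "IBMUSER.METALC.CPGM",
    asmpgm: "IBMUSER.METALC.ASMPGM",
    loadlib: "IBMUSER.METALC.LOADLIB",
  },

  // programs to compile, assemble, and link (member names, no file extension)
  programs: [
    "AMS", // I/O routines (ams.c)
  ],
};
```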

It’s kind of like tsconfig for z/OS source (I’m a fan of this model 😃).

Testing Use Case

I have a file, ams.c, in my project that was built to perform I/O operations in metal C. It contains these functions:

  • openOutputAssert → open a DCB for output and abend if open fails
  • openInputAssert → open a DCB for input and abend if open fails
  • readSync → read in a record from an open DCB
  • writeSync → write a record to an open DCB
  • closeAssert → close a DCB

To test my functions, I created a test driver program to call them (amstest.c):

Metal C code to open input and output DCBs, write the input contents to the output file, then close both

This code is using BSAM under the hood and must run on z/OS, but my tests don’t have to.

Testing Frameworks

Before writing tests using amstest.c, I’ll need a testing framework. That is, something where I can write individual test cases (with assertions), run them, and get a report about what works and what doesn’t.

Here’s a skeleton example of what I’m referring to using a Jest test suite:

Skeleton Jest test suite
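A minimal version of such a skeleton might look like this (the test descriptions are placeholders):

```ts
// amstest.test.ts: skeleton only, real assertions come later
describe("AMSTEST I/O driver", () => {
  it("should assemble, link, and run the test program", () => {
    // assertions go here
  });

  it("should write the input records to the output data set", () => {
    // assertions go here
  });
});
```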

The outer describe is effectively a comment or note on what the tests are doing. Each it is a test case and reads aloud in a way that describes the test, e.g. “it should assemble…”.

I’m already in a Node.js / npm environment in my project, so I’m planning to stick with a TypeScript/JavaScript testing framework. Alternatively, I could accomplish something similar using Python, Java, or Ruby.

It turns out that there are lots of TypeScript/JavaScript testing frameworks, but Jest seems to work great for what I need (I love snapshots). I’ve tried Mocha in the past, but arrow functions are discouraged there (they lose access to the Mocha context), and Jest seems a bit faster.

Dependencies for Jest

Jest has a “getting started” doc with a section for TypeScript, which essentially points you to ts-jest’s own “getting started”. I followed ts-jest’s table under “using npm”, which had me add four new packages to my project:

  • typescript → the TypeScript compiler (or transpiler)
  • jest → the core testing framework
  • @types/jest → TypeScript type definitions (Jest is written in pure JavaScript)
  • ts-jest → lets me write and run tests in TypeScript (not JavaScript) by invoking the TypeScript compiler behind the scenes

Lastly, running npx ts-jest config:init creates a jest.config.js file, which configures Jest to use ts-jest. (npx is a tool bundled with Node.js that executes a locally installed npm module from node_modules instead of a globally installed, system-wide version.)

Test Setup

Now I can begin to write tests. To keep things as simple as possible, my plan is to construct a test batch job that will repeatedly execute my test program with different inputs. I’ll create some template JCL:

Test JCL with handlebars templates
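The templating itself is plain handlebars. Conceptually, something along these lines (the template text, DD names, and function name are made up for illustration):

```ts
// renderJcl.ts: illustrative only; the real template lives in the project
import * as handlebars from "handlebars";

// A fragment of templated JCL: one EXEC step is stamped out per test input
const jclTemplate = `//AMSTEST  JOB CLASS=A,MSGCLASS=H
{{#each inputs}}
//STEP{{@index}} EXEC PGM=AMSTEST
//INPUT    DD *
{{this}}
/*
{{/each}}`;

export function renderJcl(inputs: string[]): string {
  // Compile once, then render with the test inputs
  const template = handlebars.compile(jclTemplate);
  return template({ inputs });
}
```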

Then I used Jest’s beforeAll to render and submit this JCL.

beforeAll generates JCL using npm commands and submits /downloads output with Zowe CLI

Line 8 above is an array of inputs to my test program. For each entry I add to this array, the JCL is constructed with another EXEC PGM=AMSTEST step, and the corresponding array data is fed to the program as input.

In other words, to test with different input data (longer values, shorter values, special characters, etc…), I just add entries to my test array.
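Putting that together, a beforeAll along these lines renders the JCL for the current inputs, submits it with Zowe CLI, and captures the job output for the tests to assert against (the helper, file names, and timeout are assumptions for this sketch):

```ts
// amstest.test.ts: setup sketch; helper and file names are hypothetical
import { execSync } from "child_process";
import { writeFileSync } from "fs";
import { renderJcl } from "./renderJcl";

// Inputs fed to AMSTEST; add an entry here to get another EXEC PGM=AMSTEST step
const testInputs = [
  "HELLO WORLD",
  "A LONGER RECORD WITH SPECIAL CHARACTERS #@$",
];

let jobOutput: string;

beforeAll(() => {
  // Render the templated JCL and write it to a local temporary file
  writeFileSync("amstest.jcl", renderJcl(testInputs));

  // Submit the job, wait for it, and capture all spool output for the tests
  jobOutput = execSync(
    "zowe zos-jobs submit local-file amstest.jcl --view-all-spool-content"
  ).toString();
}, 60000); // allow extra time for the batch job to complete
```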

Writing Tests

With setup done, I can write tests. These are a little contrived, but the goal is to convey how this approach would work (not to write perfect tests 😏):

Line 2 and line 9 are the two tests. Line 2 asserts that the original test array input exactly matches the output of the test program.

For the test on line 9, I updated the C test driver to SNAP a control block and assert that the control block doesn’t change unexpectedly.
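In spirit, the two tests look something like this (the job-output parsing helpers are hypothetical and simplified):

```ts
// amstest.test.ts: test sketch; extractOutputRecords/extractSnapDump are made up
it("should write each input record to the output data set unchanged", () => {
  // Pull the records AMSTEST wrote to its output DCB out of the job output
  const written = extractOutputRecords(jobOutput);
  expect(written).toEqual(testInputs);
});

it("should not change the snapped control block unexpectedly", () => {
  // The C test driver SNAPs a control block; a snapshot catches unexpected changes
  const snap = extractSnapDump(jobOutput);
  expect(snap).toMatchSnapshot();
});
```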

Running the jest CLI would kick off the tests, but I wrap this behind a package.json script, $ npm run test:

Command line response from my jest run

All tests pass.

Test Failure Example

I can simulate a test failure by forcing a control block overlay, and with a basic Jest HTML reporter, I can immediately see the failure and where the unexpected change happened. Here’s how a test failure is displayed:

A code change to force a simulated control block overlay causes tests to fail (and a diff rendered for where the failure was)

Summary

For production usage, I’d want to add more tests and provide for more ways to exercise the I/O functions. For example, I might want to accept recfm, blksize, and lrecl as input parms to the test program and quickly try against different combinations of those values.
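Jest’s test.each would be a natural fit for that kind of matrix. Sketching the idea (the attribute combinations and the runAmsTest helper are hypothetical):

```ts
// Hypothetical parameterization of data set attributes for future tests
const combos: Array<[string, number, number]> = [
  // [recfm, lrecl, blksize]
  ["FB", 80, 8000],
  ["FB", 133, 13300],
  ["VB", 255, 27998],
];

test.each(combos)(
  "should copy records with RECFM=%s, LRECL=%i, BLKSIZE=%i",
  (recfm, lrecl, blksize) => {
    // A future test driver could accept these as parms and echo its results
    const output = runAmsTest({ recfm, lrecl, blksize }); // hypothetical helper
    expect(output).toMatchSnapshot();
  }
);
```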

In the end, it’s an example Metal C project that uses a JavaScript-based testing framework (Jest, running off z/OS) to test metal C code that must run on z/OS.

The full project is here.
