The Journey from JCL to Python: so easy even an old mainframer can do it.

Frank J. De Gilio · Published in Theropod · Aug 17, 2021
Grinding out new code

🖥 🖥 🖥

I had been thinking about replacing JCL with some Python code. I knew it was possible, but I wanted to figure out how to do it using the Z Open Automation Utilities (ZOAU version 1.1.0). I also wanted to write it so the code could serve as a template for anyone who wants to make the change themselves. What follows is code I have written, vetted by several Python experts (by several I mean too many to mention), following an iterative approach that grows in complexity.

JCL vs Python: A cultural difference

I took a simple JCL file (14 cards/lines long) and created the Python equivalent. Turns out the Python version of the same capability was almost 200 lines long! You are probably thinking, “This obviously demonstrates the simplicity of JCL over Python,” but the answer is a bit more complex than that.

The simplicity of JCL is based on the fact that JCL is just code that we manage. The code itself is not complex, but it can be intricate. Think about it: when was the last time you created JCL from scratch? Most of us bring up a JCL member, modify a couple of items, and submit the job. If you make a mistake, you go over to the System Display and Search Facility (SDSF) and look at the output, figure out the error, go back to the editor, change the code, and submit again.

The Python code is based on a totally different model. Reviewing the output, figuring out the error, and modifying the JCL are unnecessary in Python because the user never touches the code. Instead, the code verifies the data supplied to it and then performs the functions. In JCL, all verification is done by the program itself. While we could simplify our Python to similarly let the program do all the checking, it is faster, simpler, and more user focused to validate the input before calling the program. There is quite a bit of precedent for such a model; we do it all the time on the web. Instead of filling in a web form, clicking submit, and waiting for the server to come back with an error, client-side input verifiers ensure that what gets sent is more likely to work. With this model, users only need to provide configuration files to the script (also known as a module) and parameters via the command line, and the Python code does everything else. This provides a much more streamlined approach at each execution.
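To make the idea concrete, here is a minimal, hypothetical sketch of that kind of pre-validation. The zone-name rule and the function are my own illustration, not part of the module built later in this article:

import re
import sys

# Assumed rule of thumb for SMP/E zone names: 1-7 characters, first character
# alphabetic or national (@ # $), the rest alphanumeric or national.
VALID_ZONE = re.compile(r"^[A-Z@#$][A-Z0-9@#$]{0,6}$")

def validate_zone(zone):
    # Reject obviously bad input before any MVS program is ever invoked,
    # so the user gets an immediate, readable message instead of job output.
    zone = zone.strip().upper()
    if not VALID_ZONE.match(zone):
        sys.exit(f"'{zone}' does not look like a valid SMP/E zone name.")
    return zone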

This difference combined with the fact that Python executes synchronously (as opposed to an asynchronous JES environment) fuels a completely different approach to using z/OS. This model makes it easier for z/OS to play with other systems in DevOps pipelines and Cloud deployments while obviating the need to jump through multiple screens to check for errors.

Going from JCL to Python: The First Step

People tend to know either JCL or Python and are leery of going from one to the other because of the unknowns involved. I am hardly an expert in either, so imagine the amount of trepidation I had taking on such a task. Let me take you through the process I went through when making the transition. I must admit that many people coached me on Pythonic style, so this Python code has been vetted by experts to make it as high quality as possible. In addition to the IBM Open Enterprise SDK for Python, this code relies on the Z Open Automation Utilities (ZOAU).

The first step is to create Python that handles the base execution of the JCL. Let’s use a simple SMPE use case. Here is some JCL to list the contents of a zone in SMPE:

//SMPLIST  JOB ,,MSGLEVEL=1,MSGCLASS=H,CLASS=A,REGION=0M,
// NOTIFY=&SYSUID
//*******************************************************
//SMPLST EXEC PGM=GIMSMP
//SMPCSI DD DSN=AS4SMP.GLOBAL.CSI,DISP=SHR
//SMPLOG DD DUMMY
//SMPLOGA DD DUMMY
//SMPWRK6 DD DSN=&&TEMP,
// SPACE=(CYL,(2500,100,500)),VOL=SER=USRAT5,UNIT=3390,
// DCB=(DSORG=PO,RECFM=FB,LRECL=80,BLKSIZE=3200),
// DISP=(NEW,DELETE)
//SMPCNTL DD *
SET BDY(GLOBAL).
LIST.

An SMPE expert would bring up this JCL and modify it to change the zone they wished to list, and might add parameters to the LIST statement to return only the output they were looking for. Let’s start to turn this JCL into Python. The first step is to specify (import, in Python parlance) the modules used in our Python script.

import sys
import os
from zoautil_py import mvscmd, datasets
from zoautil_py.types import DDStatement, DatasetDefinition, FileDefinition

These imports make capabilities available to our script. The first two are built-in Python modules for connecting our script to the system. The other two give our script access to specific methods and data types in the ZOAU utilities.

Now we define the data definition (DD) statements in Python. Each DD statement is defined using ZOAU classes and collected into a list. SMPLOG and SMPLOGA are simple and would look like this:

dd_list = []
dd_list.append(DDStatement("SMPLOG", "DUMMY"))
dd_list.append(DDStatement("SMPLOGA", "DUMMY"))

We start by creating a list of DD statements and fill that list in with DD cards. We create DD cards using the ZOAU DDStatement class. These two statements are simple since they are not really connecting to anything. The next step is to set up the Consolidated Software Inventory (CSI). We use the ZOAU DatasetDefinition class to describe our global CSI data set (AS4SMP.GLOBAL.CSI); connecting that DatasetDefinition to the DD name SMPCSI creates a DDStatement for the CSI.

dd_list.append(DDStatement("SMPCSI", DatasetDefinition("AS4SMP.GLOBAL.CSI")))

The next DD card is a bit different. In JCL we can identify a temporary dataset and expect JES to create it and delete it when we are finished with it. Our Python program is going to have to manage that instead. Here are a few statements covering this:

temp_dataset_name = datasets.tmp_name("ASLAN")
datasets.create(temp_dataset_name, type="PDS",
                primary_space="5M", secondary_space="5M", block_size=3200,
                record_format="FB", record_length=80, volumes="USRAT5",
                directory_blocks=10)
dd_list.append(DDStatement("SMPWRK6",
                           DatasetDefinition(temp_dataset_name)))

The first statement above uses the ZOAU function datasets.tmp_name to create a temporary dataset name. It uses the high-level qualifier of my user id (ASLAN). The second statement creates a dataset with the temp_dataset_name generated on the previous line. Finally, the last statement connects that newly created dataset (temp_dataset_name) to the DD name SMPWRK6 and adds it to the DD list.

Now things get a bit tricky. The input for GIMSMP comes from the SMPCNTL DD. In the JCL we have it inline. To do something similar in Python we will create a temporary file in the UNIX System Services (USS) environment that we can get rid of later. We must make sure that the content is in EBCDIC, since GIMSMP is going to expect EBCDIC input cards in SMPCNTL.

We start by defining the file name (input_file_name). We open the file for write, ensuring that the data is encoded in EBCDIC (code page 1047), and write two records ended by newline characters. This part can be tricky and depends on how the program expects its input to look. Since the input mimics card input, some programs expect 80-byte records (remember, nothing in column 72). This program is not that picky, so we are good.

input_file_name = "./SMPCNTL.input"
with open(input_file_name, mode="w", encoding="cp1047") as file:
    file.write("SET BDY(GLOBAL).\n")
    file.write("LIST.\n")
dd_list.append(DDStatement("SMPCNTL",
                           FileDefinition(input_file_name)))
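If you ever do hit one of those pickier programs that insists on fixed 80-byte card images, a small variation on the write loop pads each statement out to 80 characters before the newline. This is only a sketch; GIMSMP does not require it:

control_statements = ["SET BDY(GLOBAL).", "LIST."]
with open(input_file_name, mode="w", encoding="cp1047") as file:
    for statement in control_statements:
        # Left-justify each statement in an 80-character field, card style.
        file.write(f"{statement:<80}\n")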

The next challenge arises, once again, because we are not relying on JES to handle the job. If we want to see the output, we must create a dataset to hold it. We didn’t have to do this in JCL because JES handles it for us. Most of this is like the temporary data set we created: we define a temporary name, create a dataset, create a DD statement, and append it to our DD list.

output_dataset_name = datasets.tmp_name("ASLAN")
datasets.create(output_dataset_name, type="SEQ",
                primary_space="5M", secondary_space="5M",
                volumes="USRAT5")
dd_list.append(DDStatement("SMPLIST",
                           DatasetDefinition(output_dataset_name)))

We had to do some research to find out where GIMSMP puts its output. That information can be found in generic samples in SYS1.SAMPLIB or wherever your installation puts samples. Of course, the user doesn’t know what our generated output dataset name is, so it would be nice to tell them where the output will show up. We did this with a print statement:

print(f"Output will be in: {output_dataset_name}\n")

Since GIMSMP is an authorized program, we need to use the ZOAU function mvscmd.execute_authorized. Otherwise we would just use mvscmd.execute. At this point we have defined all the DDs and need to execute GIMSMP, so next we have an execute command.

command_return = mvscmd.execute_authorized(pgm="GIMSMP", dds=dd_list)

At this point we have enough code to match the JCL. That is roughly 16 lines of Python vs 14 cards/lines in JCL, which is pretty close. Of course, we should deal with the return code information. The return information from mvscmd.execute and mvscmd.execute_authorized comes in the form of an object that needs to be unpacked. We break the object returned from the command apart into a Python dictionary. Then we can deal with the information from the call. Once the dictionary is created, we can pull out the return code and determine if everything worked.

command_return_dictionary = command_return.to_dict()
if command_return_dictionary["rc"] > 0:
    sys.stderr.write(f"Return code: {command_return_dictionary['rc']}\n")

Don’t forget we have to erase the temporary dataset and input file:

datasets.delete(temp_dataset_name)
os.remove(input_file_name)

If you create a Python file with these statements, you have now recreated in Python what happens in JCL. Of course, this really doesn’t represent the cultural change discussed earlier; it merely recreates the JCL in Python. Let’s revisit the code, now focusing on removing the need to modify the code to perform different SMPE list functions.

Creating a more flexible smpe_list.py executable

Let’s create a function that is focused on doing what the JCL does. This allows us to build a level of abstraction. Once abstracted, we can add more programmatic support for users. First, we start with a routine that performs the List capability:

def smpe_list():
    dd_list = []
    dd_list.append(DDStatement("SMPLOG", "DUMMY"))
    dd_list.append(DDStatement("SMPLOGA", "DUMMY"))
    dd_list.append(DDStatement("SMPCSI",
                               DatasetDefinition("AS4SMP.GLOBAL.CSI")))
    temp_dataset_name = datasets.tmp_name("ASLAN")
    datasets.create(temp_dataset_name, type="PDS",
                    primary_space="5M", secondary_space="5M",
                    block_size=3200, record_format="FB",
                    record_length=80, volumes="USRAT5",
                    directory_blocks=10)
    dd_list.append(DDStatement("SMPWRK6",
                               DatasetDefinition(temp_dataset_name)))
    input_file_name = "./SMPCNTL.input"
    with open(input_file_name, mode="w", encoding="cp1047") as file:
        file.write("SET BDY(GLOBAL).\n")
        file.write("LIST.\n")
    dd_list.append(DDStatement("SMPCNTL",
                               FileDefinition(input_file_name)))
    output_dataset_name = datasets.tmp_name("ASLAN")
    datasets.create(output_dataset_name, type="SEQ",
                    primary_space="5M", secondary_space="5M",
                    volumes="USRAT5")
    dd_list.append(DDStatement("SMPLIST",
                               DatasetDefinition(output_dataset_name)))

    print(f"Output will be in: {output_dataset_name}\n")
    command_return = mvscmd.execute_authorized(pgm="GIMSMP", dds=dd_list)
    command_return_dictionary = command_return.to_dict()
    if command_return_dictionary["rc"] > 0:
        sys.stderr.write(f"Return code: {command_return_dictionary['rc']}\n")
    datasets.delete(temp_dataset_name)
    os.remove(input_file_name)
    return command_return

Before we go much further, let’s try to be a bit more Pythonic in our style, starting with a try-except-finally block. This makes the flow more straightforward: if everything works correctly, we don’t have to worry about the error path, and by putting the cleanup in a finally block it happens even if the code raises an error. The new version looks like this:

def smpe_list():
    dd_list = []
    temp_dataset_name = None
    input_file_name = None
    command_return = None
    try:
        dd_list.append(DDStatement("SMPLOG", "DUMMY"))
        dd_list.append(DDStatement("SMPLOGA", "DUMMY"))
        dd_list.append(DDStatement("SMPCSI",
                                   DatasetDefinition("AS4SMP.GLOBAL.CSI")))
        temp_dataset_name = datasets.tmp_name("ASLAN")
        datasets.create(temp_dataset_name, type="PDS",
                        primary_space="5M", secondary_space="5M",
                        block_size=3200, record_format="FB",
                        record_length=80, volumes="USRAT5",
                        directory_blocks=10)
        dd_list.append(DDStatement("SMPWRK6",
                                   DatasetDefinition(temp_dataset_name)))
        input_file_name = "./SMPCNTL.input"
        with open(input_file_name, mode="w", encoding="cp1047") as file:
            file.write("SET BDY(GLOBAL).\n")
            file.write("LIST.\n")
        dd_list.append(DDStatement("SMPCNTL",
                                   FileDefinition(input_file_name)))
        output_dataset_name = datasets.tmp_name("ASLAN")
        datasets.create(output_dataset_name, type="SEQ",
                        primary_space="5M", secondary_space="5M",
                        volumes="USRAT5")
        dd_list.append(DDStatement("SMPLIST",
                                   DatasetDefinition(output_dataset_name)))

        command_return = mvscmd.execute_authorized(pgm="GIMSMP", dds=dd_list)
    except Exception as e:
        sys.stderr.write("Error processing the command environment.\n")
        sys.stderr.write(f"Exception information: {e}\n")
    finally:
        if temp_dataset_name:
            datasets.delete(temp_dataset_name)
        if input_file_name is not None:
            os.remove(input_file_name)
    # Only unpack the response if the command actually ran.
    if command_return:
        command_return_dictionary = command_return.to_dict()
        if command_return_dictionary["rc"] > 0:
            sys.stderr.write(f"Return code: {command_return_dictionary['rc']}\n")
        print(f"Output will be in: {output_dataset_name}\n")
    return command_return

You will note that in addition to adding the try-except-finally block there were some other changes. The script now makes sure that the temporary dataset exists before deleting it. Similarly, it checks to see if the input file exists before deleting it. We also moved the user message to the end of the routine. It is now getting closer to the final product, but it needs a way to be invoked. Time to add a way to call the routine:

def main():
    result = smpe_list().to_dict()
    if result["rc"] > 0:
        sys.stderr.write(f"Error: Return Code: {result['rc']}\n")
        if result["stderr_response"]:
            sys.stderr.write(f"{result['stderr_response']}\n")
        sys.stderr.write(f"Message from the system:\n {result}\n")

if __name__ == "__main__":
    main()

We created a main function that calls our routine. It handles the case where the command itself executed without a problem but still returned an error. We also tell Python to run that main function only when this module is run by itself. That way the file can be imported and used by a different program.
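Because of that guard, another script can import the routine and call it directly. Assuming the file is saved as smpe_list.py (the name used for the sample on GitHub), a hypothetical caller might look like this:

# Reuse the list routine from another script instead of running it from the shell.
from smpe_list import smpe_list

result = smpe_list().to_dict()
if result["rc"] == 0:
    print("SMP/E LIST completed successfully")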

Eliminating hard coded parameters

As we move from modifying code in place (JCL) to providing static code with appropriate inputs (Python), we need to distinguish between two different types of input to the module. There is a set of global inputs that will generally be consistent across invocations, and there are options that can differ with each invocation and are more likely focused on the specific request. The best solution is to create a base set of global inputs that can be overridden by per-invocation options. In this article we focus on examples of providing parameters and input files, but not the override itself; that is a straightforward addition the reader can make (a minimal sketch follows below).
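The override can be as simple as layering one dictionary over another. Here is a minimal sketch under that assumption; the names are illustrative, not taken from the sample code:

def merge_options(defaults, overrides):
    # Start from the global configuration, then let any per-invocation option
    # that was actually supplied (not None) replace the configured value.
    merged = dict(defaults)
    merged.update({key: value for key, value in overrides.items() if value is not None})
    return merged

# Example: a command-line volume overrides the configured one; the HLQ keeps its default.
effective = merge_options({"volume": "USRAT5", "hlq": "SYS1"},
                          {"volume": "USRAT6", "hlq": None})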

Let’s begin by focusing on providing parameters to our routines. Our previous invocation of the list assumed that we would only be interested in listing the global zone. It also assumed we wanted a full list. Finally, it had an arbitrary hard-coded high-level qualifier (HLQ) for our temporary and output datasets. Let’s add three parameters (zone, options, and HLQ) to our list routine:

def smpe_list(target_zone="GLOBAL", list_options=None,
              high_level_qualifier="SYS1"):
    dd_list = []
    temp_dataset_name = None
    input_file_name = None
    command_return = None
    try:
        dd_list.append(DDStatement("SMPLOG", "DUMMY"))
        dd_list.append(DDStatement("SMPLOGA", "DUMMY"))
        dd_list.append(DDStatement("SMPCSI",
                                   DatasetDefinition("AS4SMP.GLOBAL.CSI")))
        temp_dataset_name = datasets.tmp_name(high_level_qualifier)
        datasets.create(temp_dataset_name, type="PDS",
                        primary_space="5M", secondary_space="5M",
                        block_size=3200, record_format="FB",
                        record_length=80, volumes="USRAT5",
                        directory_blocks=10)
        dd_list.append(DDStatement("SMPWRK6",
                                   DatasetDefinition(temp_dataset_name)))
        input_file_name = "./SMPCNTL.input"
        with open(input_file_name, mode="w", encoding="cp1047") as file:
            file.write(f"SET BDY({target_zone}).\n")
            if list_options is None:
                file.write("LIST.\n")
            else:
                file.write(f"LIST {list_options}.\n")
        dd_list.append(DDStatement("SMPCNTL",
                                   FileDefinition(input_file_name)))
        output_dataset_name = datasets.tmp_name(high_level_qualifier)
        datasets.create(output_dataset_name, type="SEQ",
                        primary_space="5M", secondary_space="5M",
                        volumes="USRAT5")
        dd_list.append(DDStatement("SMPLIST",
                                   DatasetDefinition(output_dataset_name)))

        command_return = mvscmd.execute_authorized(pgm="GIMSMP", dds=dd_list)
    except Exception as e:
        sys.stderr.write("Error processing the command environment.\n")
        sys.stderr.write(f"Exception information: {e}\n")
    finally:
        if temp_dataset_name:
            datasets.delete(temp_dataset_name)
        if input_file_name is not None:
            os.remove(input_file_name)
    # Only unpack the response if the command actually ran.
    if command_return:
        command_return_dictionary = command_return.to_dict()
        if command_return_dictionary["rc"] > 0:
            sys.stderr.write(f"Return code: {command_return_dictionary['rc']}\n")
        print(f"Output will be in: {output_dataset_name}\n")
    return command_return

Our calling routine (main) must change to support calling the routine with arguments. Rather than having to code argument handling and documentation ourselves, we can take advantage of a Python package that already does this: argparse, which ships with the Python standard library. We include the argparse package at the top of our file:

import argparse

Now all we need to do is define a parse_args routine that allows us to define the arguments we expect, simple documentation associated with those args, and the parsing of input passed to the module.

def parse_args(argv=None):
    program_name = os.path.basename(sys.argv[0])
    if argv is None:
        argv = sys.argv[1:]

    try:
        parser = argparse.ArgumentParser(program_name)
        parser.add_argument("hlq",
                            help="The High Level Qualifier to be used.")
        parser.add_argument("-z", "--zone", default="GLOBAL",
                            help="The target zone to be queried.")
        parser.add_argument("-o", "--options", default=None,
                            help="Any list options to be added")

        opts = parser.parse_args(argv)
        return opts
    except Exception as e:
        indent = len(program_name) * " "
        sys.stderr.write(program_name + ": " + repr(e) + "\n")
        sys.stderr.write(indent + " for help use --help")
        sys.exit(1)

Since we have defined a way to handle arguments, we can update our main routine to deal with arguments and call our smpe_list routine:

def main():
    args = parse_args()
    result = smpe_list(args.zone, args.options, args.hlq).to_dict()

    if result["rc"] > 0:
        sys.stderr.write(f"Return Code: {result['rc']}\n")
        if result["stderr_response"]:
            sys.stderr.write(f"{result['stderr_response']}\n")
        sys.stderr.write(f"Message from the system:\n{result}\n")

One more step to completion

Our module is really starting to come together. We now have a routine that allows users to define what zone they want to list, any options to the list function they want, and the high-level qualifier for their temporary and output datasets. This lets us keep the code standard and provide invocation-specific parameters.

Of course, there is still some hard-coded data in our module, but that data will probably be consistent across invocations. This minimizes the amount of change the module needs, but it still requires the code to be modified from time to time to update those slowly changing values. We could parameterize those as well, but then the caller would need to know all of that information every time they invoke the module. Most of that extraneous information needs to be defined once and really isn’t information the caller needs to remember.

It makes more sense to keep that information modifiable but not part of the caller’s normal experience. That information can be stored outside of the execution of the module and defined in a configuration file that can be set at installation time and modified from time to time as needed without ever touching the program.

A common way to store “configuration information” is in a YAML file. This format allows us to create a fairly simple way to define the defaults we want our module to use. The neat thing about YAML is that it is easily readable by both humans and computers. This is what our YAML file would look like:

SMPECSI:
  dataset: AS4SMP.GLOBAL.CSI
TEMP_DATASET:
  primary_space: 5M
  secondary_space: 5M
  volume: USRAT5
OUTPUT_DATASET:
  primary_space: 5M
  secondary_space: 5M
  volume: USRAT5
SMPECNTL:
  filename: /tmp/smpcntl.input

You can easily see the format of the file. We have a high-level definition for each set of related material: definitions for our SMPECSI, the SMPCNTL input file, the temporary dataset, and the output dataset. Our first step is to use pip to install PyYAML, the package that provides the yaml module. Once that is done, we add an import for YAML support to the top of our module:

import yaml

Now we need a routine that reads our YAML file and creates the defaults. This routine not only loads the YAML data into a dictionary of defaults, it also checks to make sure that the required data is in the file. The value of the imported yaml module is that very little code is needed to turn what is in a file into Python structures our module can use. This makes our routine pretty simple:

def get_defaults(filename):
    required_keys = [("SMPECSI", "dataset"),
                     ("TEMP_DATASET", "primary_space"),
                     ("TEMP_DATASET", "secondary_space"),
                     ("TEMP_DATASET", "volume"),
                     ("SMPECNTL", "filename"),
                     ("OUTPUT_DATASET", "primary_space"),
                     ("OUTPUT_DATASET", "secondary_space"),
                     ("OUTPUT_DATASET", "volume")]
    with open(filename) as file:
        defaults = yaml.load(file, Loader=yaml.FullLoader)
    for dataset, key in required_keys:
        # Treat a missing section the same as a missing key within a section.
        if dataset not in defaults or key not in defaults[dataset]:
            sys.exit(f"Yaml file missing {dataset}:{key}\n")
    return defaults
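With the YAML shown earlier saved as SMPElistDefaults.yaml (the file name the final version of smpe_list reads), loading and using the defaults looks something like this:

# Load the configuration once and pull values out as ordinary nested dictionaries.
defaults = get_defaults("./SMPElistDefaults.yaml")
print(defaults["SMPECSI"]["dataset"])      # AS4SMP.GLOBAL.CSI
print(defaults["TEMP_DATASET"]["volume"])  # USRAT5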

Now we make the last set of changes to our code to take advantage of the defaults retrieved in the get_defaults routine:

def smpe_list(target_zone="GLOBAL", list_options=None,
              high_level_qualifier="SYS1"):
    dd_list = []
    temp_dataset_name = None
    input_file_name = None
    command_return = None
    defaults = get_defaults("./SMPElistDefaults.yaml")
    try:
        dd_list.append(DDStatement("SMPLOG", "DUMMY"))
        dd_list.append(DDStatement("SMPLOGA", "DUMMY"))
        dd_list.append(DDStatement("SMPCSI",
                                   DatasetDefinition(defaults["SMPECSI"]["dataset"])))

        temp_dataset_name = datasets.tmp_name(high_level_qualifier)
        datasets.create(temp_dataset_name, type="PDS",
                        primary_space=defaults["TEMP_DATASET"]["primary_space"].strip(),
                        secondary_space=defaults["TEMP_DATASET"]["secondary_space"].strip(),
                        block_size=3200, record_format="FB",
                        record_length=80,
                        volumes=defaults["TEMP_DATASET"]["volume"].strip(),
                        directory_blocks=10)

        dd_list.append(DDStatement("SMPWRK6",
                                   DatasetDefinition(temp_dataset_name)))
        input_file_name = defaults["SMPECNTL"]["filename"]
        with open(input_file_name, mode="w", encoding="cp1047") as file:
            file.write(f"SET BDY({target_zone}).\n")
            if list_options is None:
                file.write("LIST.\n")
            else:
                file.write(f"LIST {list_options}.\n")
        dd_list.append(DDStatement("SMPCNTL",
                                   FileDefinition(input_file_name)))
        output_dataset_name = datasets.tmp_name(high_level_qualifier)
        datasets.create(output_dataset_name, type="SEQ",
                        primary_space=defaults["OUTPUT_DATASET"]["primary_space"].strip(),
                        secondary_space=defaults["OUTPUT_DATASET"]["secondary_space"].strip(),
                        volumes=defaults["OUTPUT_DATASET"]["volume"].strip())

        dd_list.append(DDStatement("SMPLIST",
                                   DatasetDefinition(output_dataset_name)))

        command_return = mvscmd.execute_authorized(pgm="GIMSMP", dds=dd_list)
    except Exception as e:
        sys.stderr.write("Error processing the command environment.\n")
        sys.stderr.write(f"Exception information: {e}\n")
    finally:
        if temp_dataset_name:
            datasets.delete(temp_dataset_name)
        if input_file_name is not None:
            os.remove(input_file_name)
    # Only unpack the response if the command actually ran.
    if command_return:
        command_return_dictionary = command_return.to_dict()
        if command_return_dictionary["rc"] > 0:
            sys.stderr.write(f"Return code: {command_return_dictionary['rc']}\n")
        print(f"Output will be in: {output_dataset_name}\n")
    return command_return

Now we have a reasonable replacement for our 14-line JCL. Our new Python version obviously uses more lines than the JCL. Those lines separate the code from the input and data needed to run it effectively. Additionally, the code checks input so the system doesn’t have to. Finally, it protects users from the typos that creep in when JCL is modified prior to submission. This not only makes execution less error prone, it also creates a separation of concerns, with defaults set at configuration time and another set of options defined at execution time.

There are more changes one can make to the sample code to make it even cleaner. The addition of an environment variable can eliminate having a YAML file in the same directory as the code (see the sketch below). We could add optional overrides for default items the user might want to change on the fly. Each of these changes makes the module more flexible and aligns it more closely with the way operations are managed in other environments.
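For example, the environment-variable idea might look like the following sketch; SMPE_DEFAULTS is an invented variable name used only for illustration:

import os

# Let an environment variable point at the configuration file, falling back to
# the current directory when the variable is not set.
defaults_path = os.environ.get("SMPE_DEFAULTS", "./SMPElistDefaults.yaml")
defaults = get_defaults(defaults_path)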

If you need to replace JCL with a Python module, this code can be the base template for the conversion. Rather than cutting and pasting from this article, you can find the code in its entirety here: https://github.com/IBM/zoau/blob/main/samples/smpe_list.py

Imagine a world where your z/OS environment is managed by scripts instead of JCL. With the current capabilities and your ability to convert JCL into Python, that world is within your grasp.
