Mainframe on the MacBook

Marianne Bellotti
The Technical Archaeologist
4 min read · May 27, 2018


The first challenge when trying to teach yourself COBOL is figuring out how to install a language designed for mainframes on a modern machine. It couldn’t possibly be as easy as brew install cobol, could it?

Turns out… yes, you can do something like that. Despite the fact that virtually no one learns COBOL anymore, a small group of programmers have been maintaining a suite of tools for COBOL on modern machines. GnuCOBOL (formerly OpenCOBOL) put out its last stable release five months ago. For Mac users like me, it can be installed painlessly with the command brew install gnu-cobol.
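For the curious, here’s a minimal sketch of the whole loop on macOS, assuming Homebrew and GnuCOBOL’s cobc compiler (the file name hello.cob and the little program itself are just made up for illustration):

brew install gnu-cobol

Save a classic hello world as hello.cob, keeping the old fixed-format column rules (division headers start in column 8, statements in column 12):

       IDENTIFICATION DIVISION.
       PROGRAM-ID. HELLO.
       PROCEDURE DIVISION.
           DISPLAY "HELLO FROM 1959".
           STOP RUN.

Then build and run a standalone executable:

cobc -x hello.cob
./hello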

But to be fair, this doesn’t really install COBOL on your computer. GnuCOBOL is a transpiler: it parses COBOL and converts it to C before compiling it. This feels a bit like cheating. I find myself pulled into philosophical debates about “what does it even mean to install a language in the first place?” COBOL was designed for interoperability; that was the whole point. So putting it on a computer via transpiler feels anticlimactic.
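If you want to watch the cheating happen, cobc will stop after the COBOL-to-C step when asked. Assuming the hello.cob file from the sketch above:

cobc -C hello.cob

That leaves the generated C source (hello.c, plus a couple of generated headers) sitting in the working directory, and that C is what gets handed to the system’s C compiler to produce the final binary.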

Many people erroneously believe that Grace Hopper invented COBOL as the first English-based, human-readable programming language. In truth, COBOL was designed by a government committee, cribbing heavily from Hopper’s work on Sperry Rand’s FLOW-MATIC, to the point where one might argue that 80% to 90% of COBOL’s original syntax and structure were taken directly from Hopper’s FLOW-MATIC. There were other languages that COBOL drew from (mainly the Air Force’s AIMACO and IBM’s COMTRAN), but they were not operational at the time.

Like most things in government, the appeal of COBOL was saving money. High-level programming languages were exploding. The same year COBOL’s first spec came out, six other programming languages were invented, including this little-appreciated thing called LISP. The year before, three new languages were invented. The year before that, five. While the nineties and early aughts saw similar periods of growth, these new languages of the fifties were tied to their hardware. Imagine trying to do business in an environment where 14 new operating systems are invented within a three-year period, each one costing thousands of dollars, and you have no way of knowing whether the one you chose would survive.

The government (and more specifically the military) was spending hundreds of millions of dollars writing programs for one computer system, then rewriting the same programs for a different computer system. At the DoD alone, expenses were climbing past $2 billion in today’s money. COBOL was seen as a solution to that: one language that all computers could understand. At the time this was pretty innovative. No one had seriously considered portability as an element of a language. Even LISP’s first implementation was tied closely to the IBM 704.

The irony of this situation is that the government now spends billions of dollars maintaining COBOL programs instead.

The internal politics that surrounded the development of COBOL are interesting in their own right and will be the topic of another post, but when it comes to the ambitious goal of portability, it was the government’s ability to spend millions and billions of dollars, rather than its desire to save them, that made it happen.

Once COBOL’s early drafts were done, the DoD simply told the computer industry that it would not buy any computer that did not support COBOL. While all the major manufacturers at the time (Burroughs, IBM, Honeywell Labs, RCA, Sperry Rand, and Sylvania) were working on their own high-level languages, none of them were willing to forfeit the millions of dollars that doing business with the DoD represented.

And once they all supported COBOL, COBOL then became the language that everyone else defaulted to as well.

For modern computers, though, supporting COBOL means compiling to some other high-level language first. Want to run COBOL on your Raspberry Pi? ElasticCOBOL will transpile it to Java for you. COBOLScript will convert it to JavaScript. Here’s a small project to move COBOL to Ruby.

Philosophical arguments aside, transpiling means that the hardest parts of the process are eliminated for the new COBOL developer. You do not have to wrestle with mainframe emulators, and you do not have to figure out which flavor of COBOL compiler works with which emulator.

If you want to do things the hard way, though, there are actually a couple of blog posts about turning Raspberry Pis into mainframes. Early mainframes, after all, had a fraction of the memory and processing power of devices now sold to hobbyists. The same software (the open source emulator Hercules) runs on Linux, so it could be used to turn any computer into a 1960s-style mainframe if so desired.
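As a rough sketch of that route on a Debian-flavored Linux (assuming the hercules package in the Debian repositories, and a config file name I made up; you still have to supply your own operating system image, typically the public domain MVS 3.8j):

sudo apt-get install hercules
hercules -f mvs38j.cnf

The configuration file describes the emulated CPUs, memory, and device addresses; getting an operating system booted inside the emulator is where the real work starts.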

And don’t think for a moment that entering the world of COBOL means giving up modern development tools. IBM has released a series of plugins for Eclipse. There’s a COBOL-specific IDE written in Python. Someone has even figured out how to scan COBOL code with Sonar and integrate it with Jenkins.

Some of which will probably come in handy when I set about my first COBOL coding challenge: math. Stay tuned!
