Last place in the English dictionary is nothing to sneer at, nor are the 25 points up for grabs in Scrabble by laying down the Z, both Ys, and a clutch of lesser tiles as “ZYMURGY”. It has a definition, too: it’s the science of fermentation, and despite its formidable Greek roots it’s a relatively recent invention.
“Zymurgy” came about as these things usually do. The French scientist Louis Pasteur was fermenting sugar into alcohol in his lab when it dawned on him that the process also produced new yeast cells. This realization was novel to the microbiology of the time, and from “zyme-” (leaven) and “-ourgia” (a working) came a new word to suit. One swift redefinition, and an air of scientific respectability (if not the faintest whiff of pretension) fell upon the practice formerly known as “brewing”.
Lacking words for a particular problem has historically been no problem at all. Sometimes we plunder Latin, Greek, or the local vernacular; often we simply make something up. Occasionally a common word will simply take on a new meaning (as a “breeder” at the Westminster Dog Show or the American Nuclear Society). But whatever the source, jargon supplies those “in the know” with both a proxy for expertise (“as you can see from my big words, I’m aware of the problem if not the solution”) and a nifty tool for elevating themselves above laypeople.
Of course, jargon isn’t limited to the sciences. Lawyers banter on in Latin (inter rusticos), while medical professionals prefer Greek (cyberchondria). In software development, many terms are borrowed from mathematics, and while some are fundamental (try explaining “function” without using the word) or inescapable (an integer is not a floating point number, to many a systems programmer’s chagrin), others we could do without.
Like one of my favorites: “code.”
Think about it. When we discuss the technologies that permeate our lives, we talk about software applications — apps — and the devices we use to run them. Apps are present, familiar, and relatable. We use them every day to connect with friends, read the news, order dinner, and watch cats fall off complete strangers’ countertops.
But a strange asymmetry emerges when we talk about how apps are created. Suddenly, the step-by-step instructions within a software application become code. In a clever linguistic flourish we detach a familiar end product (“our apps”) from an arcane production process (“their code”), draping unnecessary mystery over what ought to be a rather mundane process.
First, the mystery. The multitude of high-level languages and the wealth of online resources available to teach them ought to make software development an incredibly accessible enterprise. Language developers strive for simple, understandable syntax, in some cases even embracing visual metaphors over verbal language. Clarity is key: hardly some enigmatic “code.”
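To make that claim concrete, here is a trivial sketch of my own (hypothetical, not drawn from any real app): a few lines of Python that read almost like plain English.

```python
# A hypothetical example: high-level source reads nearly as prose.
def greet(name):
    """Return a friendly greeting for the given name."""
    return "Hello, " + name + "!"

print(greet("world"))  # prints "Hello, world!"
```

Nothing here needs decoding; the words on the screen say what the program does.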
“Code” isn’t helpful to coders, either. Software is ultimately translated into processor-specific code, but we developers do our best to stay as far from that code as possible. Those same developers understand that leaving clear, human-readable instructions for one’s team (if not one’s future self) is at least as important as what they’re telling a computer to do. But “code” isn’t human-readable, let alone clear, and while calling it “coding” (see also: “hacking”) doesn’t necessarily excuse incomprehensible drivel, it doesn’t exactly discourage it, either.
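The gap between the two is easy to see in Python, where the standard library’s `dis` module prints the lower-level instructions a function is translated into: a form no one would mistake for readable prose.

```python
import dis

# A two-line function any reader can follow...
def add(a, b):
    return a + b

# ...and the stack-machine instructions it becomes. The exact opcodes
# vary by Python version, but none of them are written for human eyes.
dis.dis(add)
```

Running this prints the bytecode listing for `add`; that listing, not the two readable lines above it, is the nearest thing to actual “code” in the whole program.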
How would we talk if we forgot about code? We’d go back to being programmers, software developers, and computer scientists. We’d program, write software, and build apps. We’d connect our work with its end product, and maybe in the process make it that much more relatable. There would still be jargon after “code” is gone, but we’d no longer couch our roles in language meant to excuse and exclude.
We can do better than code. We can pull back the curtain dropped by our self-aggrandizing jargon and connect the banality of our daily routines — open the text file, write instructions in the most human-friendly language available, close the text file — with those familiar, delightful apps. We can dispel the mystique by sharing our work and encouraging others not just to use their apps but to maybe give software a try. And we can celebrate legibility over whatever opaque cleverness (“professional negligence”, call it what you will) a coder may have hidden within.
It’s a bitter irony that an industry grown rich on “open” source suffers a miserable reputation on inclusion and diversity. But as software developers it’s up to us to help welcome everyone into the fold. To the million-odd steps that each of us can take to start making things better, let’s add one more.
And stop calling it “code.”
Originally published at rjzaworski.com.