You can’t reduce all economic decisions to a series of financial bets, but it’s a good way to clarify things. Sometimes.
Unfortunately, in the case of COBOL, a sufficiently thoughtful analysis points in two equal and opposite directions: we either need to pay people to learn it or put them in jail if they try.
COBOL is a notoriously bad language. It locks programmers into a bunch of annoying conventions; it was clearly designed by someone who had never spent hours retyping the same lengthy commands; and it shoehorns users into a pseudo-natural-language approach, even though computers excel at things that can be described in unnaturally formal terms.
What it has going for it is that it was used by some of the earliest companies that adopted computers. In other words, it’s used by banks. Banks have a pretty simple mandate: ensure that when account A gets debited and account B gets credited, the sum of the amounts in accounts A and B is unchanged. This makes them peculiarly sensitive to technical risk. If the software works, there’s a strong incentive not to change it.
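That mandate is simple enough to write down. Here's a minimal sketch of the invariant in Python (the account names and amounts are made up for illustration):

```python
from decimal import Decimal

def transfer(accounts, src, dst, amount):
    """Debit src, credit dst; the sum of the two balances must not change."""
    amount = Decimal(amount)
    if accounts[src] < amount:
        raise ValueError("insufficient funds")
    total_before = accounts[src] + accounts[dst]
    accounts[src] -= amount
    accounts[dst] += amount
    # The bank's one invariant: money is conserved.
    assert accounts[src] + accounts[dst] == total_before

accounts = {"A": Decimal("100.00"), "B": Decimal("50.00")}
transfer(accounts, "A", "B", "25.00")
print(accounts)  # A: 75.00, B: 75.00
```

Everything else a bank's software does is elaboration on that one assert statement, which is exactly why working software that already satisfies it is so hard to justify replacing.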
Banks are also, notoriously, insensitive to tail risk. If you build a business model based on rounding probabilities from 1% to 0%, eventually you end up with a portfolio of negatively-skewed bets. In this case, the bet is that the last COBOL bug will be squashed before the last COBOL programmer dies.
G. K. Chesterton and the Case for COBOL
I’m a fan of G. K. Chesterton. As I’ve argued before, his view of traditionalism is an underrated business concept. The basic Chesterton theory is that, if you find some tradition you can’t explain, that’s not a reason to get rid of it — that’s a reason to keep it, since whoever invented it had a reason, and the collective wisdom of everyone in history probably exceeds your own.
A Chestertonian would argue that a COBOL-based system is perfectly fine. If you pointed out that nobody is using COBOL today, the weak counterargument would be that perhaps there is ancient wisdom we don’t have access to. A better counter, though, is that you’re defining “the system” all wrong.
To a consumer, it’s easy to treat “the system” as synonymous with software. I pay for MS Office, for example, and as far as I’m concerned, the system is the software. But most software products are not used by millions of people who pay $10/month; they’re used by a few people, or a few dozen, and the cost per user is much higher.
As the cost per user gets higher, the opportunity cost of providing support goes down. If I needed an hour of support each month for Office, that would obliterate margins. At the other end of the spectrum, if there’s some bespoke inventory-management system that was built by one person and is used by ten people, the tradeoff between product quality and support leans towards support instead. An error affecting 1% of Excel users means a million or more helpdesk calls; an error affecting 1% of the bespoke product’s users probably won’t affect anyone at all.
But the economic effect of this is that customized products that require continuous maintenance are very hard to replace. You can reverse-engineer some of the things Office does, and build a clone or an interface to other products. But you can’t do that with a legacy COBOL product, because some sections of the spec are basically “Ask Alice about this, assuming she hasn’t changed her phone number since she retired last year.”
As the engineers age, you have two choices: replace the system, or hire successors. This sounds like a technical problem, because at one level it’s asking if we should use COBOL or Erlang. But it’s really a social problem: should we fire Bob? Should we start a project that will make it clear to Bob that his time at the company is limited, at the same time that he’s an essential input into our process? Bob, the Last COBOL Cowboy Standing, has a lot of leverage here. And, just statistically, someone who knows COBOL has probably been at the company longer than someone playing with spreadsheets to optimize EBITDA, and probably has more political capital.
So, every time someone runs the numbers on upgrading, it makes sense. Every time they think about the process, they decide to focus on something else. Of course, if this process is happening at other big companies, there’s more social pressure: COBOL is good enough for our competitors; practically a best practice! Why can’t it be good enough for us?
You, reader, might be the kind of person who objects to this thinking: who cares if it’s politically difficult, who cares if it’s contrarian? We know the right decision, and we know it’s risky to make the wrong decision — not in any given year, but eventually, catastrophically risky. If you feel this way, consider not working at a big bank.
The original Chesterton quote is about the virtues of traditionalism, but the folk-Chestertonian attitude is actually more stick-in-the-mud than GKC himself. Chesterton conceded that once you know why a tradition existed in the first place, you can get rid of it. But the institutional incentives of legacy software systems push back against even reasonable updates. To return to Chesterton — and the guy really does have a quote for every situation — updating ancient legacy software “has not been tried and found wanting; it has been found difficult and not tried.”
Hence the case for subsidizing COBOL schools: the incentives for maintaining legacy software are opaque until you’ve been at a company for a while, so getting good at old technologies is the kind of thing 40-year-old middle-managers wish they’d done at 20.
Tontines and the Case Against COBOL
Regulation is hit-or-miss, but sometimes we get lucky. There’s a sort of securities law edge case that could force everyone with a legacy software system to upgrade to something more cutting-edge, internal politics be damned.
I’m talking, of course, about the possibility that learning COBOL is a tontine, and thus illegal.
A tontine, for those of you unfamiliar with the literature, is a financial instrument that works like this:
- A group of people pool money (or, in one case study, stolen art).
- When all of them but one have died, the survivor gets the money.
If a hundred people invest ten thousand dollars apiece in a tontine, it’s a nice retirement plan in the event that one of them lives an unusually long time. Once there are only a handful left, it’s the setup to a Coen brothers movie.
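The payoff structure of that hundred-person tontine can be sketched in a few lines. This is a toy simulation with an exponential-lifetime assumption, not an actuarial model:

```python
import random

def simulate_tontine(n=100, stake=10_000, mean_life=30, rng=None):
    """One run of a toy tontine: n members pool their stakes, and the
    pool pays out once everyone but the winner has died.

    Lifetimes are exponential -- a simplifying assumption, not actuarial.
    Returns (pool size, years until payout).
    """
    rng = rng or random.Random()
    pool = n * stake
    lifetimes = sorted(rng.expovariate(1 / mean_life) for _ in range(n))
    payout_at = lifetimes[-2]  # the second-to-last death triggers the payout
    return pool, payout_at

pool, years = simulate_tontine(rng=random.Random(0))
print(pool)  # 1000000
```

Note the shape of the bet: the winner's stake is locked up for decades, and the entire return arrives in one lump at the end. Keep that profile in mind for what follows.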
Naturally, there are legal restrictions on tontines; in many places they’re banned, and in the US they receive a fair amount of preemptive regulatory skepticism.
Think of the cash flow profile of someone who learns a legacy programming language: when there are lots of users, we’re close to economic perfect competition, and wages are determined by supply and demand. As people retire or die, the supply gets restricted, and the remaining suppliers can coordinate their efforts. In an extreme case, they can charge monopolistic prices. To companies that rely on these legacy systems, the market-clearing price is surprisingly high — remember, every swipe of a credit or debit card is likely to set off a chain of interactions with at least one link written in COBOL, so the entire financial system depends on this stuff continuing to work.
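A crude way to see this supply effect, using a toy model in which demand is fixed and wages scale with the demand/supply ratio once supply falls below demand (all the numbers are invented for illustration):

```python
def toy_wage(remaining_programmers, inelastic_demand=50, competitive_wage=100_000):
    """Toy model of a legacy-language labor market: while supply exceeds
    the (inelastic) demand, wages sit at the competitive level; once supply
    falls below demand, wages scale with the demand/supply ratio.
    Every parameter here is made up for illustration."""
    scarcity = max(1.0, inelastic_demand / remaining_programmers)
    return competitive_wage * scarcity

for n in (500, 50, 5, 1):
    print(n, toy_wage(n))
```

Wages are flat while there are plenty of programmers, then climb steeply as the pool shrinks below what the installed base needs.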
Minimal payoff at first, enormous payoff as people die — obviously, the financial profile of COBOL perfectly matches that of a tontine. And we ban tontines specifically for the incentives they create; if you recreate the same payoff structure using a different mechanism, the case for banning it is as strong as it ever was.
Regulations are useful when individual actors don’t have an incentive to mitigate negative externalities. Legacy software creates negative externalities. There is not a good case for leaving things as they are, so we must subsidize or ban teaching people legacy programming languages as soon as possible.
My understanding of GKC only deepened when my kids got old enough to ask me to justify my decisions, such as my argument that we shouldn’t have cookies for breakfast.