“Learn to Code” is Strictly Better Than “Tech Bros Should Learn Humanities”
One of my annual traditions is to alternate between trying and failing to learn Haskell and trying and failing to learn Scheme. It’s a comforting cycle. Both languages are, in different senses, very pure: a program written in either language is also a succinct expression of exactly what that program is for.
Older programming languages are basically a human-readable notation for what a computer does, but the purest ones are a human-readable notation for what the programmer is trying to accomplish. It’s not a coincidence that Scheme and Haskell come from academic theorists. Python, the language I can actually accomplish things in, is descended from a language meant for teaching.
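To make that concrete, here’s a toy example of my own (nothing taken from SICP or the Haskell book; the function name is just something I made up): the definition reads as a statement of what I want, not as instructions for the machine.

```haskell
-- A toy illustration: the definition below describes *what* we want
-- (the squares of the even numbers up to n), not a step-by-step recipe
-- of loop counters and mutation.
evenSquaresUpTo :: Int -> [Int]
evenSquaresUpTo n = [x * x | x <- [1 .. n], even x]

main :: IO ()
main = print (evenSquaresUpTo 10)  -- [4,16,36,64,100]
```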
Python is fine, but it makes me feel guilty. It’s like reading a work of pop history when Herodotus is sitting right there on the shelf. Glaring at you.
So, periodically I get very ambitious and crack open SICP or start reading about type theory. At some point, I lose momentum and switch from trying to do things the right way to actually doing them the easy way.

A few years ago, in a Haskell phase, I signed up for a reading group that was working through Haskell Programming from First Principles. Great book, great group. And one of the notable things about this group was that, while the attendees were almost all CS students or professional programmers, the host was an academic. He’d picked up Haskell for fun while getting a master’s degree in English.
This struck me as a surprising counterpoint to the narrative that the tech industry suffers from a paucity of humanities. There are really only two problems with that narrative:
- Tech people, in my experience, do in fact have more exposure to high culture than the average person. Toting around a copy of The Power Broker is practically de rigueur in some circles; Zuck fired up the FB team after Google+ launched by quoting Cato; Bill Gates always took stacks of books with him when he went on vacation while running Microsoft; every designer knows who Charles Joseph Minard was; Paul Graham’s e-commerce startup was a pivot away from his attempt at a career in painting; Andreessen “often coded alone at night as he listened to opera at full blast”; Marcus Aurelius and René Girard are trendy enough that people reflexively roll their eyes; Google put basically every book published at least a century ago on the Internet for free; and Amazon started out by making every other book available on the Internet for cheap.
- Meanwhile, liberal arts people critique technologists by saying, more or less, “Building a product is trivial; you need to do the hard work of grappling with history and the human condition or your company will suck,” and yet these same people mostly don’t engage with technology in return. If you looked at a programmer’s website and saw a link to their favorite books in the sidebar, you’d think nothing of it; if you looked at a poet’s website and saw a sidebar link to their favorite GitHub repos, you’d be shocked.
The standard pro-humanities riposte to this is to say that even if tech people have some superficial knowledge of history and literature, they don’t really get it. They’ve skimmed, but they haven’t absorbed. And how do we know? Because Facebook and Google and Amazon aren’t run the way we imagine we’d run them, if we were in charge.
What can you say to that? There’s a pretty good test of whether or not a technology company is successful, and it goes like this: did the founders have a lucrative exit? There isn’t a good test of whether or not the company is good, in a deep moral sense. The best answer we have is: wait 200 years and see if historians have reached a consensus yet.
In a certain meta sense, the critique of tech-as-insufficiently-marinated-in-humanities is a rejection of the humanist outlook, which tells us: a human being is complicated, and can’t be judged superficially. You have to engage in some painstaking, agonizingly difficult psychological surgery to tease out real motivations and constraints. Some heroes have flaws, some villains rise to the occasion, but a Cliff’s Notes summary misses the point entirely. Zola went down into the mines to research Germinal; Sinclair worked at a meatpacking plant to research The Jungle.[1] If you really want to understand technology companies, you have to spend some time undercover.
But it has to go in that order: wonder what the role of technology in society is, wonder about the nerd protagonists of the present moment in history, and then learn enough C++ to get a job at Google. There are people who do it the other way around: some time in tech first, then some time outside of tech as critics.
Results: uneven.
Antonio Garcia Martinez wrote a very cynical memoir of his time at Facebook[2], and now writes fine meta-cynical commentary about tech news shibboleths. David Auerbach wrote a good memoir about programming and an incredible piece on the AIM/MSN duel for market share. Mike Judge’s Silicon Valley is influenced by the time he spent working in Silicon Valley.
But more recent tech-people-cum-tech-commentators are harsher. Anna Wiener’s memoir of working in tech is mostly a story of modern urban anomie.[3] Whenever someone angrily exposes the secrets of a field they’ve been involved in, the first question to ask is whether they’re more motivated by truth-seeking or ressentiment. That’s especially important in tech, where the power-law distribution of success, and the stochastic survival function of early-stage companies, means that a) almost everyone does worse than average (as long as by “average” you mean “mean”), and b) since failure is so random, success looks random, too.[4]
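To put rough numbers on the “mean” point, here’s a small sketch with a made-up toy distribution (not real exit data): when one big hit dominates, the mean lands far above the median, so most participants do worse than “average.”

```haskell
import Data.List (sort)

-- Ten hypothetical startup outcomes (arbitrary units) where one big hit
-- dominates the total. Not real data; just an illustration of a skewed
-- distribution.
outcomes :: [Double]
outcomes = [0, 0, 0, 0, 1, 1, 2, 3, 5, 88]

mean, median :: [Double] -> Double
mean xs = sum xs / fromIntegral (length xs)
median xs = (s !! ((n - 1) `div` 2) + s !! (n `div` 2)) / 2
  where s = sort xs
        n = length s

main :: IO ()
main = do
  print (mean outcomes)                               -- 10.0
  print (median outcomes)                             -- 1.0
  print (length (filter (< mean outcomes) outcomes))  -- 9: most outcomes sit below the mean
```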
So the tech/humanities culture gap is a sort of yin-yang of mutual incomprehension: any humanities PhD dropout can argue that a programmer naming her new software library “Anabasis” or “parthian_shot.py” is just LARPing as an educated person. Meanwhile, anyone in tech can compare the top salary at Wired to entry-level comp at Facebook or Google and say “You’re just jealous.”
The tiebreaker is to look at behavior. Not everyone in tech reads ancient history or Jane Austen, but enough of them do that it’s not especially remarkable. Whereas it would be remarkable if someone decided that social networks and search engines suffer from a serious deficiency of humanities, and built their own competitor in response.
The critiques have very different tones. “Learn to code” is an exhortation: it tells someone to find a new means of self-expression, and a new way to provide things other people in the world value. “Learn to code” means “Improve yourself.” The converse critique is not a suggestion to be a better version of yourself, but to be a different self, with different moral priorities. It’s the difference between calling someone fat and calling someone short: both are mean, but only one is purely cruel, because only one targets something the person can’t change. And calling someone fat has a sort of natural-law justification; we’re not meant to be fat! Animals mostly get fat in captivity, or when something goes wacky in the local ecosystem. And everyone can tell exactly what species Vitruvian Man is supposed to be.
“Computer nerds are dangerous, because they don’t think like me!” What’s easier, changing the fundamental way someone thinks or picking up a new skill?
What’s the worst-case scenario here? For a startup founder, the risk of embracing the humanities is the time you spend meditating on The Glass Bead Game while your competitors stay up late building new features to kick your ass at the glass bead game of customer acquisition. But the second-place prize in tech is more money than anyone needs, unless one of your needs is to save the world.[5] For a frustrated grad student, the worst-case scenario of learning to code is a cushy upper-middle-class job. Maybe a morally dubious one, but in an economy with complex supply chains you’re guaranteed to get your hands dirty no matter where you work, unless you form a vegan commune.
Humanities-focused tech critics face two real risks: one moral, one practical. Morally, they might find that the Manichean view of the tech industry is terribly flawed: every organization has to balance competing moral concerns, and will always end up falling short somewhere even if it legitimately pursues what it believes to be the best outcome. And practically, they might discover that it’s a lot easier for a programmer to understand literature than it is for a writer to learn to code.
[1] He didn’t work there very long, but a friend of mine told me some stories about his own stint in the same industry a couple of years ago, and seven weeks in a slaughterhouse sounds like a lifetime’s worth to me.
[2] I kept track, and in the entire book there are only three things he has a uniformly positive opinion of: casual sex, Belgian beer, and Michel Houellebecq.
[3] The article goes to odd lengths to be specific without identifying anyone by name. “A renowned seed accelerator in Mountain View had offered funding and connections in exchange for a seven-per-cent stake”. Who could it be!? “A technology conglomerate that first made its reputation as a Web-page search engine, but quickly became the world’s largest and most valuable private repository of consumer data, developed a prototype for a pair of eyeglasses on which the wearer could check his or her e-mail…” Is it AltaVista!? To be fair, when she describes someone as “a curly-haired twenty-six-year-old with a forearm tattoo in Sanskrit” she’s only narrowing it down to roughly 15% of the population of the Bay Area.
[4] Success is less random than failure because there are so many things that can go wrong: a bad cofounder, a mistimed launch, the wrong tech stack, a single bad partnership, waiting too long for funding, raising too much, and so on. When everything goes right, you win, and getting everything right takes good judgment, which has serial correlation. There’s a nearby possible world where Facebook failed, but it’s still a world where a real-name-based site with a fanatical commitment to uptime and clean design killed MySpace and Friendster.
[5] Elon Musk, for example, is probably capable of starving to death on two billion dollars a year. Getting to Mars has a higher burn rate than that!