It’s become a cliché you hear at nearly every tech conference (and lots of non-tech business events, too): “Software is eating the world”. It was a provocative framing by Marc Andreessen (and pretty effective marketing for his venture capital firm, Andreessen Horowitz), but it missed one of the most important parts of the story.
Yes, nearly every industry is being transformed by the power of technology. But it’s not just business processes that get changed — it’s the way workers are valued, the economics that shape industries, even the basics of how we communicate with each other and learn about the world. Software didn’t eat the world: it bent the world to fit the values of people who make software. And not everybody was happy with the result.
In programmer speak, there’s one big question to answer: when software ate the world, was that a bug or a feature?
Find All Positive Values
The good news is, there are many positive values shared by most people who make software. (We’d hope so, since that’s a group that includes all of us at Fog Creek Software!) At the risk of overgeneralizing, tech workers typically value creative expression and personal autonomy, they’re often skeptical of legacy institutions that have become unresponsive to their constituents, and they have a deep, earnest and abiding optimism that even the biggest problems can be solved.
But that’s not the whole story.
Amidst the current global rise of populist movements, we’re seeing the first widespread backlash against tech since the dawn of the internet era. Some of it is basic economics — tech people got rich at a time when not a whole lot of other people did, and some of the ways they got rich have started to feel like an ugly surprise. People are deeply worried about the effect tech has on their privacy and security, and on their jobs and the economy. That’s to say nothing of the cultural shifts wrought by ubiquitous connected devices and social media.
A lot of that public mistrust can be attributed to some of the more negative tendencies in tech culture. In all things, our best traits can sometimes be our worst weaknesses. Tech is no exception.
That love of personal autonomy can shortchange a sense of collective responsibility. That skepticism about institutions can yield a “disruption” where imperfect systems are replaced by no system at all, or one where only those making the tech benefit. That optimistic belief that every problem can be fixed sometimes leads people making tech to think they’re the only ones who can fix things. And all of these problems are severely exacerbated by tech’s persistent failings at inclusion, which means the harm falls on marginalized communities even more acutely.
Fixing Big Bugs
One of the most common tasks in making software is simple: fixing bugs. Historically, we think of “bug fixing” as relatively straightforward stuff — maybe your app doesn’t look exactly right in that one user’s web browser. In the worst case, maybe it’s doing some calculation wrong, and you’ll have to update the software.
When our company, Fog Creek Software, was started as a little indie firm way back in 2000, we mostly saw bugs that way, too. We made a bug-tracking app and tried to help people make sure they were fixing what was wrong in their software.
While that was happening, our cofounder Joel Spolsky also wrote a lot about the culture of making software. Back then, at the height of the dot-com boom, it was seen as a bit eccentric to put as much focus on human factors and ethical behavior as our founders did. But it helped us win fans, and some of those people tried out the various apps we built over the years, and we’ve been lucky to keep thriving as what feels like one of the last few independent tech companies that’s still relevant.
But we missed something important, too. Those ideas and insights about how to treat people, how to listen to customers (and to communities), and how to be thoughtful and responsible in creating technology were even more important than anything we built into our software. They were the first steps to trying to fix what we could now think of as “Big Bugs”. Little bugs were mistakes in the software. Big Bugs are when we exacerbate (or cause!) major problems in society.
The Bite of the Big Bugs
What do we mean by “Big Bugs”? Software that exacerbates racial biases in the criminal justice system is a big bug. Security policies that put sensitive data from hundreds of millions of people at risk are a big bug. Apps that secretly spy on users (including, yes, Beyoncé) have big bugs. Undermining trust in legitimate journalism and exacerbating fake news? Yep, that’s a big, big bug.
So it’s time we make sure that we prioritize these big bugs right alongside the more obvious ones. Yes, check if your site looks good on an older smartphone, but also be sure that your data policies are respectful of your users. Of course you should address that persistent memory leak, but make time to improve password storage practices as well. And to be clear, we’re not pointing fingers here — we’ve been just as guilty of many of these systemic issues as anyone, focused on the “classic” definition of bugs while shortchanging our role in fixing some of the really Big Bugs.
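The “password storage practices” point is concrete enough to sketch. As a minimal example (hypothetical helper names, using only Python’s standard library), one widely recommended approach is to store a per-user random salt plus a slow key-derivation-function hash, never the plaintext password:

```python
import hashlib
import hmac
import os

# Hypothetical helpers illustrating salted, slow-KDF password storage.
# PBKDF2-SHA256 from the standard library; the iteration count below is
# an assumption on the order commonly recommended, not a fixed rule.
ITERATIONS = 600_000


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, derived_key) to store; the salt is random per user."""
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, key


def verify_password(password: str, salt: bytes, key: bytes) -> bool:
    """Re-derive the key from the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, key)
```

Even a small fix like this, shipped alongside the cosmetic bug fixes, is the kind of prioritization the paragraph above is arguing for.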
Maybe it seems presumptuous for a little software company to point out these flaws in a giant industry, or overly optimistic to think that our little community can make a difference in changing tech culture overall. But we saw it happen before, not that long ago, when the first wave of people blogging about and thinking about software online started to connect with each other. They pushed the state of the art forward with thoughtful conversations about design, accessibility, web standards, performance and so many more topics that we now take for granted as part of our checklist in building our apps.
We think it’s time that a new generation of coders tries to tackle this even more important set of issues around access, equality, equity and basic fairness. And the clearest way we can state it is very simple: Software matters.