Whig Algorithmics

In writing about the fetishization of algorithms yesterday, and its implied values of disconnection and social withdrawal, I linked to Om Malik’s recent article in the New Yorker,

“In Silicon Valley Now, It’s Almost Always Winner Takes All”

I didn’t have this link in mind as a reference when I began to write, but I came across it on Twitter as I was drafting the original version of my comment. I read it, it seemed relevant, and I included it without a second thought. Coming back to it this morning, I wonder whether it isn’t even more relevant to the larger point at which I was gesturing than I had originally guessed.

Certainly Malik’s essay is unconvincing to me in several ways. For starters, I think he asks the phrase “network effects” to do too much work, and that any such effects in play for a company like Facebook will be very different from those that serve to entrench Google or Amazon or Uber. Similarly, I think it’s strange to sprinkle the word “algorithm” around like a kind of fairy dust: which algorithms are we actually talking about? And why do we think they give any of these companies any kind of advantage? Finally, he treats “monopoly” as if it’s some kind of eternal end-state for any successful business — whereas I would guess that, at the very least, both Google and Facebook are what someone like Albert Hirschman might have termed “lazy monopolists,” monopolies that are “more durable and stifling [as they are] both unambitious and escapable.”

But all of these are just minor quibbles; I believe the real failure, and the real danger, of the piece occurs at its very end when he writes,

… this is part of the technology cycle. Google, Facebook, and, perhaps, Uber are indicators of something bigger: in our connected age, data, infrastructure, and algorithms give companies a distinct advantage. … it is a winner-takes-all world.

Here we’re seeing something like his actual point: all these companies, these examples, are just symptoms of “something bigger.” He’s pulling back the veil and inviting the reader to catch a glimpse of the hidden reality.

I think there’s a natural inclination, among people who find themselves in possession of a remarkably effective new tool, to read into that tool their personal dreams and values. Cars are becoming cheaper and faster? Well then owning a car must be a part of the American dream! Digital cameras are now a reality? Maybe our cities will be safer when we put those cameras on every street corner! Computers have gotten smaller and more powerful? Let’s put a chip in every appliance in your home, and you can let your refrigerator shop for you automatically when you’re out of milk!

Algorithms are an incredibly powerful tool — possibly a universal science of tools — so it shouldn’t surprise us that when people discover algorithms, they tend to go a bit crazy. Here is an idea that might rationalize our political system, replace our bosses, discover new scientific truths, diagnose disease, write new books and music, translate foreign languages, assemble a universal library of human knowledge, and modestly reduce the severity of traffic jams: basically everything in our lives which irritates or annoys us but over which we exert very little control.

However, a significant danger of investing this kind of meaning in a new tool, and I think this is the danger fully on display in Malik’s essay, is that we begin to interpret it as an efficient cause for the world we see around us. Algorithms are no longer a process or a means to an end; rather they come to represent a universal explanation for any successful person, pivotal decision, or profitable business. If Google, Uber, Facebook and Amazon all “use algorithms,” and if they are “winning,” then (goes the story) they must be winning because of algorithms. It follows that every business should use algorithms, and better use of algorithms is the source of winning, and the winning will be a complete victory, the victor shall become an all-powerful monopolist, Q.E.D. forever and ever amen.

This human tendency, to see the march of history as an interpretable thing, divided into identifiable cycles, directed towards valuable goals, reflective of simple underlying principles, all confirming exactly the thing that we had hoped was true from the outset, is both powerful and natural — natural because it invokes outcomes we already know to be true, and powerful because it appeals to explanations we want to be sufficient.

But in the end, this kind of reflexive recourse to whiggish tech history (maybe call it “post hoc ergo propter hoc for postmortems”) complements and reinforces the algorithmic cargo cult that seems to seduce so many programmers and permeate so many startups. For every Jeong who writes about the destructive values implicit in algorithmic culture there is a Malik, who stands up and suggests that algorithms are a crucial component of existing, successful business and culture.

I hope that if we, collectively, choose to head down this algorithmic road — if we embed these values into our society and business and personal lives — that we do so with our eyes open, and not because a generation of disaffected young programmers decided that “culture” was hard, read an essay like this, and convinced themselves it could never have been any other way.