The anxiety of algorithms
Algorithms are funny. But in a way, not really new. That is, if you consider algorithms loosely as something with a set of variables that determines a certain number of outcomes, where you either don't know what those variables are or what the outcomes will be (usually it's a combination of both).
So in a way, it’s like Nature, tribes and the concept of ‘other people’.
Since the beginning of recorded history, humans have been a tribal bunch. Social by nature, yes, but tribal at the core. And tribes, by definition, fear what's outside their borders. Sometimes it's a tiger. Sometimes it's a sandstorm. Sometimes it's another tribe.
This fear of what's outside is a self-preservation mechanism. You trust what you know — those around you — and what you can control — the variables that determine what happens around you, such as food production techniques. If you can neither trust nor control, anxiety kicks in. For ages, that's what led our species to label unexplained phenomena as acts of punishment from God. External things need explanations in order to be bearable.
On another religious level, Zen Buddhism holds that humans can only be happy once they become one with Nature. Which is to say, recognising that losing control to the elements can feel normal, even good. With obvious exceptions, of course — I'd advise against applying Zen Buddhism during a tsunami or hurricane. But at the core, that's the guiding principle.
Very few people successfully accomplish this. As much as I try to stay relaxed and feel connected to my natural surroundings and my breathing, the truth is that I always have something on my mind keeping me anxious. It's in my nature, and many people are like this. It feels to me that there's a parallel between this and learning to fully trust an algorithm, both as individuals and as a collective.
Algorithms exist to manage and often summarise information that we can’t easily manage or summarise ourselves. Things like stock exchange data or indexing a new constellation or filtering out what matters on your Facebook feed.
On the one hand, we need them for this. Our brains are simply not built to process and go through tens of thousands of lines of code or spreadsheet rows or news stories to make a decision — all this in a fraction of a second. On the other hand, there are grounds for us to feel sceptical and even anxious — after all, we often don't really know what such algorithms use as variables to determine an outcome, and therefore they instinctively feel like they're not from our 'tribe'. We can neither control nor manage them. We can trust them, but trust becomes hard when you feel that a machine always has something to hide (even if not on purpose — for now, at least).
If you think about it, the thing that makes today’s algorithms more anxiety-inducing is perhaps the fact that they’re not really ruled by people. Of course, people create algorithms, but once the point becomes to create automated mechanisms, you either trust (to some extent) what those mechanisms deliver back or you can never be one with them.
Compare this to the oldest news-gathering algorithm there is — journalists, editors and news companies, whose job was precisely to filter the world so we wouldn't have to — and the gap becomes clear. With obvious exceptions and political views aside, it's still easier to trust the editorial criteria of the New York Times than it is to trust Facebook's News Feed algorithm to determine what's worth reading on a given day.
Which brings me back to the first point. (Tech) algorithms naturally create anxiety because they feel external — not of our kind, not something we can control, and therefore not something we can trust. Which is fine. But equally, if we're to take a lesson from Zen Buddhism, then letting go of that control is a huge part of how we can best work and live with such systems.
I’m not saying it’s easy — FOMO exists for a reason. I’m also not saying that we should blindly trust machines that process all that information for us — fake news and purposefully misleading information make fact checking and doing your own research more important than ever. But if algorithms are increasingly inevitable to how modern society functions (and it sure feels like it), letting go of trying to control them feels like a natural next step.