Algorithmic Superstructures

Last weekend, I flew to the beautiful city of Utrecht in the Netherlands to attend ‘Algorithmic Superstructures’, a conference organised by the critical media centre Impakt. There are a number of videos from the event online, and I strongly recommend having a look.

The festival incorporated panels, talks, films and an exhibition at another site, and seemed to attract data doyennes from all over the world. So what is it about algorithms in particular that is capturing the imagination of creative thinkers at the moment? Elegant but morally ambivalent, they may be a handy modern shorthand for an eternal fascination: mysterious forces lying slightly out of the reach of human perception. Or is it that this finicky, almost administrative emblem is attractive at a time when we are looking for something to ease anxiety and restore order? “Things seem chaotic right now; maybe flowcharts can make us feel better”. One thing was immediately clear, though, looking around: there is no single algo-culture, but many different ideologies and perspectives, each with its own ‘algorithmic’ take on the world.

A Generation of the Algorithm?

The attendees of Algorithmic Superstructures were relatively young, and I wondered whether, rather than that loaded term ‘Millennials’, we might usefully talk of a generation of the algorithm. I’m keen to detach algorithms from tech, here, because nothing comes from nowhere, and the seeds of digital culture(s) were planted in the generations before computers were invented. Also, the more you interrogate the idea of ‘digital’, the more the term seems to dissolve into an arbitrary mixture of ideas no more unique to computers than they are to a human body or a star system or a plate of mashed potato.

That said, one unique consequence of the digital era is that we now have millions of young people dealing constantly in invisible or abstract materials. Their communities are shaped by characters on a glowing screen; their destiny is defined by neither the priapic excitement of exploration nor the dramas of war that shaped their ancestors’ lives. In the past, futures could be ‘won’, usually by ‘heroic’ white men, casting a ray of mythological glory down the ancestral timeline. But this algorithmic generation finds itself abstracted from history. The raging individualism of the recent past is suspect, at best, as these people have inherited its consequences: a social and environmental climate in crisis. Born into a world already apparently hurtling somewhere, without brakes, they want to work out what’s laying the track beneath them.

Destiny’s Children

The motif of the algorithm fits neatly with these questions of free will and power. Whatever age we are, we wonder how many of the problems in our world were set in motion years ago, how many we are creating, and how many create us. And we imagine that while we can’t see the secret processes, those with power over us can. The people who control what crosses our paths online, for example, must surely have insight. Mark Zuckerberg understands the underlying technological processes, and has some secret knowledge about their impact, right? In truth, of course, our tools operate at a very great distance from our understanding. No one knows what the hell they’re doing.

Amidst all this confusion, the algorithm has emerged as a useful shorthand for all kinds of things: digital systems, justice, networks, formulae, danger, rights and wrongs. Even at the level of predictive text, it somehow transmits a sense of the unexplained, as if something with agency is going on, just out of reach.

Rachel Coldicutt, CEO of the digital culture think tank Doteveryone, wrote a great piece recently about data:

“Data and connectivity are ubiquitous, invisible, and overwhelming. It’s worth remembering that the word “data” was originally a plural, as in: there are too many data, they are too hard to see, and their consequences are too difficult to contextualise. Data are not legible in the real world and — unlike other unexplained phenomena — there aren’t any myths and legends that make sense of them.”

As an artist who has completed numerous residencies about AI, I am fascinated by the idea that we might have myths and legends to give shape to our invisible digital entities. I don’t feel compelled to explain or illustrate received technical ideas through the work I make, though. Here’s where technology and science often miss a trick: art is a lot more than a drawing of something somebody else says is real, or a mouthpiece for another’s voice. Art has its own agency, and can lead collaborations. It shows us ways to make connections and imagine new scenarios and futures. It gives us the opportunity to imagine and visualise new power structures. This is, of course, only useful to those not benefitting from the existing ones. All things considered, then, it’s not too surprising that there was a healthy contingent of neo-Marxist thinkers at this festival.

The luxury of abstraction

I attended an informal round-table chat with the curators on the Saturday morning, and the title was an immediate talking point. While ‘superstructures’ has a specific Marxist meaning, this event seems to ring in an updated interpretation for the modern digital age. Algorithms are of special interest to today’s young neo-Marxists, apparently, because they obfuscate the location of power.

There was some discussion around the culpability of the media and PR, and the ethics of using the amplification privileges we have to spread information and make more people aware of what’s going on under the surface of their digital world. Someone wondered whether the extreme comfort we experience in the West is holding us back from enacting social change. We are good at abstract chat, but lack the physical pressures required to put our ideas into practice.

This had me wondering whether we could think in terms of a ‘luxury of abstraction’: the tremendous privilege that empowers some of us to separate things from their physicality, locale or meaning. At one end of that scale we’d have the new freedoms engendered by basic literacy; at the other, I suppose, a vision of our future immortality as some kind of floating holographic head. And while sometimes this kind of second-level thinking gives us insights, at other times, well — we all know what happened in The Emperor’s New Clothes.

And that’s the problem with abstraction: it needs to sell itself to the imagination in order to work at all. There’s no such thing as a spontaneously arising ‘collective imagination’, so politics steps in to tune everyone’s set to the same image. Maybe the thing most obviously detached from its origin, disconnected from the rigours of the physical world and abstracted from both its past and its future, is this algorithmic generation itself.

Sophie from Tactical Tech was there. She wondered what the role of artists might be in creating these new ‘cultural imaginaries’. Curator Yasemin Keskintepe replied that artists are becoming more hybrid, absorbing roles as academic researchers, language bridgers, and translators of political and tech ideas into a single work “that allows discussion to broaden and democratise”.

There’s an interesting observation in there about the arts in general becoming the ‘everything else’ place. Another way of thinking about that: the arts are in everything now, in the way that digital is. We are all somehow curators, creators, and technologists. This ran through the whole weekend, for me: we all have to be everything now. Our schedules and identities are an unprecedented muddle. No wonder we’re interested in algorithms.

Artist Kevin Lee took things to another level still, wondering: has AI, or tech, now acquired enough momentum that it might be considered to be wanting something? He also suggested major historic moments were artful: terror attacks; Trump. I took these ideas as a poetic way of saying that art has agency, and where there’s major agency, there’s often a kind of art.

Trust me, I’m some kind of hacker

A preference for humans over tech was a running theme of the weekend. It was there in the discussions about escaping the tyranny of screen-based devices, and it was there in the idea that deploying hundreds of humans to fact-check news stories was a far more reliable and sophisticated method than using AI.

It comes back to what we’re happiest putting our faith in, and I attended a panel called ‘(Infra)structures of Trust’ that covered a lot of ground from a civic/cities perspective.

The fascinating Arnaud Castaignet, Head of PR for the Republic of Estonia’s e-Residency programme, explained how Estonia’s community has been empowered through digital services. Did you know there are just three things Estonians can’t do online: marry, divorce, and buy property? Arnaud said: “Infrastructures can contribute to building trust in institutions”. Putting things in place doesn’t mean they’ll be used, though; only 35% of Estonian citizens bother to vote.

Jaromil, of the not-for-profit software foundry Dyne.org, was great fun. Nowadays, he says, we don’t use algorithms — they use us. He had some great examples of AI training games (like more extreme versions of Google Quick Draw) that effectively use game-playing children to develop military weapons. Jaromil’s big thing is restoring ‘sovereignty’ to the user. As soon as something seems to be empowering the user, he said, it’s hijacked — citing the way Trump has steamrollered Twitter. A lack of diverse voices is a threat to truth (he recommends this book), and we should ask ourselves ‘Who really benefits?’ from the flashy modern services cities offer — Airbnb, Uber, etc.

Bellingcat, Bellingcat, what are they feeding you?

A highlight of my trip was the packed-out talk by a chap from Leicester called Eliot Higgins, founder of online investigation community Bellingcat. This mysterious volunteer-run organisation turns heavy news stories into a kind of extreme game — spot the difference, find the truth, perhaps save some lives.

Bellingcat weaponise the scale and intensity of their community, helping law-enforcement organisations to identify people, dates and places from details in the backgrounds of images, and raking through the hinterlands of social media. Among other accomplishments (and probably many they can’t talk about), they have uncovered the identities of the Salisbury spies, analysed military information about bombings in Syria, and located missing people from the details in holiday snaps. Everything about this organisation is extraordinary, including its mere existence. They are funded largely by donations and the workshops they run on a regular basis, arming people with a kind of truth-checking toolkit that future generations will presumably have implanted into their spinal columns at birth.

Eliot works with technology, of course, and collaborates with high-tech specialists like Forensic Architecture. But perhaps more than any of the other speakers, he reinforced the preference for humans over machines. Working together towards a common goal, humans are considerably more effective than AI at almost everything — more sensitive, faster, better at operating as a ‘superstructure’. A computer’s pattern-matching turns up so many false positives that it has to be repeatedly checked. Eliot would rather work with people than algorithms, and as long as people are this intensely invested in understanding each other, I can see why.

You can watch Eliot’s talk here.