We Exist Elsewhere

Vytautas Jankauskas
Published in #getcached · 5 min read · Jul 16, 2018

It all began whilst listening to the stories of people who worked in global economics around the infamous 2008 financial crisis. Narratives beyond belief and reason ranged from bankrupt property managers, to striptease dancers with five villas on mortgage, to hermit economists who had foreseen the global meltdown, made a fortune, and retired to live somewhere deep in the Swedish forests. Rumour has it you might still find them there. Many of these fun facts emerged around 2014, shortly before The Big Short came out. Its plot, a sort of white-collar superhero rollercoaster, didn’t even register as a hyped Hollywood blockbuster. Correlating with the stories I had been told, for me the film testified to an interesting new reality. Simultaneously, digital gamechangers such as Palantir, the Boston Shuffler and Co., Bitcoin, Airbnb and Uber were gaining momentum. Back then I was thinking: how could the entire market (by market, in this context, besides the financial and technosocial ecosystems, I cynically mean us, the Living) be entrusted to algorithms? We had just learned that we could barely (if at all) trust people to run processes we had little to no understanding of. Hence, we collectively chose to shift our trust onto something arguably more efficient, less biased, less greedy; something superior, more rapid, omnipotent, omnipresent. That is, the black box.

From then onwards I started obsessively noticing trust artefacts everywhere and in everything. Inspired by designer and academic Daisy Ginsberg’s doctoral research on The Dream of Better, which investigates the human and, above all, the capitalist desire to constantly improve the existing, I started looking for public metaphors that either sought or promised trust. I remember passing a newly opened shop for electronic cigarettes, with its slogan “vapers you can trust”. There was very little scientific data at the time to support such statements. Uber, despite continuous scandals involving rape, sexism, and missing people that began emerging from the company’s earliest days, still managed to retain its audience through “Safety First” and other campaigns. Blockchain, alongside other distributed ledgers, promises to nail all the coffins shut, whilst millions of everyday people don’t know they’re using the Internet when they go on Facebook, and so on. At this point I cannot resist shallowly inviting the reader to think about the moments in which they have been publicly asked to trust something or someone.

When traditional roles, such as politicians, law enforcers, bankers, and aid workers, are suffering from a loss of trust, algorithms offer an opportunity to transfer it somewhere else. We trust the Internet of Things to enter our homes, smart wearables to measure our health, cars to drive instead of us, Google to answer our most intimate questions. Technology has enabled us to confidently enter an unknown person’s car and inhabit their apartment. Blockchain and crypto promise to eliminate fake news, reshuffle wealth, even reinvent the creative industries. All that is required for technology to become (here we go) better at dealing with all of the above is a little more of your data. It already has quite a lot, but not quite enough, or not always the right kind. For example, it can infer whether you have raped someone based on your Google searches; it can project your face onto a porn star mid-scene. It can also track how often you attend church and whether you are a strong believer, and based on that, it can calculate how likely you are to repay a loan. Technology has multiple ways of knowing what your home looks like, whether you’re currently in it, and whether you have pets. It can overhear your private conversations, now in several ways. Since that’s not really enough, it will soon see you naked, as you stand in front of a smart mirror. Would you trust a human being to know so many things about you?

Besides entrusting technology with our data, we also tend to build relationships in, through, and thanks to it. My colleague Jon Flint and I barely touched on the ways users allow intimate aspects of their relationships to be mediated by algorithms in a recent project, Somestic Media. It is fascinating how present technologies and services determine how we maintain friendships, keep track of each other, and even fall in love. Design ethnographer and futurist Nicolas Nova recently pointed out (in French) that if we had asked people in the 80s to predict the future, they would have imagined flying cars, but not that we would use glass bricks in our pockets to hit on each other. Another interesting reference that has surfaced in our discussions with the Team Trust here at thecamp is the Schopenhauerian “Hedgehog’s Dilemma”, which describes a group of hedgehogs trying to stay close to each other to survive a cold winter, yet forced to keep their distance because their spikes hurt one another. The common intention of reciprocity is disrupted by an unavoidable limitation. In other words, the closer you are to another person, the more likely you are to get hurt. We hypothesised that the instantaneity of digital communications further amplifies this effect. Smartasses that we are, we then named it the “Sonic the Hedgehog Dilemma.” Surprisingly, there seems to be very little work that tackles the modern implications of Schopenhauer’s dilemma, apart from maybe this little rant (please, prove us wrong!).

And so here we are, out in the always-connected (quite often FOMOed) digital world, in love, close yet distant, obsessed, at times happy, and finally, hurt. We trust all these feelings, encounters, relationships, and situations to be mediated through and by technology. However, we also tend to forget that digital technology is nothing more than Boolean ones and zeroes, chunks of code. It is created by human developers who are often biased and limited in their understanding of the social and moral consequences of the code they write.

Our goal during the project at thecamp is to visualise our leaking personal data in relation to the saturated, idealistic image of ourselves we tend to present on social media. We also want to emphasise how carelessly we often project emotions onto algorithms, misperceiving intangible networks as complex apparatuses for our relationships. Machines now possess a limited yet complex image of us. By exploring and prototyping ways to reveal and tangify that image to our human selves, we hope to foster a more balanced opinion on algorithmic trust and the technological capacity to understand humans. My personal hope is that the project will eventually scale up to reflect on wider, urgent topics around data: for example, refugee data security at the UN, or the building of interorganisational digital trust. What we seek is informed trust, rather than careless or absolute trust. After all, digital technology is unlikely to disappear from our lives. Whether we trust it or not, we have to acknowledge that we exist elsewhere.


Vytas is an artist and designer, exploring digital and physical identities, and the technological transformation of the domestic mundane.