The past controls the future

Alexander Lange
Published in equilibre
Oct 28, 2019

Surveillance capitalist feedback loops, enslaved digital twins, human rights, optionality.

>> First published on https://svrgn.substack.com/, join us there. I will share research, thoughts and analysis around freedom technologies every two weeks, or whenever I have something to share. No spam. <<

The web didn’t turn into the utopia we were hoping for — a place where everyone has access to information and global services and can openly communicate with the world at almost zero cost and in private. Instead, we see global misinformation campaigns, gatekeeping, censorship, surveillance, and the hoarding of mass behavioural data by corporations and governments, limiting humanity’s ability to learn from it collectively. Let’s explore how the surveillance capitalist feedback loop works, how we gave up our freedom, why this is a human rights emergency, and what we can do about it.

The surveillance capitalist feedback loop

The framework below lays out a series of steps performed by all sorts of algorithms running the applications we use every day — from newsfeeds and music recommendations to health treatments, search results, matchmaking, and almost any other online activity.

Let’s dissect it.

Input can be any form of data stream measured by a ‘sensor’ of any type: IoT devices, cameras, smartphones tracking our geo-location, sleep activity, health metrics, search history, websites visited, transaction histories, and so on.

Monitoring is the second step: every piece of information the sensor receives is stored in a database, creating a permanent record of every thought, action, or transaction we undertake. The data is sold continuously and mostly programmatically — with our explicit consent but without our actual understanding. South Park put it very well here. Being watched isn’t that big of a deal, nothing to hide…? Being under the constant scrutiny of a third party deeply alters human behaviour, as the watched unconsciously comply with the supposed social norms of the watcher.

Source https://paulcairney.files.wordpress.com/2013/10/panopticon.jpg

In the 18th century the philosopher Jeremy Bentham designed a new type of prison. The Panopticon is a ring-shaped building with a watchtower at its centre whose windows are blinded. Since the inmates could never tell whether they were being observed, they felt a sense of invisible omnipresence at every moment. One historian described it as “a device of such monstrous efficiency that it left no room for humanity.”

Through the commercial web we might have created a panoptic device for our minds.

Profiling is the next step undertaken by the surveillance machine. Age, gender, political orientation, sexual preferences, wealth, and social status all become visible. Since Cambridge Analytica we know that conclusions about our most intimate secrets — our psychological profile, emotions, desires, fears, and vulnerabilities — can be derived from our online behaviour.

Source: https://www.cbinsights.com/research/what-is-psychographics/

Imagine any authoritarian regime in history had had access to such tools. How long would it have taken to identify all potential political dissidents, Jews, Kurds, gay people, or any other minority in a country? A few minutes at most.

Prediction of behaviour is what comes next, literally. Based on historic behavioural patterns, conclusions can be drawn about future actions — even ones not obviously linked to the data at hand. In 2012 Target figured out a teenage girl was pregnant before her father did, based on a subtle change in her purchasing behaviour. Another interesting field is predictive policing, also known as ‘pre-crime’ initiatives. You get the idea.
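The mechanics behind the Target story can be sketched as a toy scoring model. Everything here is made up for illustration — the product names, the weights, and the threshold are hypothetical, not Target’s actual method:

```python
# Toy sketch of behaviour-based prediction, loosely inspired by the Target
# case. Products and weights are entirely invented for illustration.

# Hypothetical weights: how strongly each purchase correlates with the
# predicted condition in this toy model.
PREGNANCY_SIGNALS = {
    "unscented lotion": 0.4,
    "mineral supplements": 0.3,
    "large tote bag": 0.2,
    "cotton balls": 0.1,
}

def pregnancy_score(purchase_history):
    """Sum the weights of signal products found in a purchase history."""
    return sum(PREGNANCY_SIGNALS.get(item, 0.0) for item in purchase_history)

history = ["shampoo", "unscented lotion", "mineral supplements"]
print(round(pregnancy_score(history), 2))  # -> 0.7
```

A score above some threshold would trigger targeted ads — the person never told anyone anything; the pattern in the data did.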

Source https://www.geeksofdoom.com/2015/12/01/tv-review-minority-report-1-10-everybody-runs

Manipulation of behaviour at scale, in a highly automated manner, is the holy grail of advertising, political campaigning, and ruling the world. Based on a person’s psychographic profile and knowledge of her desires and fears, content can be programmatically customised to trigger the desired action — purchasing a car, voting for a demagogue, or maybe conducting an act of violence. Election hacking has only been the tip of the iceberg.

Amplification is an effect of the previous steps rather than an action executed by an algorithm. The more actions we take, the deeper our personal registry becomes and the better the machines can guide our behaviour. More data → better algorithms → more data → better algorithms: it’s a feedback loop benefiting organisations that already possess vast amounts of proprietary data and fine-tuned algorithms. Data network effects.
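The whole loop — monitor, profile, predict, manipulate, amplify — can be condensed into a minimal sketch. The class and its frequency-based “model” are illustrative stand-ins, not a real system:

```python
# Minimal sketch of the surveillance feedback loop described above.
# Names and the "model" are illustrative, not a real product.

class SurveillanceLoop:
    def __init__(self):
        self.registry = []  # permanent record of every observed action

    def monitor(self, event):
        """Step 2: store every sensor reading, forever."""
        self.registry.append(event)

    def profile(self):
        """Step 3: derive attributes from the stored behaviour."""
        return {"events_seen": len(self.registry)}

    def predict(self):
        """Step 4: guess the next action from past frequency."""
        if not self.registry:
            return None
        return max(set(self.registry), key=self.registry.count)

    def manipulate(self):
        """Step 5: serve content that nudges the predicted action."""
        return f"ad targeting: {self.predict()}"

loop = SurveillanceLoop()
for event in ["news", "shopping", "news", "news"]:
    loop.monitor(event)  # every action deepens the registry (amplification)

print(loop.predict())     # -> news
print(loop.manipulate())  # -> ad targeting: news
```

Note the amplification built into the structure: every call to `monitor` makes `predict` (and hence `manipulate`) more confident, which is exactly the data network effect — more data, better algorithms, more data.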

The enslavement of our digital twins

‘Slavery is the condition in which one person is owned as property by another and is under the owner’s control, especially in involuntary servitude.’ — https://www.thefreedictionary.com/slavery

By leaving traces of everything we do online we are creating a digital twin — an exact representation of ourselves. That twin is not some abstract concept; it is the core of our personality. It might represent us even more precisely than our physical self, telling the world more about our emotional world, psyche, and social status than anything we could convey to another person in words. It is the purest representation of ourselves in the form of raw data. It lives inside siloed databases controlled by corporations and, through them, governments. It is legally owned by third parties — just like a slave.

If those corporations or outside attackers decide to tamper with our digital twins, that will have severe implications for our future — think of criminal records, credit scores, associations with dissidents or criminals, dark-web browsing histories, online impersonations. A low ‘score’ won’t get you on board the next plane. It won’t get you the job or the flat you’d like to have. Our future windows of opportunity depend on our digital past — it’s a path dependency. Our ability to break out of the norm, to evolve as individuals, to get a second chance, and to achieve something statistically unlikely is at stake.

Source: https://fortune.com/2018/10/28/in-china-facial-recognition-tech-is-watching-you/

Many have complained about human rights violations with regard to China’s social credit system. I don’t see any meaningful difference from our own digital reality. Or, to use the words of Edward Snowden: ‘if China does it, it’s very likely we do it as well’.

The past controls the future.

A dark picture

Yes, the picture seems dark. Of course there are plenty of positive counterexamples — predicting a person’s future health issues and providing help in time, or preventing a crime by having police officers show up at the right place and time based on prediction algorithms. Technology is neutral and can always be used for good or ill.

But that is not the point. The point is that we are witnessing unprecedented power asymmetries without any democratic legitimation or even public debate.

Better governance and optionality

To defend our human rights in the future we need to take action. Now. But how?

Opt 1) Opting out: We could opt out of systems that take our digital twins hostage. But do we? Stop using a smartphone or laptop and replace it with a Nokia 8210? Back to the primitive! Historically, the group with the better technology has always won.

Opt 2) Political activism: Public interests rarely have a strong lobby behind them until a critical mass of roughly 5–10% of the population is reached. A handful of organisations fight for human rights in the digital realm, such as the EFF or Coin Center. We need more of them, and we need resources.

Opt 3) Alternative freedom technologies: We could opt into freedom technologies that help us defend our rights. We could encrypt our mail with PGP or ProtonMail, use the Tor Browser, use privacy-preserving instant messaging services such as Matrix/Riot or Signal, use privacy-preserving cryptocurrencies such as Zcash, Monero, Grin, or others [note: the author holds small amounts of these assets], or use DuckDuckGo instead of Google. But could we?

Not really. Not anymore. Not yet. It’s complicated.

Most of those services are either a) horribly outdated, b) very hard to use, c) compromised through backdoors (and therefore vulnerable to any third-party attack), or d) increasingly criminalised by governments. The Five Eyes are pushing for backdoors into encrypted messaging services. Even German politicians — generally known for their progressive approach to data protection — are criminalising volunteers running Tor nodes. The argument is always the same: protecting the public from child molesters, terrorists, and drug dealers. With fear, every law can be justified.

Outlook

I believe that people will wake up and fight for their freedom very soon. The sentiment shift around the climate crisis (which I didn’t expect to happen at all) might be repeatable for other pressing societal issues.

People are aware of what is happening in Hong Kong and what role surveillance technology plays in that conflict. Mainstream media are slowly catching up with the topic, and the activist community is growing quickly. Entrepreneurs are building next-generation privacy tech at this very moment, and I couldn’t be more excited about it — as an individual and as a venture investor.

This post was first published on my https://svrgn.substack.com/ newsletter, join us there.


VC — founder @inflection.xyz — open economy | Ex Crypto Lead @IndexVentures @Earlybird, BD @Google | #opensource #openmoney #openfinance #openweb #openmedia