Algorithms, Dictators, & The Surveillance Economy

Edward W (Semantic Paint)
Published in The Graph
9 min read · Nov 15, 2016

A behind-the-scenes look at how our content is dictated to us.

Browsing the web is a funny thing. On one side of the screen we’re busy looking for information or entertainment, and on the other side algorithms are busy looking for information about us. Sort of like a two-way mirror, and we only see the mirror side. It’s slightly unsettling.

After digging into this, I’ve discovered a lot of stuff goes on between us and the web that I didn’t know about. Stuff that affects what we get to see. Below is what I’ve learned about the algorithms that push content to us while they pull data from us.

First, a screenshot that is both funny and scary at the same time.

Google — now with extra relevance

Note the search term I entered, and the pop-up in the upper left corner. It’s a real picture taken from my laptop. I had to take a screenshot; it was too surreal to ignore.

I found it funny for a couple reasons.

One, I misspelled ‘surveillance’, and now the whole world can see that. I’m OK with that; ‘surveillance’ is actually a pretty hard word to spell. Really, try to spell it without looking at the word. It ain’t easy.

Two, I was looking up the phrase ‘surveillance economy’, and Google was nice enough to demonstrate exactly what I was searching for. I mean, this is service above and beyond what I was expecting. I was hoping for search results, but they didn’t just give me search results; they actually tried to find out my exact location. They revealed the surveillance economy better than any search result could have. Perfect relevance, unplanned and hilarious.

And a bit scary. I hadn’t even gotten to the word ‘economy’ (in ‘surveillance economy’); I had just typed the first part of the phrase, and Google had both offered search results and offered to put me under surveillance. Not sure if that is extra generous or extra creepy. Should I thank them or tell them to piss off? And the strange thing is, I don’t know why Google wanted my location. I wasn’t searching on a map, and I wasn’t looking for directions to anywhere. What did I trigger? What is going on behind the scenes here?

I don’t have a lot of answers, but I do have some. Here are the things I found most interesting.

Recommender Systems

Go on the web, and every choice you have in front of you is decided by an algorithm. Like an authoritarian father figure who takes control, these algorithms dictate what content you need to see and what content to steer you away from. Faced with the flood of information that we have to wade through whenever we search for content, we are forced to rely on these algorithmic pied pipers to help us find what we are searching for. They lead, we follow.

Choices? I’ll give you your choices!

These algorithms are housed in things called ‘recommender systems’ — complex decision-making machines that, unsurprisingly, ‘recommend’ content. They recommend based on filters, and they serve as our windows to the world of information. The good news is that not all of these systems are the same, although many are cut from the same cloth.

Filter 1 — The past you vs. the potential you

Many recommender systems base their recommendations on the choices you made in the past. Works great for machines that stay static and unchanging. Not so ideal for humans with the potential to grow and change.

Ever gone on a trip somewhere warm, and when you returned every ad you saw on YouTube or through Outbrain was an ad for travelling somewhere warm? After you had already returned from your trip?

It’s because these recommender systems base their recommendations on your past behavior — on who you were. It’s just tough to engineer an algorithm to help with current user goals. It’s possible to do, just not common.
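
Here is a rough sketch of the idea in code. To be clear, this is a toy Python example I made up, with invented item names and tags, not anyone’s actual system. The point is that the only input is the past you:

from collections import Counter

def recommend_from_history(past_clicks, candidates, k=3):
    """Rank candidate items by how well their tags match past clicks."""
    # The user profile is nothing but a tally of old behavior.
    profile = Counter(past_clicks)
    scored = {
        item: sum(profile[tag] for tag in tags)
        for item, tags in candidates.items()
    }
    return sorted(scored, key=scored.get, reverse=True)[:k]

# One warm-weather trip dominates the click history...
history = ["beach", "beach", "flights", "beach", "hotels"]
items = {
    "Cancun package": {"beach", "flights", "hotels"},
    "Ski weekend": {"snow", "flights"},
    "Local cooking class": {"food"},
}
print(recommend_from_history(history, items))
# ['Cancun package', 'Ski weekend', 'Local cooking class']

...and the beach ads keep coming, even after you are back home. Nothing in the profile ever asks what you want next.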

Filter 2 — The money generated by surveillance

Recommender systems that use your past behavior to inform their algorithm have a pretty unpleasant demand. They need to know your past behavior. Every bit of it. Which means they rely on surveillance.

There is no nice way to put it. The algorithms recommend recipes by referencing the types of recipes you’ve searched for in the past, recommend doctors by recording the types of doctors you’ve searched for in the past, and recommend news articles by knowing the types of articles you’ve clicked on in the past.

Google, Facebook, YouTube, and other industry giants have created an entire economy around surveillance. Their business models are based on trading data about you for money. They get paid because of the data they have collected by surveilling their users. You are the product they are selling. And if they get paid for collecting data, you can be damn sure they are going to collect as much of it as they can. They are going to filter and analyze and send you wherever works best for them.

Not the best environment for untarnished windows to the world of information.

There is a name for this economy. The surveillance economy. There are a couple different economies going on, which can get confusing, so let’s clarify this quickly.

It all depends on which side of the fence you sit on.

The attention economy?

Corporations need to think about how to survive in the attention economy, where users are constantly paying attention to different things. As attention spans go down and distractions go up, you are going to hear more and more about the attention economy. Financial transactions may not be the most important thing out there; attention transactions may become the biggest priority.
Seth Godin summed it up perfectly 5 years ago:

Every interaction comes with a cost. Not in cash money, but in something worth even more: the attention of the person you’re interacting with. — Seth Godin, sethgodin.typepad.com (2011)

It’s actually really interesting stuff, but that is a rabbit hole all on its own.

Moving on to the surveillance economy.

The surveillance economy

Individuals need to think about how to survive in the surveillance economy, where companies are constantly finding ways to get more and more detail about the personal lives of their users. Call me crazy, but I’m not so sure you are going to be hearing a lot about the surveillance economy. It just feels like online media companies won’t be rushing to say “putting users under constant surveillance is a bad thing”.

This perspective is pretty relevant if you use WhatsApp. It uses end-to-end encryption to keep your messages private. Your metadata, though, like who you talk to and when, now gets shared with Facebook. So only Facebook can snoop on that data.

Hey Facebook — I’ve got an idea. How about keeping your nose out of my business? It’s a personal thing, in every sense of the word.

Back to filters.

Filter 3 — The brick vs. the butterfly

Algorithms conglomerate data into pigeonholes. Something along the lines of “these people are all similar, so they all go into the same pigeonhole”. The industry calls it ‘collaborative filtering’. It’s like the algorithm says, “Most people who are interested in this are also interested in that, so we will compact them into the same group”.

Pink Floyd seemingly foresaw this approach and commented on it in one of their iconic songs. There is a wall of big data. You are just another brick in the wall, piled in with all the others who are just like you.

Except you aren’t just like them. Or hopefully not. Maybe we aren’t all beautiful butterflies, but I’d argue we are at least bricks with character.

The strategy of squashing a bunch of people with roughly similar search habits into the same category makes it easier for recommender systems to generate recommendations. It does not make it easier for you to find material that reflects your unique interests.

Engineering Serendipity

So there are a lot of problems with how content gets delivered to us. How can a system find the exact thing you are looking for, the perfect content that you didn’t even know was right till you saw it? How can it deliver delightful content while respecting users’ privacy and uniqueness?

The problem is, we need a bit of ‘collaborative filtering’, that term I was so harsh on earlier. There’s a useful side to this filter.

Collaborative filtering builds a model based on two things:

1) our past behavior;

2) similar decisions made by other users.

This model can then predict stuff we might be interested in. Which overall is invasive, but also usefully predictive. It helps recommender systems give us the content we want. If we get rid of it, we are metaphorically throwing the baby out with the bathwater.

Which isn’t ideal. I support safety for babies, even if it is just a metaphor.
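
To give the baby its due, here is roughly how collaborative filtering makes its predictions. Again, a toy sketch with made-up numbers, not any real platform’s code:

import numpy as np

# Rows are users, columns are items; 1 means "clicked", 0 means no signal.
ratings = np.array([
    [1, 1, 0, 0],  # user 0
    [1, 1, 1, 0],  # user 1, mostly similar to user 0
    [0, 0, 1, 1],  # user 2, different tastes
])

def recommend_for(user, ratings):
    """User-based collaborative filtering with cosine similarity."""
    norms = np.linalg.norm(ratings, axis=1)
    # How similar is every user to this one?
    sims = ratings @ ratings[user] / (norms * norms[user])
    sims[user] = 0.0  # a user is not their own neighbor
    # Score each item by similarity-weighted votes from the neighbors.
    scores = sims @ ratings
    scores[ratings[user] > 0] = -np.inf  # hide items already seen
    return int(np.argmax(scores))

print(recommend_for(0, ratings))  # 2

User 0 gets item 2 recommended purely because user 1, the nearest brick in the wall, clicked it. Invasive and usefully predictive at the same time, which is exactly the trade-off.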

So I’ve got a suggestion. Change the business model. Get paid for different things.

Instead of getting paid to snoop on users and sell data about them, set up a business model where you get paid based on how delighted your users are with the content you push to them. Make money based on the relevance of your suggestions, not so much on the data you collected.

Or how about getting noticed, and paid, for being up front about how content gets chosen? I for one would much rather interact with a system that tells me what it’s basing its decisions on. Me guessing about why the system wants my location doesn’t work out well for anybody.

And seriously, if I could get behind the mirror and do the adjusting myself, that would be very cool. How about setting up a system where we can tweak the knobs that weigh and measure and decide what gets pushed to us?
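
Something like this, say. A hypothetical scoring function where the weights are knobs the user can see and turn. All the signal names and numbers here are invented for illustration:

def score(item, knobs):
    """Blend ranking signals using user-visible weights."""
    return (knobs["like_my_past"] * item["history_match"]
            + knobs["something_new"] * item["novelty"]
            + knobs["popular_now"] * item["trending"])

items = [
    {"name": "Another beach ad", "history_match": 0.9, "novelty": 0.1, "trending": 0.4},
    {"name": "Intro to semiotics", "history_match": 0.2, "novelty": 0.9, "trending": 0.2},
]

# The past you: familiarity turned up.
past_you = {"like_my_past": 1.0, "something_new": 0.1, "popular_now": 0.3}
# The potential you: exploration turned up.
potential_you = {"like_my_past": 0.2, "something_new": 1.0, "popular_now": 0.3}

for knobs in (past_you, potential_you):
    print(max(items, key=lambda it: score(it, knobs))["name"])
# Another beach ad
# Intro to semiotics

Same items, same data, different person at the knobs. That’s the kind of choice I’m talking about.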

CEOs and programmers out there, are you listening?

Electing leaders

There is a point to this post, other than getting on a soapbox about the sad state of our windows to the web. Here is that point.

Business models that profit by surveillance and use the past you to direct recommendations are getting old. They can’t live on forever by exploiting data collected about us and assuming we are static machines.

Well, they can live on forever unless other business models appear that sell relevance and quality and offer us choice. If business models emerge that help us explore and assume we are individuals who have the potential to develop and grow, those old models based on exploiting our data are going to get a shakedown. A breakdown. A takedown. Ya.

If I had to choose between a typical search engine and a site that gave me transparency, choice, and great content, I would choose the site over the search engine any day. Wouldn’t you?

So talk about it. Bring up the surveillance economy. Discuss and share information about it. Get people talking about what’s wrong with the current business models that lead us to content on the web.

Make transparency and choice trending topics. In a very real way, demand to elect who is on the other side of the mirror.

And thanks for reading.

Material referenced in this post, in order of importance:

Bob Seger — Shakedown

Recommender systems — Wikipedia

The Attention Economy and the implosion of traditional media

Collaborative filtering — Wikipedia

How secure is WhatsApp? How the surveillance economy stopped being fashionable

