August Macke, Mädchen im Grünen. The Yorck Project (2002): 10.000 Meisterwerke der Malerei (DVD-ROM), distributed by DIRECTMEDIA Publishing GmbH. ISBN 3936122202. Public domain.

Big Tech loves privacy — as a frame

How information ideology prevents us from seeing the obvious.

Joop Ringelberg
4 min read · Apr 14, 2020


“Privacy” conjures up “personal data”, that is, information about persons. It evokes questions like: who owns the data? Who is allowed to store the data? Who is allowed to sell the data? Big Tech uses it to muddle the debate, as Facebook did when it maintained that it didn’t sell ‘your data’, meaning that it did not sell the data you actually typed (all the while keeping its fingers crossed behind its back)(1). And indeed it didn’t, yet harm was done by auctioning off data harvested by spying on people’s behaviour, and by the conclusions drawn from that data about their sexual preferences, socio-economic status, and so on.

Privacy is an elusive concept that, judging from its usage, comes in quantities. One thing is for sure: we do not have enough of it. That opens up endless possibilities for the industry to promise more of it. Facebook can go on improving its handling of privacy for years to come without ever touching the real issue.

All this centres on the notion of information as a physical thing that somehow carries meaning. Once you have the data, you have the meaning. We can then haggle over ownership and the like. But this sets us off on the wrong foot.

Above all, it gives us the false impression that this is simply how computers operate; that, alas, there is no alternative; and that the only way to repair the situation is through regulation or self-control.

But that is not true. The kind of IT we’re talking about (webshops, taxi services, dating services, holiday house rentals) does just two things when boiled down to its essentials: it bridges distance in space and distance in time between people. Space is obvious: just think of telecommunication. Time: think of a shopping list, which is essentially a message to your future self, covering distance in time. Pause for a moment to consider how such programs are about co-operation, and how they differ from text processors and spreadsheets.

While co-operating, people act and others take note of it. I might act by speaking, causing air to vibrate, which is then felt by my partners and miraculously interpreted, so that I have, in effect, conveyed my thoughts. Co-operation happens in a context where people have roles. We know our own role and that of our partners. We know what they need in order to function properly. If they need to know what we think, we make sure they do: by speaking, writing a letter or using a computer service. And conversely, we expect to be informed by them.

Now this is the proper starting point of an IT project for co-operation. Such a project should start with an analysis of the roles involved in the contexts that need support. It should list the actions that role-players perform and how they need to be informed to perform them correctly. When done properly, we have spelled out who needs what information. We will find that, because we’ve included the contexts in our analysis, attributing the right meaning to data is far easier than when a disembodied ‘information model’ is created that has lost sight of who speaks, who listens and in what context.
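To make that concrete, here is a minimal sketch (in TypeScript, with hypothetical names and a made-up holiday-rental example; this is not the author’s actual notation) of what the outcome of such a context-role analysis might look like: per context, which roles exist, what they do, and exactly which properties of their co-players they need to see.

```typescript
// Hypothetical sketch of a context-role analysis. All names are illustrative.

type RoleName = string;

// What a role may see of another role: exactly these properties, nothing more.
interface Perspective {
  onRole: RoleName;
  properties: string[];
}

// A role in a context: what the role-player does, and what it needs to know to do so.
interface Role {
  name: RoleName;
  actions: string[];
  perspectives: Perspective[];
}

interface Context {
  name: string;
  roles: Role[];
}

// Example: a holiday-house rental, reduced to its essentials.
const rental: Context = {
  name: "HolidayRental",
  roles: [
    {
      name: "Tenant",
      actions: ["book", "pay", "review"],
      perspectives: [
        { onRole: "Landlord", properties: ["houseAddress", "price", "availability"] },
      ],
    },
    {
      name: "Landlord",
      actions: ["confirmBooking", "handOverKeys"],
      perspectives: [
        { onRole: "Tenant", properties: ["name", "arrivalDate"] },
      ],
    },
  ],
};
```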

Moreover, such an analysis is perfectly clear as to who should have access to what data. So much so, in fact, that software can be generated automatically from it. Just take that from me, in a leap of faith, for the sake of the argument! Just assume that there exists an infrastructure that ensures that everybody taking up a role as modelled is informed correctly, automatically. “Correctly” meaning: just what is needed to act, nothing more. It turns out that such software should ideally run completely distributed, which adds to confidentiality.
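Under that leap of faith, ‘who sees what’ is not a separate access-control layer bolted on afterwards; it is a mechanical reading of the model. A small sketch, continuing the hypothetical types above:

```typescript
// Derive, purely from the model, what a given role may be shown.
// Anything not listed in a perspective is simply never delivered to that role;
// showing more would require changing the model itself.
function visibleTo(context: Context, roleName: RoleName): Perspective[] {
  const role = context.roles.find((r) => r.name === roleName);
  return role ? role.perspectives : [];
}

// The Landlord sees the Tenant's name and arrival date, and nothing else.
console.log(visibleTo(rental, "Landlord"));
```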

But, far more importantly, ‘privacy’ would be at the heart of the design. It would not be something added as an afterthought, not something that one could ‘forget’ while building an IT system.

This is not how the IT industry operates. It is shaped by the ideology of information as a physical carrier of meaning. This ideology promotes the thought (falsely!) that having data means having gold. Thus, IT starts with modelling ‘the information’. Only then come questions like who should own the gold.

Could privacy concerns arise in a system based on our hypothetical context-role analysis, the concern being that someone would be informed about other people’s actions when they were not meant to be? No, because that would require us to model it (remember, you promised to believe that everybody in a role is informed correctly)!

So, under such an approach to IT, a privacy leak would have to be designed. How about that for a change!

So that is why Big Tech loves privacy: it is a red herring. What really needs to change is the technology, the method, the way we think about systems that support co-operation and how to build them. Privacy is a frame that renders us helpless and makes us accept the technical status quo. We need a Copernican revolution that, this time, puts man back at the centre, to make sense of things.

(1). I wrote this column in November 2018 but never got round to publishing it, so this reference to Facebook goes back to that time.

This is the eleventh column in a series. The previous one was: To Cut Nature at its Joints. Here is the series introduction.
