‘Jargonerics’ & the future of privacy: Interview with Rowenna Fielding (Part two)

Image by Third Sector www.thirdsector.co.uk

Rowenna Fielding is a self-confessed privacy nerd and Information Governance (IG) geek who began her IG career by coming out of the server room and taking an interest in information security, before broadening her horizons to include the other data protection principles as well.

In part one of this series, we caught up with Rowenna to chat about her journey into the privacy space, the data dilemma and the physical self. In part two we’ll chat about jargonerics (her term for the jargon-filled, generic language of most privacy policies!), the state of data privacy and the future of the space…


Andy Baker: I came across a post where you mentioned a word that was new to me: ‘jargonerics’. Could you explain a bit more about the concept of jargonerics…

Rowenna Fielding: ‘Jargonerics’ is a word I made up to refer to the legalistic waffle in which most privacy policies and information are delivered, consisting of impressive-sounding phrases which convey no useful information to the data subject at all but keep auditors happy. The word is a combination of ‘jargon’ — ie, difficult-to-interpret technical language — and ‘generic’ — the vague and uninformative content which leaves individuals none the wiser about what is happening to their data. There’s a lot of jargonerics out there still.

It comes as a surprise to many people that you won’t find the phrase “privacy policy” anywhere in the GDPR

The main part of the problem is the assumption that to deliver information about privacy you have to have a ‘privacy policy’; that is, a dense wall of text somewhere on your website. That’s the American convention from US e-commerce law, and it has often been confused with the transparency requirements of European data protection law. It comes as a surprise to many people that you won’t find the phrase “privacy policy” anywhere in the GDPR.

What data protection law does say is that somehow you have to convey particular information about what you’re doing with personal data, why that’s justified and lawful, and how you’re doing it, in a way that is understandable to the reader.

How you achieve that end is up to you. Unfortunately, everyone copies everybody else because it’s easier and then most people are scared to do anything different. Therefore we’ve been lumbered with this terrible convention of putting up a screenful of impenetrable lawyer-speak and leaving it at that. It’s demonstrably ineffective for informing people about the uses of their data and their rights.

AB: Do you think that’s something to do with protection? Are companies putting a wall of legal text to protect themselves rather than focusing on the user?

RF: I think there are two contributing factors, which appear in varying proportions depending on the company: either they’re not sufficiently committed to data protection to convey the information properly, so they’ve done the minimum they think they have to do, or they’ve actively decided to hide what they’re doing with data because they know people will be outraged.

The convention of copying everyone else’s (bad) approach is probably rooted in risk-aversion: everyone else does it this way, so it must be safer. However, doing so demonstrates that data protection is not well understood or well managed at that organisation, and so actually increases the risk of getting into trouble.

Data protection practitioners often get asked to supply templates for privacy info, and this is a source of frustration because the content relevant to one organisation will be completely inapplicable to another. A template of how to arrange the information may go some way to making things easier (although tailoring it to your brand and tone will be much more effective), but the content is always going to be unique to the Data Controller.

There are some good examples of how to convey privacy information popping up here and there. The Information Commissioner’s Office have adopted a layered approach with videos and infographics for particular data journeys, then more detailed text which covers the legal specifics, for those with a pedantic disposition (like me!).

The basic purpose of providing privacy info is to ensure that data subjects are never surprised by what is being done with their data

Easyjet is another example of a company using video to demonstrate what they do with your data. There’s also an online firm called Juro, who have the most beautiful privacy notice interface I have seen yet. The content isn’t 100% — it could do with more specificity here and there — but the visual design and how the end user interacts to find the information is amazing. I think there’s a lot of design and UX principles that people could apply but are too scared to be innovative with.

The basic purpose of providing privacy info is to ensure that data subjects are never surprised by what is being done with their data — there’s a lot of scope for being imaginative and entertaining in getting that message across.

AB: Have you started to see a rise in subject access requests?

RF: There’s definitely been a rise in the number of subject access requests I’ve seen, and I’ve certainly made more of my own since the GDPR came into force (now that it’s free!). Despite this, the responses I’ve received have usually been pretty substandard. So I wouldn’t say there’s an improvement in upholding people’s rights just yet. But there is greater awareness.

Subject access isn’t the only right people are more aware of. There’s also increased awareness of the right to erasure; however, a lot of the erasure requests I have seen sent in to organisations are based on the assumption that anyone can demand deletion of all of their data at any time. This is largely down to inaccurate media reporting around this particular right. It’s far more limited than the media coverage and rumour suggest — there are many circumstances in which the right to erasure simply doesn’t apply.

If organisations do not say up-front where and how the right to erasure applies, it leads to a lot of confusion and annoyance when people later try to get their data erased and find that it’s not actually a viable option. However, the rise in people recognising and exercising their rights is a very positive thing. I’m all for that.

Some of the other rights are being heralded as ‘new’ but are merely strengthened in the GDPR from their previous existence in earlier law. The right to object to direct marketing, the right to challenge automated decision-making — these are now getting more prominence but unless an organisation’s privacy information clearly identifies where these uses of data are taking place, it is still difficult for people to know when and how to exercise them. This is why I believe the right to be informed and the principle of transparency are the foundation of all of the other rights. And that comes down to how privacy information is conveyed.

AB: What do you see as the future of privacy? Where do you think we’re heading…

RF: I think it’s worth looking at historical models for regulation. For example, when cars were first introduced, only very few people could afford them and because of this there were no road laws. A lot of people were injured or died as a result of the free-for-all on the roads, and when regulation was finally brought in it wasn’t particularly well enforced for many years, until a critical mass of public opinion arose to demand stricter standards. Now we’ve got to a point where it’s culturally expected that the majority of people will behave considerately and safely according to an established set of rules when driving (although that’s clearly still not a 100% foolproof approach).

People are still being tracked, profiled and manipulated without their knowledge, and that’s the ‘unfair’ part

Another parallel is the introduction of electricity. When electricity first came to the UK there were no standards or safety checks. There were many independent suppliers of electrical power, and they were more concerned with shutting each other out and taking each other’s business than with ensuring that end users were safe. As a result, houses burned and people died until the electricity supply eventually became standardised and regulated.

I think with the rise of huge amounts of data availability and degree of automation in processing it, we’ll probably see the same model emerge. We’re just starting the period where regulation is seen to be needed and it has been put in place. People are getting hurt — not necessarily physically, but their rights are not being upheld and this is leading to serious consequences for those individuals and for society as a whole. It’s going to be a fairly long road for that regulation to actually start to turn into cultural acceptance and if the regulation isn’t enforced robustly or consistently then that journey will be even longer.

AB: So in many ways you’re saying that data could in the future be seen as public infrastructure, such as roads?

RF: In some ways, yes. I think that uses of personal data can have such a profound impact on individuals that it cannot just be left up to the free market to decide everything — and the evolution of data protection law would seem to support that viewpoint. However, that doesn’t necessarily mean there has to be a whole new set of laws every time technology develops — as long as the basic principles of human rights and freedoms are kept firmly in mind, the details of how things are done technologically are less significant. If we focus on technology then we run the risk of playing whack-a-mole forever.

For example, the original ePrivacy Directive refers to tracking online users by altering their equipment (ie, placing cookies or Local Shared Objects on their devices) — this has led to an increase in device, browser or behavioural fingerprinting as companies seek to evade the spirit of the law by paying attention only to the letter. The end result is that people are still being tracked, profiled and manipulated without their knowledge, and that’s the ‘unfair’ part.

There’s always going to be a tension between the protection of the individual and the advancement of society, and it’s never going to be a static equilibrium between them. Setting that esoterica aside, what organisations need to do in the meantime is simple: say what you do with data, then do what you say.

A huge thank you to Rowenna for chatting to us. You can follow Rowenna on Twitter or check out her website for the latest musings on privacy.


We’re in beta!

Tap makes it simple for individuals to see what personal data organisations hold about them, and then act on it. Join our beta.

*We don’t collect your personal data.
