There’s No Such Thing as Anonymous Data. But Here’s How They’re Playing Us with It.

Don’t have time to read? Listen to the PrivateID Podcast

There’s no doubt that technology has helped us advance at a fast pace and has elevated us as a species. I love tech. People usually say that good things always come with bad things. But is that true? Is it possible to achieve great things without any downside?


Great achievements are on their way, especially in healthcare and security. Achievements that require a lot of data in order to work. So the question we should ask ourselves is: should we provide anonymous data so we can get these great achievements? And if we do, can whoever gathers this data use it for other purposes?

When I wrote Three Laws of Privacy, I received an email from a friend of mine making an important point:

“I’ve just read your most recent article about the three laws of privacy and I had a question. How does this apply to anonymized data? Would you expect the same guidelines to be followed?
“For example, when navigating using Google maps, you get traffic updates from other Google users in that section of traffic. Everyone’s data is collected and used to create the most efficient traffic routes. Currently, I have no idea if that data is collected anonymously or not (seriously doubt it), or whether it’s stored long term or not. Have you had any thoughts about this? Would anonymous data collection be exempt from the three laws?”

We shouldn’t overlook this. While our System 1 — the fast, intuitive and emotional type of thinking — tells us that using anonymous data is obviously fine, there’s a deeper truth buried here. And to unveil that truth we have to go through several layers. But first things first: there’s no such thing as anonymous data.

There’s no such thing as anonymous data

Let’s get into the nitty-gritty technical details. I can’t give you a thorough explanation myself, so I asked a friend of mine. Here’s what he had to say about anonymous data:

“Any request to an external system is going to register the content of the message and traces (aka metadata: time of the request, time zone, IP address, language, browser, you go down the list). Whether the system you’re reaching holds that information or not, it’s out of your control. You don’t know. You can’t know. And you’ll never know. The only thing you have is their “word”. In the digital society there’s no such thing as anonymous data.”
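
To make his point concrete, here’s a minimal sketch in Python (purely illustrative; any web framework or raw server would behave the same) of what the other end of a connection can record about a single request, before the “content” of your message is even read:

    # A minimal sketch of what any server can log about an incoming request.
    # Whether this record is kept for an hour or forever is entirely their call.
    from datetime import datetime, timezone
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class MetadataLogger(BaseHTTPRequestHandler):
        def do_GET(self):
            record = {
                "timestamp": datetime.now(timezone.utc).isoformat(),  # time of the request
                "ip_address": self.client_address[0],                 # who is asking, and roughly from where
                "path": self.path,                                    # what they asked for
                "user_agent": self.headers.get("User-Agent"),         # browser, OS, device hints
                "language": self.headers.get("Accept-Language"),      # locale, often region
                "referer": self.headers.get("Referer"),               # where the visitor came from
            }
            print(record)  # or write it to a database you will never see
            self.send_response(200)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8000), MetadataLogger).serve_forever()

None of these fields come from the message you chose to send; they arrive with every request. “Anonymous”, in other words, is a promise about someone else’s storage policy, not a property of the data.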

It’s not about anonymous data; it’s all about the communication process. It’s about trust. And the way to earn trust, and to get people to buy whatever you sell, is to communicate your message properly.

A quarter-inch hole

Theodore Levitt, the author of the famous 1960 Harvard Business Review paper Marketing Myopia, is credited with the line:

“People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.”

Over the years this has become a marketing mantra. In the end, selling the benefit is way more effective than selling the actual product. It’s a given.


This is exactly what these tech companies do. They sell you the benefits, the hole, but they don’t talk about the drill, which in this case comes with a lot of dangers. And if they do talk about it, they sell it to you as something harmless: “anonymous” data.

Let’s forget for a second that anonymous data doesn’t exist. Let’s look at an example of why sharing your data, anonymously, might be the best thing you can do.

One of the biggest challenges we’re going to face with privacy is the boost the healthcare industry can get from open data.

Imagine this. Patients with cancer, before they know they have it, go to the doctor with the feeling that something is wrong with their bodies. So doctors run tests on them until they know for sure it’s cancer. By that time, it’s quite late in the process. Treatment is going to be problematic, painful and expensive. And if you’re lucky, you survive it.

Now, imagine there’s an algorithm that monitors your health 24/7. An algorithm that, thanks to millions of data points from other patients, is able to recognize cancer patterns and tell you precisely when a cancer is forming, at the very earliest stages. So you don’t have to wait and rely on your feelings.

If there’s something wrong with your body, you want the algorithm to find out, through biometric sensors that detect the smallest anomaly in your cells while it’s still easy, cheap and painless to fix.
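
To see why a system like this is so hungry for data, here’s a toy sketch (made-up numbers, a hypothetical biomarker, nothing resembling a real medical model) of the simplest possible version: learn a baseline from many patients, then flag readings that stray too far from it:

    # A toy anomaly detector, purely illustrative. The baseline only becomes
    # trustworthy as more (supposedly anonymous) patients contribute readings.
    import statistics

    def build_baseline(population_readings):
        """Learn the mean and spread of a biomarker across many patients."""
        mean = statistics.fmean(population_readings)
        stdev = statistics.stdev(population_readings)
        return mean, stdev

    def is_anomalous(reading, mean, stdev, threshold=3.0):
        """Flag a reading more than `threshold` standard deviations from the mean."""
        return abs(reading - mean) / stdev > threshold

    population = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.1, 4.9, 5.0]  # invented values
    mean, stdev = build_baseline(population)
    print(is_anomalous(5.05, mean, stdev))  # False: within the normal range
    print(is_anomalous(9.80, mean, stdev))  # True: flagged for a closer look

A real diagnostic model would be vastly more sophisticated, but the dependency is the same: the more people feed it, the sharper it gets.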

Would you sign up for this? I would. Who wouldn’t?!

If this algorithm needs tons of data from a vast variety of (anonymous) patients, so be it.

If this is the benefit you’d get, and they didn’t tell you anything about the data being collected, I bet you’d still sign up. No one wants to know about that. No one wants to buy a quarter-inch drill; what people want to buy is the hole. In this case, the sense of security about your own health.

At this point, unless you’re a radical defender of privacy, you automatically think: “Okay, this is worth it — we shouldn’t let our desire for privacy get in the way of our own health”.

And this is exactly where I’d own you with this argument. Because I’ve appealed to your most basic common-sense reactions with a shallow argument (for a worthwhile benefit — don’t get me wrong), and right there I’d have an excuse to use your data for other purposes. You’d think I’m doing this for you, but I’d be doing it with another purpose in mind. I wouldn’t tell you at all, and still, you’d be okay with it.

It’s for your own safety

This is just one example, from healthcare, but notice that for a long time you’ve been able to make the same argument by simply swapping the word “healthcare” for “security”.

This is about Maslow’s hierarchy of needs: you can take the basic need for safety and appeal to people with a shallow argument that covers it. If you think about it, it makes total sense. Let’s look at Maslow’s pyramid for a second:

In 1943, Abraham Maslow published a paper in Psychological Review called “A Theory of Human Motivation”, where he proposed what became known as Maslow’s hierarchy of needs: a system that describes the stages human motivation goes through. These needs are (from most to least basic):

  1. Physiological
  2. Safety
  3. Belonging and love
  4. Esteem
  5. Self-actualization

This means that each level requires the prior one to be “secured”. For example, you can’t attend to belonging needs until you fulfill your physiological and safety needs.

Physiological needs are the ones you literally can’t live without. They are what makes your body work and reproduce. Once you have your physiological needs covered, you can focus on your safety.

If you think about it, politicians lean heavily on safety arguments. Sometimes that’s exactly what they should do, but more often than not they do it to grab your attention by talking to your reptilian brain.

Or consider insurance companies. They appeal to these basic needs so you can feel secure. That’s what they sell. And this is exactly what we’re going to hear when they ask for our “anonymous” data. They’ll sell us the quarter-inch hole: safety.

Don’t trust me yet?

Okay, here’s how they’re going to sell it to you — let me put on the marketing hat for a moment. Pick any car manufacturer. This is what they’re already saying:

“Share your data with us: we’ll use it to make the roads safer. Cars will communicate with one another, so you will feel safe.”

But here’s what’s really up: the car industry is in decline. They’re not going to sell as many cars in the future as they have in the past. So car manufacturers will try new businesses, like selling electric scooters. Above all, though, they’re getting into the data business. That’s how they’re going to make money. And as always happens, advertisers will enter the scene and show you pop-ups, pop-unders and every other annoying kind of ad.

Most car companies are doing this. Ford is doing it, and publicly said on the Freakonomics podcast that they know a lot about Ford owners:

“The issue in the vehicle, see, is: we already know and have data on our customers. By the way, we protect this securely; they trust us. We know what people make. How do we know that? It’s because they borrow money from us. And when you ask somebody what they make, we know where they work; we know if they’re married. We know how long they’ve lived in their house, because these are all on the credit applications. We’ve never ever been challenged on how we use that. And that’s the leverage we’ve got here with the data.”

The upside is clear, and we should seek it. More than a million people die on the roads every year, and I believe autonomous cars will be able to improve that number dramatically. However, there’s a caveat here. We could have both realities at the same time: safer roads and better privacy. But we know how this is going to end up. Car manufacturers are in big trouble, so they’d better find a way to replace the lost sales with new income — and data is their best candidate.

This is just one industry, but this thinking is spreading faster than we think. The biggest one? Healthcare.

Do people care about privacy? Should they?

Let’s go back to the healthcare example from the beginning of this article. Honestly, almost everybody would sign up for something like that.


But as Yuval Noah Harari pointed out in a recent interview, here we’d be dealing with a big temptation that comes with a “long tail of dangers”:

“One of the biggest battles of the 21st century is likely to be between privacy and health. And I guess that health is going to win.
“Most people will be willing to give up a very significant amount of privacy in exchange for far better health care. Now, we do need to try and enjoy both worlds, to create a system that gives us very good health care without compromising our privacy. Yes, you can use the data to tell me that there is a problem and what we should do in order to solve it, but I don’t want this data to be used for other purposes without my knowing it.
“Whether we can reach such a balance and have our cake and eat it too, that’s a big political question.”

A couple of months ago Joe Rogan interviewed Elon Musk. They talked about a lot of things, but here’s what Musk said about privacy:

“I think there’s not that much that’s kept private that’s actually relevant, that other people actually care about. We think other people care about it, but they don’t really. And certainly governments don’t.
“National spy agencies do not give a rat’s ass what porn you watch — they do not care.”

Leaving the porn thing aside for a second, it’s true that most people don’t care about privacy. But that doesn’t mean we shouldn’t have the right to privacy. In the same way, most people don’t care about politics, but that doesn’t mean they don’t have the right to put trustworthy people in power.


I really like Elon Musk, and I believe in the change he’s trying to make. But I don’t agree with his views on privacy.

Here we enter an interesting debate. The individual’s privacy is a debate worth having on its own, but that’s not the only place the conversation should go, because we’d be missing the point: the lack of privacy at a mass scale.

“Privacy is the right to a free mind. Without privacy, you can’t have anything for yourself.” — Edward Snowden

Again and again, we focus our attention on the wrong arguments. Whether we talk about anonymous data or not, we’re so easily influenced that we agree with whatever they say. We blindly ignore the whole long tail of dangers.

This isn’t about our whim to keep our privacy (a legitimate whim). It’s about defending our right to think for ourselves.

Let’s not forget that Cambridge Analytica didn’t target particular individuals. They used millions and millions of profiles (which could have been completely anonymous, and the results would have been the same) and built the ultimate mass-manipulation algorithm.

Why don’t we talk about that?

Do you have something to hide?

And here we go again, with the most widespread message against privacy:

“The only people who don’t want to disclose the truth are people with something to hide” — Barack Obama

Obama said that. And people believed him.

It’s all about trust. And trusting the wrong source is dangerous.

Who should we trust?

I get it: we need to trust some entities to do the right thing. But that trust has to be earned over time. How can we come to trust companies when they appeal to our basic needs just to get our approval?

How can we trust these companies when we’ve got no clear idea of how that data is used? Yes, we see one part, the part that gives us some benefits, but we don’t see the downside.

Should we trust tech companies to collect our data “anonymously”?

Where’s the line?

These questions open an interesting debate, and there’s a dichotomy here: on one side, the individual’s identity and his or her right to privacy; on the other, the collective good.

Privacy isn’t just about covering one’s identity. That part actually matters more to us than to them. What they care about is having control and power over our collective version.

For example, Facebook cares about getting specific about each one of us, but there’s also a different sense in which it cares about our collective knowledge. This can lead to several scenarios, but with all this knowledge there’s nothing stopping them from replicating us in a virtual world as task rabbits, using our “anonymous” data to improve their services — services that learn from us and take over our jobs. And people are just okay with that.

Do you see where I’m going with this?

It all comes down to trust. They know this; that’s why they choose so carefully the way they communicate. So we should be very careful about who we decide to trust, because the stakes are high.

There’s a way to get the benefits without the long tail of dangers, without turning us into cash cows for them. Don’t ever forget this, because they’ll try to persuade you to believe otherwise.