Why you don’t care about internet privacy. (And why you need to.)

After Mark Zuckerberg testified before Congress, my inbox was flooded with emails from tech giants about “updates to their privacy policy”. They were going to make data controls “easier to understand”. There was a lot of talk about “trust” and “community” and tons of fluffy promises to “do better”.

It was depressing. It’s not that I think Google or Facebook or any of these companies are evil, but it’s hard to believe that anything will really change.

Because after seven years of working with users in tech, I know the vast majority of people don’t fully understand internet privacy. What’s more, they don’t care or have given up, accepting daily data breaches as an inevitable “cost of doing business” on the internet.

Caveat: the vast majority of users still know a lot more than 98% of sitting Senators…

I don’t blame them. The internet breaks our outdated, analog definition of privacy, and the scale of the problem is so vast that it’s hard for humans to process.

1. The concept of privacy has changed

Thirty years ago, privacy was pretty straightforward. If you were alone in your room, you could reasonably expect that no one was watching you — whatever you did in that space would be “private”.

If someone was there, following you around and jotting down whatever you said or did, it would be obvious that you weren’t in “private” anymore. You would filter your words and behave differently.

Today, it’s a lot less simple. It’s not just one person in a room with you; it’s a whole team of robots. They have data collection superpowers; they’re invisible, have perfect memories, can write at light speed, and their notepads are indestructible.

This robot army is almost always there, but they don’t always make people change their behavior. For most, the idea of “privacy” still means being physically alone in a room. They haven’t accounted for the invisible robot army because sites and apps don’t want you to — they know that being watched won’t make you feel good.

2. Putting the notebooks together

“But,” you might say, “I don’t care if Google knows a bunch of random factoids about me. So what if I ‘like’ Taylor Swift and watched ‘A Christmas Prince’ last night?”

This way of thinking about privacy is misleading; it trivializes the scale of the problem. It’s not about stand-alone fragments of information. It’s a holistic, contextual profile of you.

Let’s go back to the ’80s. If you went shopping, the store’s knowledge of you was limited to what the clerk remembered and how long they kept their security footage. They might have known your favorite cereal brand or how often you shopped there, but that was about it. This data was relatively harmless: it had an expiration date (physical human memory) and very little context (the store had no information on your family, your friends, or even your shopping habits outside of that particular store).

Fast forward to today. Google knows you’ve looked up directions to the store, so it knows where you’re going before you go. It’s tracked you en route via GPS. If you’ve stopped, it knows where and for how long. If your store account is linked to your Google account, it knows what you’ve bought. Not only that, it knows all the other stores you’ve shopped at and what you bought there. It has your credit card information. It knows what your interests are. It knows who you’re friends with. It reads your emails.

It’s done this every day for as long as you can remember, and it’ll have this information forever. Even if its servers were obliterated by a rogue asteroid tomorrow, you’d have no way of knowing if, or how often, that information has been copied to or stolen by other systems.

3. …But what does it all mean?

This is an overwhelming amount of information, more than humans have ever had to manage before. There’s so much stuff that it’s easy to give up, wring our hands, and just not deal with it. It doesn’t help that most of this data genuinely isn’t interesting — no one cares about your grocery store receipts, what you listened to on Spotify yesterday, or what you ate on Tuesday. If this information were made public, it might be creepy but not necessarily harmful.

But mixed in with all the noise is sensitive data: information that you don’t want made public, for whatever reason. This falls into two tiers:

  • The Obvious: This includes passwords, credit card numbers, and all other personal identifiers. It can also include direct actions — maybe you’re married with an active account on Tinder, or maybe you’ve sent an email to your doctor about a medical condition.
  • The Not-So-Obvious: This is the secondhand information that can be inferred from what you do online, and it’s where all your seemingly trivial data comes in handy. A married man ordering flowers for a woman who isn’t his wife. A patient’s browsing history on WebMD. There might not be a smoking gun, but you can make a pretty good guess at what’s going on if you have enough data points.

Most people make at least some effort to protect that first tranche of obviously sensitive information, but the Not-So-Obvious data usually escapes our attention. It’s easier to just ignore it than to manage and analyze everything you’re putting out there all the time.

But it’s worth remembering that this information matters. In the wrong hands, data can do a lot of damage. “Wrong hands” have destroyed credit scores and put people in debt. They’ve targeted vulnerable patients in recovery with predatory scams. They’ve even manipulated public opinion and interfered with an election.

4. The issue of security

“But,” you say, “Google and Facebook are state of the art tech companies! Surely, my data is safe with them…”

Corporations have proven that they don’t understand what it means to secure your data. Equifax compromised 143 million accounts — almost half of America — in its data breach. Sony was hacked by a simple phishing email. There’s an entire Wikipedia article dedicated to all the times Yahoo alone has been hijacked.

To make matters even more complicated, there’s the issue of sharing data with third parties. This is what landed Facebook in hot water, but it’s an industry-wide risk. For example, Google might have state-of-the-art security, but if you log in through Google and share your Google data with that hot new delivery app you just downloaded, your information is no longer protected by Google. It’s only as secure as the delivery app’s system.

Think about it this way: it doesn’t matter if you have a high-tech home security system if you leave your front door unlocked.

Theoretically, your data should be stored in discrete sets, with explicit permissions for each set. That was supposed to be the case with Facebook: the 270,000 users who gave Cambridge Analytica their data should only have authorized the release of their own Facebook information. In reality, a loophole in the system also gave Cambridge Analytica the data of their friends, a total of 87 million accounts.

5. Why is privacy really important?

There are a lot of people who want to invade your privacy. Your ISP wants to sell your web history. Facebook and Google want to run more effective ads. The government wants Apple to build an encryption “backdoor” into all iOS devices.

These guys have all done a great job of making privacy feel like a cover for suspect behavior. Why would you be so up-in-arms about privacy, after all, unless you were doing something “wrong”?

(For the sake of time, I’m not going to address the fact that “wrong” is subjective, depending on both cultural norms and your personal value system. For example: the government has deemed free speech “wrong” in Turkey, and in some cultures it isn’t considered “wrong” to set rape victims on fire for the sake of familial honor. But I digress.)

Privacy is something that you need in order to explore, gain insight, and make big decisions about your life. You might be a teenager trying to understand your sexuality. You might be contemplating a career change. You might be grieving. You might be reading up on ideologies that are different from what you were raised with — maybe Christianity if you live in Iran, or Islam if you’re in the Bible Belt.

The point is, if you don’t have the privacy to do this kind of self-exploration, you aren’t really free. Our perception of whether or not we are in private modifies the way we behave. Without it, at best we lose the ability to explore and be our authentic selves. At worst we face retaliation and persecution.

That’s the crux of why privacy is so important, and why we need to do whatever we can to protect it.

The good news is that there are a lot of things we can do, both as individuals and as a society, to start fixing this mess. I’ll go into that another time, but it starts with really, really caring about privacy: enough to change outdated habits and get legislators to pay attention. We need to collectively update our definition of privacy, recognize the value and power of our data, and start fighting back.

The most ironic part about the Facebook scandal is that Mark Zuckerberg knows this better than any of us. This is a man who tapes over his webcam, spent $30 million buying and razing all the other homes on his block, and uses a private disposal company for his trash.

He’s probably not doing anything particularly interesting day-to-day. He just wants his privacy.


Christine Miao is a UX and product designer based in NYC. She used to work at McKinsey, but quit to do a start-up (her Mom still thinks she made a huge mistake). She now designs and builds things via Clowder Labs, attempts to write, and is most recently working on Emerson Journal, a daily journaling app designed to help you write every day, fight FOMO, and feel better.