7 Things I Teach My Kids About Privacy
I have studied the data landscape for the last fifteen years. Large segments of our lives that were once private are now recorded, stored, and searchable. Where we drive (are we driving to the doctor? The therapist? The marriage counselor?) is just one example. The bedrock of our lives is now recorded and exposed by the sensors around us, and the data is used to determine the best way to sell us commercial and political ideas.
I worry about that. As a parent, I am trying to figure out how to talk to my kids about living with what I see coming down the track. Here are a few pieces of advice I have given my kids over the years to keep them safe from the ever-growing threat to their privacy.
1. You don’t control the device.
Smart devices answer to their manufacturers and the people who provide the operating system software (and to a large extent, the apps), not to the user. The user is just a user. It is nearly impossible to take control of the devices we use — you would have to jailbreak or hack the device and replace its operating system. Smaller, intermediate hacks can give you fuller control over the device, including the ability to delete some apps that systematically spy on you, but the next over-the-air update will take that control away again. Because you don’t control the device, the data collected by the device is not under your control either. Whether or not you hit your Move ring on your Apple Watch today is a fact that belongs to Apple, not to you. Apple just lets you see it.
Knowing that you don’t control your device means that the best plan is to only expose content you are comfortable with to your smart device. That really sucks, given that these devices are pipelines to life and reality. And with sensors proliferating, more of your life can be sensed, tracked, and sold by big data companies. Unfortunately, this is our reality. The device leaks your data. You don’t control it.
2. Give false data.
Because you don’t control your device, you can’t stop it from leaking the information it gathers with an ever-increasing number of sensors. What you can do is provide false information where possible, to make the algorithms one step less sure. Neural nets rely on extracting many features (your age, zip code, wage range, distance from work, time spent clicking through social media posts, and so on) to make determinations about profiles. The big data system works by finding connections between things in people’s profiles and the political views they express or the things they buy: people with this kind of profile are highly likely to do that (buy a certain way or vote a certain way) in response to a commercial or political appeal. If the information in the profile is fake, it is harder for companies to make these connections.
It helps to lie, to fuzz the profile, to make automated systems one step less likely to pinpoint what time of day we are likely to pay most for a product or service, one step less likely to be able to present false information that is politically persuasive.
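For the technically curious, fuzzing a profile can be sketched in a few lines of Python. Everything here is illustrative: the field names, the value ranges, and the `fuzz_profile` helper are my own assumptions for the sake of the example, not any real ad platform’s schema.

```python
import random

# Hypothetical profile fields an ad-targeting model might extract.
# Field names and value ranges are illustrative assumptions only.
PROFILE_FIELDS = {
    "age": list(range(18, 80)),
    "zip_code": ["10001", "60601", "73301", "94103"],
    "wage_range": ["<30k", "30-60k", "60-100k", ">100k"],
}

def fuzz_profile(profile):
    """Return a copy of the profile with each field randomly replaced.

    The goal is not anonymity but noise: every fuzzed field makes a
    correlation-based model one step less sure about you.
    """
    fuzzed = dict(profile)
    for field, choices in PROFILE_FIELDS.items():
        fuzzed[field] = random.choice(choices)
    return fuzzed

real = {"age": 44, "zip_code": "10001", "wage_range": "60-100k"}
print(fuzz_profile(real))  # a plausible-looking but randomized profile
```

The point of the sketch: a model trained to correlate profile fields with behavior gets weaker signals when some fraction of those fields is noise.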
3. Protect your connection.
Use a VPN that does not log connections, has made public representations to that effect, and is located in a jurisdiction in which those promises are enforceable as a matter of law. Check to make sure there are no data scandals around the VPN, read reviews, and look at discussion boards to make sure the VPN has a good reputation.
A VPN was traditionally used to keep a user’s IP address private. That matters less now, given browser fingerprinting and timing techniques that make private browsing very nearly impossible. It still helps (every bit does), but another major point of running your connection through a VPN is that your traffic becomes opaque to whoever is providing your internet connection. Whether at school or at work, an internet connection offered by someone else is a security and privacy risk.
4. Use encryption and an Adblocker.
There are some techniques that help. I teach my kids to use two techniques in particular, one that works technologically and one that works socially. The technological move is to encrypt the device and connection whenever possible. Encryption works. Sure, encryption standards fail or are compromised, but the technological ecosystem recovers pretty quickly from those attacks. Encryption is like a lock on your device. Yes, locks can be picked, but we all prefer to sleep at night with a lock on the door.
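To make the lock analogy concrete, here is a toy sketch of symmetric encryption in Python: a one-time pad built from the standard library’s `secrets` module. This illustrates the principle (without the key, the ciphertext is just noise), and is emphatically not production cryptography; real devices use vetted ciphers such as AES.

```python
import secrets

def encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    """One-time pad: XOR the message with a random key of equal length."""
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    """XOR with the same key recovers the original message."""
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, locked = encrypt(b"meet me at the ice cream shop")
assert decrypt(key, locked) == b"meet me at the ice cream shop"
```

Anyone who intercepts `locked` without `key` sees only random bytes — which is the whole point of encrypting both the device and the connection.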
The social technique that works is the adblocker. Companies traffic in our data without meaningful choice or consent. We do not really consent to the collection and resale of vast amounts of our data just by using electronic devices, even if the legal system tries to construe our continued use as an agreement to be surveilled. We are not allowed to use the internet without our data being taken and sold for profit. (If you think you can, see number 5, below.) Since our desire to avoid surveillance isn’t respected, our best option is to make sure companies don’t get to profit from it — to stop them from profiting through advertisements. Until we have the legal right to stop the surveillance, we must exercise the practical right to make the surveillance unprofitable.
5. Realize it doesn’t work.
It doesn’t work. There are things we can do: use a VPN, clear cookies, run Firefox with privacy extensions, encrypt our phones, ensure secure connections, maybe even hack and replace the operating system on the phone. But our current information-device landscape has been so constructed that our data is constantly grabbed and used anyway. Despite every precaution, ISPs, app creators, mobile operating system developers, and online advertisers log our metadata, sell it to other companies, provide it to the government without the requirement of a warrant, or simply sell it to the government, as was the case with mobile tracking data recently purchased to trace the movement of consumers in response to Covid-19.
There is value to the exercise of trying to protect our privacy. It exposes the lie that our complete lack of privacy is our fault. As the argument goes, if we just would simply read the incomprehensible contracts that governed our use of devices, data collection, and internet-enabled technology, we would have the choice, the ability to protect our data. In fact, we do not actually have that power, and it is not our fault that our data is taken from us despite every possible attempt to protect it.
6. Do it anyway.
The whole exercise does feel futile. Google, Facebook, Apple, Amazon, and every other tech company are experts at getting your data. They have built empires by strip-mining people for information, with devices as the mining tools. We bleed information. But the fact that the stream of data is ever-flowing does not mean we shouldn’t try to tend the wound. We can’t stop all the data flow, but we have to try.
We have to try, at the very least, because the companies do not deserve our data, and we do deserve our privacy. We need to keep data out of their hands as much as possible for the same reason that we take stolen money away from bank robbers — they don’t deserve it. Our data has been taken so that companies can turn us into profit centers and political tools. We deserve to be able to use products without being watched. We deserve to know and control what happens to our data. Until a responsible legal solution becomes politically feasible, which seems a distant possibility, our best defense is to add chaos into the system and take measures to protect ourselves any way we can. Not necessarily because it will work, but because it is what must be done.
7. Get your friends to do it too.
Because our society lacks robust privacy protections, our data is available through our friends as well. As an example, you and your friends go to get ice cream downtown. You have your location services turned off, so you think you are protected. One of your friends does not. They take a photo and post it to Facebook. The photo is tagged to that location with your face in it, and subsequent algorithms sweep photographs from social media to make further inferences.
Privacy protection must be a collective effort, for two reasons. First, privacy norms are contagious and beneficial: other people who are careful with data will also take better care of your data. Second, there is a network effect. The more of your friends who use a secure messaging platform (Signal for text messages, for example), the more valuable that platform becomes. If enough people transition, the new secure method of communication, in which ISPs can’t read your text messages, supplants the insecure one, in which they can. When we take enough of our friends to an open-source platform with verifiably effective encryption, we make large, real strides toward functional privacy.