The right to not be remembered
“Data is the new oil” is a deeply flawed metaphor. In addition to being dehumanizing, it’s flat-out misleading.
“Data” is not some dormant material that’s being excavated from us; it’s whatever other people have chosen to observe, interpret, and remember about us. Data is not who we are; it’s what falls off of us as we interact with the world.
I prefer to think about data as being like the shadows we cast as we move around. They are not us, but they are made possible by us. They’re visible for all to see, leaving a fleeting impression in physical space, but they don’t directly reveal anything about our inner workings. Our shadows change shape depending on time of day, surroundings, and our position and orientation in the environment.
When we move around in public, our data, like our shadow, is exposed for all to see. Other people, if they were observing us, could notice where we’re standing, what we’re wearing, whether we’re buying something, who we’re with, how we’re moving, etc. But the default social contract is that the people around us aren’t trying to remember these details in the first place. It’s impolite to stare and downright illegal to watch our every move. And even if others were trying to observe us, we can rest comfortably in the knowledge that their memories are full of little errors. Errors that offer us plausible deniability and reasonable doubt. We can presume, therefore, that our “shadow” data is appropriately fleeting and impermanent.
Unfortunately for our digital data, the opposite has become the status quo. Our digital selves are under constant observation. Everywhere we travel, we leave a trace, and the default setting is to be recorded, scrutinized, and gossiped about. And what’s most problematic is that 1s and 0s don’t afford us the same plausible deniability that comes with the fallibility of human memory.
Common sense privacy
People shouldn’t be forced to constantly wrestle with the ramifications of their actions when they’re online or around connected devices. Lurking questions like “who does this system think I am (when they show me this ad)?” or “what am I committing to becoming (if I watch this video)?” or “who might be able to re-identify me and steal my identity (if I sign up to this)?” are just too damn existential for a few swipes on your phone.
Furthermore, privacy controls for AI-powered systems are fundamentally insufficient if they presume we’re capable of understanding the tradeoffs that go along with each setting of a complex, dynamic, and tightly coupled system.
So let’s strive for common sense. Privacy by design should mean going beyond the “right to be forgotten” and returning to the social contract we already have in place: the right to not be remembered in the first place.
Food for thought
- How have we earned the right to be trusted by the user?
- What is the benefit to the user—what is the “value exchange”—of storing their personal data outside the device of origin?
- How might we afford users the ability to perform a “test drive” where they can evaluate direct personal benefits before being asked for consent?
- Have we given users the ability to verify that nothing is being remembered about them?
- How might the system support an “airlock” capability where personal data disappears unless explicitly authorized to transition from the device of origin?
- How might consent operate in situ instead of a priori?
- How might we make users aware whenever their personal data is in use?
- How will users be able to verify that when they delete their data, it’s actually gone?
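The “airlock” question above can be made concrete with a small sketch. The `AirlockStore` class, its method names, and the TTL behavior are all hypothetical illustrations, not an existing API: data is forgotten by default after a short time-to-live, and nothing leaves the device of origin without an explicit, per-item consent action.

```python
import time


class AirlockStore:
    """Illustrative sketch of an 'airlock' for personal data:
    ephemeral by default, exported only with explicit consent."""

    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        # key -> (value, expiry timestamp, consent flag)
        self._items = {}

    def remember(self, key, value):
        # Data enters the airlock with an expiry and NO export consent.
        self._items[key] = (value, time.monotonic() + self.ttl, False)

    def consent_to_export(self, key):
        # Only an explicit user action flips the consent flag.
        if key in self._items:
            value, expires_at, _ = self._items[key]
            self._items[key] = (value, expires_at, True)

    def export(self):
        # Only unexpired, explicitly consented items may leave the device.
        now = time.monotonic()
        return {k: v for k, (v, exp, ok) in self._items.items()
                if ok and now < exp}

    def sweep(self):
        # Expired items vanish: forgetting is the default, not a request.
        now = time.monotonic()
        self._items = {k: t for k, t in self._items.items() if t[1] > now}


store = AirlockStore(ttl_seconds=0.1)
store.remember("location", "cafe")
store.remember("purchase", "coffee")
store.consent_to_export("purchase")
print(store.export())  # only the consented item leaves: {'purchase': 'coffee'}
time.sleep(0.2)
store.sweep()
print(store.export())  # {} — everything expired and was forgotten
```

The design choice worth noticing is the direction of the default: the system has to do work to *keep* data, not to delete it, which is the inverse of most real-world storage APIs.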