The right to not be remembered

Josh Lovejoy
Jan 13, 2020 · 4 min read


Data is not the “new oil”. It’s not some dormant material that gets excavated from us; “data” is whatever other people have chosen to observe, interpret, and remember about us. Data is not who we are; it’s what falls off of us as we interact with the world.

I prefer to think about data as being like the shadows we cast as we move around:

  • Our shadow is not us, but it is made possible by us.
  • Our shadow changes shape depending on time of day, surroundings, and our position and orientation in the environment.
  • Our shadow is visible for all to see, leaving a fleeting impression in physical space, but does not directly reveal anything about our inner workings.

When we move around in public, our data, like our shadow, is exposed for all to see. But the default social contract is that the people around us aren’t trying to remember these details in the first place.

Sure, if other people make it a priority to track our behavior, they can pick up on all sorts of details and patterns: Where we are, what we’re wearing, what we’re buying, who we’re with, how we’re communicating, etc. But it’s impolite to stare and downright illegal to watch our every move.

In addition, human memory is full of little errors that afford us plausible deniability and reasonable doubt if someone accuses us of some impropriety. These memory errors are like little pieces of armor against the fear of a looming permanent record, of gossipers, and of tattle-tales.

As a result of our social contracts and imperfect memories, when we leave the house and move about the built world, we expect that our data will be appropriately fleeting and impermanent, just like our shadow.

Unfortunately for our digital data, the opposite has become the status quo…

  • Everywhere we go, we leave a trace.
  • The “default setting” is to be watched, recorded, and scrutinized.
  • 1’s and 0’s don’t afford us the plausible deniability that comes with the fallibility of analog media and human memory.

Our digital selves are now under constant observation, fueled by the incentives of surveillance capitalism. The social contract that’s helped us explore our authentic selves has been profoundly damaged. But it can be mended… or at least, we have to try.

Common sense privacy

People shouldn’t be forced to constantly wrestle with the ramifications of their actions when they’re online or around connected devices.

Lurking questions like “who does this system think I am (when they show me this ad)?” or “what am I committing to becoming (if I watch this video)?” or “who might be able to re-identify me and steal my identity (if I sign up to this)?” are just too damn existential to be handled by a few swipes on your phone (or a buried link to a jargon-filled privacy policy).

Furthermore, privacy controls for AI-powered systems are fundamentally insufficient if they presume that people are capable of understanding the tradeoffs that go along with each setting of a complex, dynamic, and tightly coupled system. It’d be like asking Amazon users to weigh each online order against the impact it might have on locally-owned neighborhood businesses. We’re simply not used to having to start by distrusting the motives of the big systems we rely on.

Inviting critical thinking

Asking people to entrust their data to us shouldn’t be taken lightly. We should expect—and even encourage—our users to operate with a healthy dose of skepticism about our intentions.

Here are some recommended questions to ask when setting out to design with privacy as part of your product’s design language:

  • How have we earned the right to be trusted by the user?
  • What is the benefit to the user—what is the “value exchange”—of storing their personal data outside the device of origin?
  • How might we afford users the ability to perform a “test drive” where they can evaluate direct personal benefits before being asked for consent?
  • Have we given users the ability to verify that nothing is being remembered about them?
  • How might the system support an “airlock” capability, where personal data disappears unless explicitly authorized to leave the device of origin? (A rough sketch follows this list.)
  • How might consent operate in situ instead of a priori?
  • How might we make users aware whenever their personal data is in use?
  • How will users be able to verify that when they delete their data, it’s actually gone?
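
To make the “airlock” question above a bit more concrete, here is a minimal sketch (in Python) of what such a pattern could look like. The names here (DataAirlock, observe, pending, release) and the time-to-live default are illustrative assumptions, not a prescription: data is captured only in memory on the device of origin, it expires on its own, and it leaves the device only with explicit, in-the-moment consent.

```python
# Hypothetical sketch of an on-device "airlock" for personal data.
# Nothing is persisted or transmitted unless the user explicitly says so.
import time
from dataclasses import dataclass, field


@dataclass
class EphemeralRecord:
    payload: dict                                    # the observed data
    created_at: float = field(default_factory=time.monotonic)


class DataAirlock:
    """Holds personal data on the device of origin until it expires or is released."""

    def __init__(self, ttl_seconds: float = 300.0):
        self.ttl = ttl_seconds                       # data evaporates after this window
        self._pending: list[EphemeralRecord] = []

    def observe(self, payload: dict) -> None:
        """Capture data locally; nothing is transmitted or persisted here."""
        self._purge_expired()
        self._pending.append(EphemeralRecord(payload))

    def pending(self) -> list[dict]:
        """Let the user inspect exactly what is waiting in the airlock."""
        self._purge_expired()
        return [record.payload for record in self._pending]

    def release(self, user_consented: bool, upload) -> int:
        """Transmit pending data only with explicit, in-the-moment consent."""
        self._purge_expired()
        if not user_consented:
            self._pending.clear()                    # no consent: the data is simply forgotten
            return 0
        released = 0
        for record in self._pending:
            upload(record.payload)                   # e.g., hand off to the product's backend
            released += 1
        self._pending.clear()
        return released

    def _purge_expired(self) -> None:
        now = time.monotonic()
        self._pending = [r for r in self._pending if now - r.created_at < self.ttl]
```

In a sketch like this, the pending() view gives people a way to verify what is (and is not) waiting to be shared, and setting the time-to-live to zero amounts to “never remember anything”, which is roughly the default this article argues for.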

Going forward

Privacy is not an end state; it’s a means by which we express control over how we’re perceived by others. Therefore, privacy by design should go beyond the “right to be forgotten” and strive to return to the social contract we already know and love: the right to not be remembered in the first place.
