Digitize Your Privacy with NFTs

What if we represented our expectations about privacy in a non-fungible token that is tangible, shareable, and monetizable?

Kevin L. Miller
Privaceum
6 min read · Oct 28, 2021

--

This first article in a new series about privacy discusses how several blockchain token standards (including one potentially new token standard that extends ERC-721) can transform the abstract concept of a “privacy preference expectation” into a tangible, shareable, and even monetizable digital proxy. I’ll also introduce key concepts and challenges in the privacy problem domain that I believe blockchain’s peer-to-peer, transparent, flexible architecture can solve.

In later articles in the series, we’ll discuss diverse topics such as:

  • trading privacy rights (or, rather, the meta-right of “having one’s privacy expectations enacted”)
  • blockchain-enabled auditing of off-chain data and related dispute mechanisms
  • the construction and governance of canonical term hierarchies and heterarchies using blockchain’s consensus infrastructure
  • new methods for decentralized governance
  • and many more cool things….

So, buckle in, because it should be an interesting ride!

Privacy is Personal, Contextual, and Dynamic

People are mobile: they move around in different environments, surrounded by different devices with sensors, any of which is capable of violating their privacy. People’s privacy expectations also differ based on where they are and the role they are playing there. For example, my expectations about privacy in my own living room are different from my expectations in a coffee shop. My expectations in a hospital when I act as a doctor are different from my expectations when I am a patient. And so on.

Privacy expectations have multiple dimensions: they include data privacy as expressed in privacy laws such as the GDPR, certainly, but also encompass the personal rights encoded in legal concepts such as intrusion on seclusion, wiretapping, battery, biometric privacy, and even common culturally-relative norms like the amount of personal space one expects in public places.

The point is that my privacy expectations, or “what I believe my privacy should be and how devices and sensors in my environment should adapt to respect it,” differ based on who I am, where I am, and the role I fill while there, and sometimes those expectations must be further adjusted to accommodate the expectations of others nearby.

Privacy Expectations as Tuples

Technological systems are the wellspring of many privacy problems. But, if technological systems can slice and dice our larger behaviors into bits and bytes for digestion by marketing engines, then it should also be possible to organize our privacy expectations so that they can be used to control how devices gather our private data.

We don’t tend to think about our privacy expectations discretely enough. We typically treat privacy as an on/off switch, and the bluntness of that on/off motif makes the problem difficult to solve.

We can make progress if we reformulate our understanding of privacy expectations so that they can be represented as privacy rules, or “privacy tuples,” that describe the constraints we expect to be placed on device predicates in a given context and role. The sum total of these privacy tuples, as they apply to us, defines the contents of our individual “privacy identity”.

Basically, privacy tuples, or “rule tuples,” consist of a predicate and its constraint setting, organized by context and role. A set of one or more rule tuples, each defined by a three-part key and a setting, forms a “privacy ruleset” that defines the behaviors expected for predicates in one or more contexts.

A Privacy “Tuple”
  • “Context” describes the current environmental constraints. This includes specific localities (e.g., my house) and functional localities (e.g., a medical facility or shopping center). Contexts can be arranged hierarchically; for example, a shopping center is a type of public place. Some examples are shown in the diagram below.
  • “Role” describes the privacy identity’s relationship to that context (e.g., whether I am acting as an employee, customer, doctor, patient, etc.)
  • “Predicate” describes the type of sensor or action; for example, video sensor, audio sensor, approach action.
  • “Constraint” describes the limitations we would like to place on the predicate to protect our privacy; for example, whether video sensors on devices in the context record my presence, or how close a “helpful robot” can get.
Context, Role, and Tuple Examples
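To make the structure concrete, here is a minimal sketch of a rule tuple and ruleset in Python. All names, context paths, and constraint values here are invented for illustration; nothing below is a Privaceum specification.

```python
from __future__ import annotations
from dataclasses import dataclass

@dataclass(frozen=True)
class RuleKey:
    """The three-part key of a rule tuple: context, role, and predicate."""
    context: str    # hypothetical, e.g. "public/shopping_center"
    role: str       # hypothetical, e.g. "customer"
    predicate: str  # hypothetical, e.g. "video_sensor"

# A "privacy ruleset": one or more rule tuples, each a key -> constraint entry.
ruleset: dict[RuleKey, str] = {
    RuleKey("public/shopping_center", "customer", "video_sensor"): "no_recording",
    RuleKey("public/shopping_center", "customer", "approach_action"): "min_distance_2m",
    RuleKey("medical_facility", "patient", "audio_sensor"): "no_recording",
}

def constraint_for(rules: dict[RuleKey, str],
                   context: str, role: str, predicate: str) -> str | None:
    """Look up the constraint a ruleset imposes for a given situation,
    or None if the ruleset says nothing about it."""
    return rules.get(RuleKey(context, role, predicate))
```

A real system would also need to resolve keys against the context hierarchy (a shopping center being a type of public place), which this flat lookup omits.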

By the way, if you’d like to explore these concepts in greater detail, have a look at Privaceum’s research articles on our website.

If we conceive of privacy expectations in this way, we can start sharing them with devices that are being asked to respect our privacy when we enter their environment. Devices are then capable of digesting our privacy expectations, processing them in light of the situation and other privacy identities’ expectations, and then selecting sensor and action states that most align with the expectations of privacy identities present at that particular time.
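As a toy sketch of that reconciliation step, a device might honor the most restrictive expectation among everyone present. The constraint vocabulary and its restrictiveness ordering below are invented for illustration; a real platform would need a canonical ordering agreed across devices.

```python
# Hypothetical restrictiveness ranking for a video-sensor predicate:
# a higher rank means a stricter constraint on the device.
RESTRICTIVENESS = {"record_freely": 0, "blur_faces": 1, "no_recording": 2}

def reconcile(expectations: list) -> str:
    """Select the sensor state that satisfies every privacy identity
    present by honoring the most restrictive expectation."""
    return max(expectations, key=lambda c: RESTRICTIVENESS[c])

# Three privacy identities are in range; the device adopts the strictest rule.
state = reconcile(["record_freely", "blur_faces", "no_recording"])
```

“Most restrictive wins” is only one possible policy; a device could also weight expectations by law, by context defaults, or by negotiation among identities.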

Tokens Aren’t Just for DeFi and Art 😮

Reduced to their behavioral essence, the fungible and non-fungible token framework maps quite well to several of the entities and modalities in a blockchain-based privacy platform for creating and sharing these privacy expectations. With some adaptations, an NFT can be used to represent unique entities such as a privacy identity and a privacy ruleset.

Why Use a Token to Represent a Privacy Ruleset?

Privacy rulesets should be reusable. The reason is that, in most cases, a user’s context (derivable from where they are) and role (derivable from what they are) suggest, or sometimes even require, certain privacy constraints. Some of these constraints are dictated by law (e.g., health privacy laws such as HIPAA), others by rationality or cultural expectation. Not everyone needs, or wants, to set up the rule tuples for these constraints themselves. Qualified entities can set up rulesets with collections of rule tuples pertaining to one or more contexts and roles. And, if designed reusably, those rulesets can be assembled into larger units, i.e., privacy identities. (We will see that a privacy identity is, in fact, an NFT that either owns or holds a “use license” to one or more privacy ruleset NFTs.)

Reusability of rule tuple collections is not just a matter of user convenience. Updating is also straightforward when new laws or norms demand that constraints for a context be modified. Reusability of this sort is also blockchain-efficient in terms of storage demands and gas fees: instead of the same rule tuple being stored multiple times, it can be stored once by the ruleset designer and “used” by multiple privacy identities. The user only pays the gas fee to associate the ruleset with their privacy identity. (As an aside, a user will be able to modify rules provided by an associated ruleset by supplying their own overrides, if desired; more on this later.)
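The override mechanism can be sketched as a simple merge, with the user’s own rule tuples taking precedence over the shared ruleset’s defaults on key collisions. The rule keys and constraint values below are hypothetical examples, not a defined vocabulary.

```python
def effective_rules(shared_ruleset: dict, user_overrides: dict) -> dict:
    """Merge a reused (shared) ruleset with a user's per-identity overrides.
    Overrides win on key collisions; unmodified shared rules pass through."""
    return {**shared_ruleset, **user_overrides}

# A shared ruleset published by a ruleset designer (keys are
# (context, role, predicate) triples, values are constraints):
shared = {
    ("hospital", "patient", "video_sensor"): "no_recording",
    ("hospital", "patient", "audio_sensor"): "no_recording",
}

# One user relaxes a single rule for their own privacy identity:
overrides = {("hospital", "patient", "audio_sensor"): "transcription_only"}

rules = effective_rules(shared, overrides)
```

The shared ruleset itself is never mutated, which is what lets one on-chain copy serve many privacy identities.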

None of these design features, so far, entail a token. But reusability implies portability, it implies separability of ownership and responsibility for design and maintenance from mere use, and, now that there is a division between a doer of work and a user of that work, it implies an economic relationship. These secondary principles suggest a blockchain token framework for the ruleset entity.

Privacy Rulesets as a “Sort-of-an-NFT”

I propose a hybrid NFT with a royalty paying use model (a “Royaltized-Use NFT”) paired with an optional collective governance mechanism (a “DAO-Governed NFT”).

Portability and tracking of distinct blockchain entities (here, a ruleset, which is a unique collection of rule tuples) are the sine qua non of the non-fungible token standard (ERC-721). If privacy rulesets implement the ERC-721 interface, we get standardized semantics for the distinctiveness (“non-fungibility”) of each “ruleset NFT”, token transferability and sale, ownership tracking, and authorized operator management as part of the interface. We can also implement the optional NFT metadata extension so that each ruleset NFT has a URI pointer to a JSON or other resource describing essential features of the ruleset, such as a hierarchical view of its rule tuples, the maintaining organization, and standards/legal compliance information. And we can implement the enumeration extension so that privacy identities can “discover” all the available rulesets that can be reused.

That covers the NFT part of the interface. Separating ownership and the responsibility for design and maintenance from use — and the economic relationship suggested by that — requires additional semantics that are not part of the NFT interface or, indeed, any known token interface.

Join us in the next article in this series, Building a Royaltized-Use NFT, where we explore an architecture that supports a ruleset marketplace that includes usage royalty payments to ruleset designers.



Founder at Privaceum, blockchain enthusiast, inventor, attorney, author, and Microsoft veteran with 25 years of experience in software architecture/development