Ostrom’s Design Principles
Applied to Data (Part 1 of 2) in our research article series by Fractal’s Chief Economist Aurel Stenzel
In 2009, Elinor Ostrom became the first woman to receive the Nobel Prize in Economic Sciences, awarded for her work demonstrating how common property can be successfully managed without regulation or privatization. She focused primarily on typical common goods such as forests, fisheries, oil fields, and grazing lands. Data, however, carries very particular characteristics (see our previous blog post on anti-rival goods), which raises the question of how Ostrom’s work can be applied to data sharing. In this blog post and the next, we give an introduction to her influential design principles and apply them to the special characteristics of data (1).
1. Clear User and Resource Boundaries
The first design principle requires a clear differentiation between legitimate users and non-users, as well as a clear separation of the specific common-pool resource from the larger social-ecological system. For example, the names of the farmers who are allowed to use a piece of land need to be declared. Farmers can be easily identified (by simply knowing them in a small village, or via publicly available IDs). By contrast, on the web, identities can easily be obtained, changed, and faked. Fractal ID already offers a blockchain-based solution to the identity problem on Web 3.0. The ability to identify users of a common resource is crucial for most of the design principles that follow. An important remark: the ID of a user does not need to be the same across all of their activities if we apply cryptographic innovations like blind signatures. Ergo, the anonymity of the protocol is not violated by this design principle.
Excursion: Blind Signature — an analogy to carbon paper and envelopes
Alice and Bob know each other well. Alice sometimes likes to operate under a different identity (say, Charly) and does not want Bob to learn this fact. She writes her name “Alice” and her signature on the outside of an envelope, and writes “Charly” on a piece of paper, which she puts in the envelope together with a sheet of carbon paper. She sends the envelope to Bob. As the envelope remains unopened, Bob cannot read what Alice wrote inside. Because he recognizes her signature, he signs the outside of the envelope. Thanks to the carbon paper, his signature bleeds through onto the piece of paper carrying Alice’s new identity, Charly. Bob sends the (unopened) envelope back to Alice, who now has her new identity Charly verified by Bob. Ergo, Alice can leverage her existing relationship for her new identity, with Bob learning nothing except the fact that Alice acquired a new ID.
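The same envelope-and-carbon-paper idea can be sketched with textbook RSA: the blinding factor plays the role of the envelope, and Bob’s private-key exponentiation is the carbon paper. This is a toy illustration with deliberately insecure key sizes, not Fractal’s actual protocol.

```python
# Toy RSA blind signature (Chaum-style). Insecure key sizes, illustration only.
import math
import random

# Tiny RSA key pair (Bob's)
p, q = 61, 53
n = p * q                            # public modulus
e = 17                               # public exponent
d = pow(e, -1, (p - 1) * (q - 1))    # private exponent (Python 3.8+)

def blind(message: int, r: int) -> int:
    """Alice seals the envelope: m' = m * r^e mod n."""
    return (message * pow(r, e, n)) % n

def sign_blinded(blinded: int) -> int:
    """Bob signs what he cannot read: s' = (m')^d mod n."""
    return pow(blinded, d, n)

def unblind(blind_sig: int, r: int) -> int:
    """Alice removes the blinding factor: s = s' * r^-1 mod n."""
    return (blind_sig * pow(r, -1, n)) % n

def verify(message: int, sig: int) -> bool:
    """Anyone can check that s^e mod n == m."""
    return pow(sig, e, n) == message

m = 42                               # Alice's new identity, encoded as a number
r = random.choice([x for x in range(2, n) if math.gcd(x, n) == 1])
sig = unblind(sign_blinded(blind(m, r)), r)
print(verify(m, sig))                # True: Bob signed a message he never saw
```

Bob learns only the blinded value `m * r^e`, which is statistically unrelated to `m`, yet the unblinded signature verifies against Bob’s public key exactly as an ordinary signature would.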
Also for data, it is very hard to define boundaries. Data can be stored and replicated at almost no cost. Even if a data point belonged to one data pool at a certain point in time, once it has been shared it may belong to many pools. In our most recent blog post (Data Privacy As a Prerequisite), we describe how the data sharing decisions of one user impact other users. Boundaries are very blurry and difficult to enforce. If we want to turn data into a private or common good, we need to apply cryptographic tools (also see From Data To Information (and back)). Without such measures, this design principle cannot hold.
2. Appropriation and Provision Rules
Provision rules define the activities necessary to maintain and nurture the resource, while appropriation rules define how the common resource may be used. Both sets of rules need to be congruent with local social and environmental conditions. For data, this means that all users of the resource (e.g. a data union) need to agree on who provides what data, and on when and how this data can be used. The benefits for the users whose data is used need to be proportional to the costs those users incur in providing it.
3. Collective Choice Arrangements
In this design principle, Ostrom requires that the people affected by a resource regime are authorized to participate in making and modifying its rules. So far, we have not been able to participate in decisions about how our data is used. GDPR was an important first step, but it is widely perceived as annoying (remember the last time you clicked “accept” just to make a cookie pop-up disappear) and its impact is questionable. Norberg et al. (2007) experimentally showed a discrepancy between individuals’ intentions to protect their privacy and how they actually behave on the web (2). There are different explanations for this so-called privacy paradox: people have difficulty assigning value to their data (and therefore see no reason to protect it), people do not consider certain data to be their own, or people assume that internet platforms already know everything anyway (also see our most recent blog post, Data Privacy As a Prerequisite). Fractal aims to tackle these challenges at once. By offering the possibility to make sophisticated decisions (considering all positive and negative externalities), Fractal gives decision power to the people who are most affected by those decisions: the end-users. This design principle makes me very hopeful that we will be able to overcome the privacy paradox in the near future.
Within a cryptographically protected data union, we can apply all three design principles. A group of identified users (clear user boundary) sets its data sharing rules, e.g. which data can be shared with whom (collective choice arrangements on appropriation and provision rules) and at which price (proportional benefits). As the data is cryptographically protected, the shared data cannot be replicated, stored, or reused (clear resource boundary).
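As a minimal sketch, such a rule set could be modelled as a policy object whose rules pass by majority vote, mapping each of the three design principles onto a field or method. All class and field names here are hypothetical illustrations, not part of Fractal Protocol.

```python
# Hypothetical data-union policy sketch: Ostrom's first three design
# principles as a simple voting structure. Names are illustrative only.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SharingRule:
    data_category: str          # what may be shared (appropriation rule)
    allowed_buyers: frozenset   # with whom (clear resource boundary)
    price_per_query: float      # proportional benefit for providers

@dataclass
class DataUnion:
    members: set                        # identified users (clear user boundary)
    rules: list = field(default_factory=list)

    def propose_rule(self, proposer: str, rule: SharingRule, votes: set) -> bool:
        """Collective-choice arrangement: a rule passes only if the proposer
        is a member and a strict majority of members approve it."""
        if proposer not in self.members:
            return False
        if len(votes & self.members) * 2 > len(self.members):
            self.rules.append(rule)
            return True
        return False

union = DataUnion(members={"alice", "bob", "carol"})
rule = SharingRule("browsing_history", frozenset({"advertiser_x"}), 0.05)
passed = union.propose_rule("alice", rule, votes={"alice", "bob"})
print(passed)  # True: 2 of 3 members approved
```

The design choice worth noting is that membership (the user boundary) gates both proposing and voting, so outsiders can neither use the resource nor alter its rules.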
In our next blog post, we will further apply the remaining design principles to data and especially discuss the importance of functioning monitoring mechanisms.
(1) For a detailed introduction to the design principles, see: Ostrom, E. (1990). Governing the Commons: The Evolution of Institutions for Collective Action. Cambridge University Press.
(2) Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The Privacy Paradox: Personal Information Disclosure Intentions versus Behaviors. Journal of Consumer Affairs, 41(1), 100–126.
About Fractal Protocol
Built on Polkadot, Fractal Protocol is an open-source, zero-margin protocol that defines a basic standard to exchange user information in a fair and open way, ensuring a high-quality version of the free internet. In its first version, it is designed to replace the ad cookie and give users back control over their data.
This article does not include elements of any contractual relationship. This article shall not be deemed to constitute a prospectus of any sort or a solicitation for investment or investment advice; nor does it in any way pertain to an offering or a solicitation of an offer to buy any securities in any jurisdiction.
For the avoidance of doubt, please note that the Protocol has not been fully developed. Any statements made about the Protocol are forward-looking statements that merely reflect Fractal’s intention for the functioning of the Protocol. There are known and unknown risks that can cause the results to differ from the forward-looking statements.
Fractal does not intend to express investment, financial, legal, tax, or any other advice, and any conclusions drawn from statements in this article or otherwise made by Fractal shall not be deemed to constitute advice in any jurisdiction.