Secure Data Tokenization
Further exploring CryptoData with the Data Tokenization white paper
Community Translations: Brazilian Portuguese | Russian | Spanish | Filipino | Turkish | German | French | Chinese
In mid-September, we published a short blog post describing the Oasis Foundation’s vision for a responsible data society. At its core is the concept of using a blend of advanced distributed ledger technologies to give human beings much more control over their own data. These technologies combine two foundational elements: the public, easily shareable nature of tokenized data and the protection of secure computing environments. The result is the capability to produce a new form of cryptodata.
It has been gratifying to hear from many of you a growing sense of interest and excitement in this vision (including from hundreds of our ongoing Community Cup participants). As a result, we thought it would be worthwhile to provide additional detail about these technologies and their significant market and societal implications. Today, we are pleased to publish the Data Tokenization white paper, which supplies some of those details.
The white paper further explores this blend of advanced technical elements, resting on tokenized data and secure computing environments. These core elements are augmented by other tools, such as differential privacy, federated learning, homomorphic encryption, zero-knowledge proofs, and secure multi-party computation.
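The white paper covers these tools in depth; as a tiny illustration of just one of them, here is a minimal sketch of differential privacy using the Laplace mechanism. The function names and parameters are our own for illustration, not taken from the white paper or from any Oasis library:

```python
import random

def dp_count(values, predicate, epsilon=1.0):
    """Return a differentially private count of items matching `predicate`.

    Adds Laplace noise with scale 1/epsilon, matching the count query's
    sensitivity of 1, so the presence or absence of any single record
    changes the output distribution only slightly.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

A smaller `epsilon` means more noise and stronger privacy; analysts see a useful aggregate while no individual record is exposed exactly.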
The benefits for both data consumers and data producers are clear — including tighter controls and security over access to one’s data, more granular privacy protections, and greater sharing and re-use options. In short, in a world where one’s data is more private and secure, yet also more accessible and shareable, users can gain programmatic data rights, including the ability to attach monetary value to flows of data. The new value paths engendered by these innovations can ignite a wide range of economic and non-economic benefits for society more generally.
Already we are seeing clear signs that the advent of tokenized data is being welcomed by users. As the paper explains, human health data and genomic data are two excellent use cases. Highly sensitive biometric data from patients can be collected, aggregated, and made available to researchers and others for analysis — while the underlying datasets remain fully secure and protected.
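One way such aggregate analysis can proceed without revealing individual records is secure aggregation, a secure multi-party computation technique in which pairwise random masks cancel out in the total. This toy sketch (our own illustration, not code from the white paper; real deployments derive the masks from cryptographic key agreement rather than a local random generator) shows the idea:

```python
import random

def secure_sum(private_values, modulus=2**32):
    """Toy sketch of pairwise-masked secure aggregation.

    Each party i applies a shared random mask for every other party j:
    added if i < j, subtracted if i > j. The masks cancel in the total,
    so the aggregator learns only the sum, never any individual value.
    """
    n = len(private_values)
    # Pairwise shared masks (in practice derived via key agreement).
    masks = {(i, j): random.randrange(modulus)
             for i in range(n) for j in range(i + 1, n)}
    shares = []
    for i, v in enumerate(private_values):
        masked = v
        for j in range(n):
            if i < j:
                masked = (masked + masks[(i, j)]) % modulus
            elif i > j:
                masked = (masked - masks[(j, i)]) % modulus
        shares.append(masked)
    # The aggregator sums the masked shares; all masks cancel.
    return sum(shares) % modulus
```

Each `shares[i]` looks uniformly random on its own, yet the final sum equals the true total — the aggregate is usable while the underlying inputs stay protected.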
To fulfill the immense promise of these technologies, a number of steps remain before us. These could include: further developing a digital stewardship agenda (including a new edge-to-all (e2a) design principle); exploring additional use cases; educating policymakers; and initiating a tokenized data ecosystem of stakeholders.
We invite you to read our Data Tokenization white paper, and we welcome your thoughts on concrete opportunities to advance this new vision for defining and bolstering a more responsible data society. To join the discussion, visit our Telegram channel.