Oasis Network AMA: Data Tokenization with Richard Whitt — Chief of Digital Stewardship at the Oasis Protocol Foundation

Tina Xu
Published in Oasis Foundation
Nov 5, 2020 · 8 min read

The Oasis Network “AMA Session” is a Q&A session held virtually in the Oasis Network Community Telegram group. These sessions are designed to bring community members up to speed on recent developments relating to the Oasis Protocol and to create a forum where members’ questions and doubts can be addressed.

This AMA took place on November 5th, 2020. In this session, Richard Whitt — Chief of Digital Stewardship at the Oasis Protocol Foundation — joined us. To stay up to date and join our next AMA, please join our Telegram Community:
https://t.me/oasisprotocolcommunity

Jon Poole — Hi everyone and welcome to another Oasis AMA! Today we’re taking questions about the recent launch of the Oasis Data Tokenization White Paper.

https://medium.com/oasis-protocol-project/secure-data-tokenization-7b730357b03e

We’re joined today by Richard Whitt, the author of the paper, to answer questions about the paper and how the Oasis Network can enable Data Tokenization.

Richard Whitt — Hey folks. So pleased to be here today. I hope everyone is doing well.

I’m the head of digital stewardship with the Oasis Foundation and a former policy guy with Google.

Looking forward to chatting about secure data tokenization!

Jon Poole — I’ll post questions to guide the conversation. Feel free to hop in and ask follow up questions.

Okay, first question. What exactly is data tokenization?

Richard Whitt — Sure thing. The concept of tokenizing data is not entirely new. The “traditional” method aims to substitute one type of data element — a token — for another: the actual dataset. The original data is typically some form of information that the user considers sensitive and so wants to protect.

Jon Poole — What is different about “Secure Data Tokenization?”

Richard Whitt — Excellent question. With secure data tokenization, the dataset is not treated inherently as one thing or another, such as having a degree of sensitivity from the user’s perspective. Instead, the dataset is wrapped in a standardized and composable atomic unit of data, called a Data Token.

Each data token can have its own specified and unique set of properties — such as a standardized format, interoperable capabilities between networks, and an ability to program the DT for certain sharing capabilities.

So that’s the first piece. The second component is a secure and confidential computing capability. Some also call this a TEE — a “trusted execution environment.” The folks at Oasis refer to this as a “secure enclave.” Among other things, this security layer protects the data token from outside attacks and attempts to discover the underlying datasets.

So, it’s the data token, plus the secure enclave environment. The combo makes a Secure Data Token.
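To make those two pieces concrete, here is a minimal TypeScript sketch. It is purely illustrative, assuming hypothetical names (DataToken, SharingPolicy, sealInEnclave); it is not the Oasis implementation. It shows a standardized token wrapping a reference to a dataset together with programmable sharing rules, which is then handed to a secure enclave for policy-enforced computation.

```typescript
import { randomUUID } from "crypto";

// Illustrative sketch only. These types and function names are hypothetical
// placeholders, not the Oasis implementation. They mirror the two pieces
// described above: a standardized, composable token wrapping a dataset, and a
// "seal" step handing that token to a secure enclave (TEE).

// Sharing rules the data owner programs into the token.
interface SharingPolicy {
  allowedGrantees: string[];      // identities permitted to compute over the data
  allowedComputations: string[];  // e.g. ["aggregate-statistics", "model-training"]
  expiresAt?: Date;               // optional time limit on access
}

// The token itself: a standardized wrapper around a dataset reference.
interface DataToken {
  id: string;          // unique token identifier
  datasetRef: string;  // pointer to the (encrypted) dataset; raw data never lives in the token
  format: string;      // standardized schema or format descriptor
  policy: SharingPolicy;
}

// Mint a token for a dataset the owner wants to protect.
function tokenizeDataset(datasetRef: string, format: string, policy: SharingPolicy): DataToken {
  return { id: randomUUID(), datasetRef, format, policy };
}

// Stand-in for handing the token to a secure enclave, which enforces the
// policy and keeps the underlying dataset confidential.
function sealInEnclave(token: DataToken): { tokenId: string; enclave: string } {
  console.log(`Token ${token.id} sealed; permitted computations: ${token.policy.allowedComputations.join(", ")}`);
  return { tokenId: token.id, enclave: "secure-enclave-0" };
}

// Example: a user tokenizes a sensitive health record and permits only aggregate statistics.
const healthToken = tokenizeDataset("storage://encrypted-health-record", "fhir-r4", {
  allowedGrantees: ["research-lab-A"],
  allowedComputations: ["aggregate-statistics"],
});
sealInEnclave(healthToken);
```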

Jon Poole — Where can I find the White Paper if I want to read more about Data Tokenization?

Richard Whitt — So glad you asked! 😊 Here is a link to the paper: https://docsend.com/view/isrhqk352adykdpz

I also wrote a short accompanying blog post: https://medium.com/oasis-protocol-project/secure-data-tokenization-7b730357b03e

Jon Poole — Okay, next question. 💪🏼

How did we get to this point? What’s the evolution behind Data Tokenization?

Richard Whitt — Well, I’m old enough to remember a lot of this history. It’s actually a case of returning to the roots of the Internet, and the Web.

The original Internet of the 1970s included some amazing design principles, including the end-to-end principle. That meant the power and control resided at the edge of the network, with ordinary users. Then the World Wide Web came along as an overlay to the Net. With “Web 1.0,” in the early 1990s, we saw the introduction of the client-server arrangement, which made the Web a far easier place for the average user to utilize the resources of the Internet. However, the “client” side began to cede some power and control to the “server” side of that connection. The e2e principle remained in place, but the client-server overlay was shifting away from the edge.

Then, with what some folks call “Web 2.0,” the multisided platforms took over — Google, Facebook, etc. — and the Web “user” gained some amazing new capabilities from the cloud. At the same time, however, there was a tradeoff: the user lost further control over the privacy and security of her data, from her place at the network edge.

Many folks now are touting the dawning blockchain era as “Web 3.0.” Using distributed ledgers and other tech capabilities can bring back to Web users at the network edge much of the power and control over data and other compute functions that first was unleashed by the Net’s e2e principle. We think secure data tokenization can be a big part of that new era.

So, back to the future!

Jon Poole — What is Blockchain 3.0 and how does this relate to Data Tokenization?

Richard Whitt — So, just a bit more history for your question about Blockchain 3.0. 🙂 What we are calling “Blockchain 1” is the era of cryptocurrencies, where tokens serve as a fungible form of money. Bitcoin is a classic example.

Blockchain 2 was the emergence of Ethereum and the notion of using tokens to represent the value of certain assets.

We at Oasis believe that Secure Data Tokenization is the next dynamic shift forward: Blockchain 3.0.

Jon Poole — How can the Oasis Network enable data tokenization?

Richard Whitt — We believe Oasis has the unique capability to provide both parts of the secure data tokenization capability that I mentioned at the outset: tokenizing the data, and making it secure. While there are other technical elements that will be needed to make SDT a more ubiquitous reality — such as differential privacy and federated learning — the Oasis network is poised to be a difference-maker in advancing SDT.

Jon Poole — What are some interesting use cases?

Richard Whitt — As the white paper explores, there already are a number of fascinating use cases for secure data tokenization. Two, in particular, are tokenizing and securing human medical data, including personal health information (PHI), and aggregating genomic data for various research purposes. In both cases, highly sensitive datasets can be encapsulated with tokens and brought into a secure computing environment.

We’ve actually been asking the community some questions about use cases in the Oasis Community Cup Challenge.

We already had some great input in our first two weekly Cup questions about the value of secure data tokenization — so thanks for that!

Jon Poole — What more needs to be done to make DT a reality?

Richard Whitt — As I mentioned previously, there are some important technical components that can help make secure data tokenization a reality. These include cutting-edge tech like federated learning, differential privacy, and zero-knowledge proofs. Beyond that, however, we need to help explain to people why secure data tokenization is such an important advance. There may be concerns about somehow losing control over one’s data, for example, or the cost-benefit ratio of adopting the new tech, or the scalability. We believe each of these concerns can be addressed upfront, however, and shouldn’t be seen as obstacles to the deployment and adoption of SDT.

Jon Poole — What is the Oasis role in making SDT a reality?

Richard Whitt — The Oasis Foundation is excited to be a leading voice in support of secure data tokenization — what we think of as Blockchain 3.0. We welcome the involvement of the community in spreading the word and engaging with us on developing use cases and the like. Stay tuned for more news coming soon!

Telegram User Ron — I can’t help but think about secure, fast, and fair voting. Could Oasis help facilitate this?

Jon Poole — 🔥 Okay, the last question and then we’ll open it up to the community for a Q&A! 📣

Where should I go if I’m interested in building an app about Data Tokenization?

Richard Whitt — Excellent thought, Ron. The whole voting process, whether in real life or on the blockchain, could be made much more secure and timely with this kind of tech.

Thanks for the great questions.

We just announced a new Hackathon on Gitcoin! You can build apps on the Oasis Eth ParaTime or using our Parcel SDK.

We encourage you to check it out and sign up here: https://gitcoin.co/hackathon/oasis/onboard
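For anyone eyeing the hackathon, here is a rough sketch of the flow an app might follow. The ParcelClient class and its methods below are illustrative placeholders under assumed names, not the actual Parcel SDK API, so consult the SDK documentation for the real interface.

```typescript
// Hypothetical sketch: ParcelClient and its methods are placeholders, not the
// real Parcel SDK API. The flow mirrors the idea discussed above: upload a
// sensitive dataset, receive a token/handle for it, and grant narrowly scoped
// access rather than sharing the raw data itself.

interface UploadResult {
  tokenId: string; // handle to the now-tokenized dataset
}

class ParcelClient {
  constructor(private readonly apiKey: string) {}

  // Upload raw bytes; in a real client, the service would encrypt and tokenize them.
  async uploadDataset(data: Uint8Array, label: string): Promise<UploadResult> {
    return { tokenId: `token-${label}-${data.length}` };
  }

  // Grant another party permission to run an approved computation over the data.
  async grantAccess(tokenId: string, granteeId: string, computation: string): Promise<void> {
    console.log(`Granted ${granteeId} permission to run "${computation}" on ${tokenId}`);
  }
}

async function main(): Promise<void> {
  const client = new ParcelClient("demo-api-key");
  const data = new TextEncoder().encode('{"heartRate": [72, 75, 71]}');

  const { tokenId } = await client.uploadDataset(data, "wearable-readings");
  await client.grantAccess(tokenId, "research-lab-A", "aggregate-statistics");
}

main().catch(console.error);
```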

Jon Poole — Thank you, Richard, for taking the time this morning and sharing your incredible insight into this new secure platform for Secure Data Tokenization. 🙏🏼 We appreciate it 🌹🔥

If anyone has any questions for Richard, now would be the time! ✋

Telegram User CheE — Did I understand correctly that my specific information can become a token? If so, then it would be possible to create a pool of this information from different people, and that pool would be in demand for all sorts of purposes.

Richard Whitt — Sure thing — I love talking about these kinds of cool moments in tech. Secure data tokenization holds such amazing potential. I hope folks are interested in reading up more and interacting with us.

Great question CheE. And yes, any datasets can be combined from any sources to be tokenized and then put into a secure enclave. This really frees up the underlying value of the data for sharing, re-use, monetization, etc.

The data can also stay where it is, and the token becomes mobile. That also makes for more secure and privacy-supporting treatment of sensitive data, like biometrics.
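Here is a small sketch of that pooling idea, with hypothetical names that are not Oasis APIs: tokens from several contributors are gathered for a single enclave computation, while each contributor's raw dataset stays where it is and only an approved aggregate leaves the enclave.

```typescript
// Hypothetical sketch: names are illustrative, not Oasis APIs.

interface PooledToken {
  tokenId: string;
  owner: string;
  datasetLocation: string; // the raw data stays here; only this reference travels
}

// Simulates an enclave-side aggregate over the pooled tokens. In a real
// system, the enclave would fetch and decrypt the data internally and return
// only the approved aggregate result.
function computeInsideEnclave(pool: PooledToken[]): { contributors: number; result: string } {
  return {
    contributors: pool.length,
    result: "aggregate statistic (only this value leaves the enclave)",
  };
}

const pool: PooledToken[] = [
  { tokenId: "t-001", owner: "alice", datasetLocation: "hospital-a/records" },
  { tokenId: "t-002", owner: "bob", datasetLocation: "clinic-b/records" },
  { tokenId: "t-003", owner: "carol", datasetLocation: "wearable-cloud/readings" },
];

console.log(computeInsideEnclave(pool)); // { contributors: 3, result: "aggregate statistic ..." }
```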

Telegram User — In terms of data tokenization and decentralized data marketplaces, do you foresee automated data quality controls being established?

Or should data quality control be left up to the consumer of the data, or perhaps managed by the marketplace (similar to current centralized data marketplaces)?

Richard Whitt — Thanks for the question, Irvin. The notion is that the data user can define for herself how she wants the data to be encapsulated, and the metadata shared out. The quality controls you mention could be part of the policy of the token, or it could be part of the external environment (like a marketplace). There is considerable flexibility for the data user.
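A brief sketch of the two options mentioned here, using hypothetical names rather than Oasis APIs: quality assertions carried in the token's own policy, and a marketplace-side check that lists only tokens meeting its requirements.

```typescript
// Hypothetical sketch: TokenPolicy, ListedToken, and marketplaceAccepts are
// illustrative placeholders, not Oasis APIs.

interface TokenPolicy {
  exposedMetadata: string[];                 // what the data owner chooses to share out
  qualityAssertions: Record<string, string>; // e.g. completeness, provenance
}

interface ListedToken {
  tokenId: string;
  policy: TokenPolicy;
}

// Marketplace-side control: list only tokens whose policies carry the
// quality assertions the marketplace requires.
function marketplaceAccepts(token: ListedToken, requiredAssertions: string[]): boolean {
  return requiredAssertions.every((key) => key in token.policy.qualityAssertions);
}

const listing: ListedToken = {
  tokenId: "t-042",
  policy: {
    exposedMetadata: ["record-count", "date-range"],
    qualityAssertions: {
      completeness: "at least 95% of fields populated",
      provenance: "clinic-sourced",
    },
  },
};

console.log(marketplaceAccepts(listing, ["completeness", "provenance"])); // true
```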

Telegram User — Many applications are coming out on Ethereum. Oasis could have a monopoly in data tokenization applications.

Richard Whitt — I agree, CheE, that Oasis can play a pivotal role in facilitating a whole new breed of DT apps.

Telegram User — Thank you for your insight, Richard!

Telegram User — thanks! this is really cool

Jon Poole — This has been incredible 🚀, thank you again Richard 🙏🏼

Richard Whitt — Thanks, Jon. This has been great fun.

This has been my first official AMA. 🙂 So I’m happy to return and chat some more about this cool new evolution of distributed ledger technology.

Jon Poole — Thank you
