Data Revelation, the Missing Piece to the Adoption of Web3 Privacy

Gavin Thomas
Obscuro Labs
Sep 22, 2023

Imagine a world where privileged agencies have access to private data where others do not. Where secret algorithms push curated information. Where financial models can only be run by those with permission.

In this blog post I’m going to explain why cryptography and private data alone are not enough for Web3 to go mainstream, and why they need to be coupled with time-based data revelation so that honest and dishonest users can be distinguished and data validation can be decentralised (spoiler alert: it does not require zero-knowledge proofs).

LIBOR: A Story of Manipulation and Collusion

First, a short story of manipulation and collusion which sent shockwaves through the world’s financial system: the LIBOR scandal. LIBOR, the London Interbank Offered Rate, was a globally accepted benchmark interest rate underpinning mortgages, loans, credit card payments, car payments and other financial products offered to regular people. So, quite important.

The fundamental failure of the approach used to set LIBOR was its reliance on the expert judgement of individuals. Relying solely on trusting human beings, who are prone to being swayed by emotion or a personal agenda, is a flawed way to derive highly trustworthy information. Ideally, and presumably this was the original vision for LIBOR and other financial benchmarks, data collected from various honest sources would provide accurate information. Only that is not what actually happened. In reality there was widespread misconduct, with individuals motivated to move LIBOR in a particular direction for personal gain.

Trust the Blockchain

What’s the solution? A decentralised blockchain like Ethereum has been lauded for years as a neat way of solving the problem of setting benchmarks. The risk of human collusion is reduced by simply removing the meat popsicles from the process, and the data, being immutable, can be considered trustworthy. Smart contract code uses the immutable data stored on the blockchain to provide a trusted benchmark. No human interaction. No dodgy data. Nice!

The Fifth Element. Columbia Pictures.

Trust the ENCRYPTED Blockchain

However, this only partially solves the problem. Because Ethereum is a transparent blockchain, all data is viewable, including the data used to set the benchmark. This means the collusion risk is not fully mitigated: humans can still intentionally move tomorrow’s benchmark in their favour by submitting specific transactions today. To iron out this wrinkle we need to consider how the benchmark data can be hidden. For that we use encryption.

Encrypting the data which contributes to the benchmark prevents intentional manipulation because the data is unreadable. It would be impossible to know how to successfully manipulate the benchmark. It would be a shot in the dark. Potentially a very expensive shot. The lack of certainty of a successful outcome serves as a strong disincentive to even try. No human interaction. No dodgy data. No attempts at manipulation. Great!

Trust and VERIFY the Encrypted Blockchain

But there remains one final niggle. Remember how the vision for LIBOR was to use honest and reliable sources of data? If the benchmark data is hidden, how can you check it’s accurate? To fix that, the collected data needs to be revealed, after a delay, to anyone who wants to see it. For something like LIBOR that would mean the contributing data is revealed after 24 hours, along with the calculated benchmark. Anyone who wants to be sure the rate is indeed fair and honest can then check the data sources and replay the calculations.

We end up with three components to achieving honesty and fairness in Web3:

  1. an immutable decentralised source of data (no human interaction or dodgy data).
  2. encrypting the contributing data and having the means to make calculations using that data (prevents tampering).
  3. revealing the contributing data at the right time (anyone can check the accuracy of the calculations).
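
To make those three components concrete, here is a minimal, self-contained Python sketch. To be clear, this is not Obscuro’s implementation: the “enclave” is just an object holding a symmetric key (standing in for secure hardware), the ledger is a plain list, and names like BenchmarkEnclave and reveal_key, plus the zero-second reveal delay, are invented purely for illustration.

```python
# Toy sketch of the three components above. NOT how Obscuro/Ten works --
# assumed names, a shared symmetric key and a zero-second delay are used
# only to make the idea runnable in a few lines.

import statistics
import time

from cryptography.fernet import Fernet  # pip install cryptography


class BenchmarkEnclave:
    """Stand-in for a secure enclave: keeps the key private, computes the
    benchmark on the hidden data, and only releases the key after a delay."""

    def __init__(self, reveal_delay_seconds: float):
        self._key = Fernet.generate_key()
        self._fernet = Fernet(self._key)
        self._reveal_delay = reveal_delay_seconds
        self._created_at = time.time()
        # Component 1: an append-only record of encrypted submissions.
        self.ledger: list[bytes] = []

    def submission_key(self) -> bytes:
        # Toy shortcut: submitters share one symmetric key. A real design would
        # use asymmetric or per-submitter keys so they can't read each other's data.
        return self._key

    def submit(self, ciphertext: bytes) -> None:
        self.ledger.append(ciphertext)

    def benchmark(self) -> float:
        # Component 2: calculate on the contributed data without revealing it.
        rates = [float(self._fernet.decrypt(c).decode()) for c in self.ledger]
        return statistics.median(rates)

    def reveal_key(self) -> bytes:
        # Component 3: timed revelation, so anyone can replay the calculation.
        if time.time() - self._created_at < self._reveal_delay:
            raise PermissionError("reveal window not yet reached")
        return self._key


# "Banks" submit encrypted rates; the benchmark is published straight away.
enclave = BenchmarkEnclave(reveal_delay_seconds=0)  # 0 so the demo runs instantly
submit_with = Fernet(enclave.submission_key())
for rate in ("5.31", "5.29", "5.35"):
    enclave.submit(submit_with.encrypt(rate.encode()))
published = enclave.benchmark()

# After the reveal window, anyone can decrypt the ledger and check the number.
replay_with = Fernet(enclave.reveal_key())
replayed = statistics.median(float(replay_with.decrypt(c).decode()) for c in enclave.ledger)
assert replayed == published
```

The point of the sketch is the shape rather than the cryptography: the benchmark is published from hidden inputs, and once the key is revealed anybody can recompute the number and catch a discrepancy.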

The Power of Revelation

Once we start thinking about data encryption coupled with time-based revelation, we realise it’s a powerful concept: private data, but only for as long as required. And this is not new: governments have worked this way for a long time, routinely declassifying even their most sensitive records after a fixed period, in some cases 25 years.

Why are the world’s biggest secrets made public after 25 years? Accountability. There has to be a point in time where the data, and how it has been used, can be scrutinised. This is crucial to establishing and maintaining trust in a service. Revealing information allows the honest and the dishonest to be distinguished.

Right now in Ethereum everything is transparent. This promotes the original principle of openness. The Obscuro project (now known as Ten) uses secure enclaves to encrypt Ethereum and, crucially, time-based data revelation has been baked into its design from the very beginning (full disclosure: I contribute to Obscuro).

Applications on Obscuro enable a vastly more exciting and engaging Web3 ecosystem and help you be confident that it is treating you fairly: you can check that the result of a game is fair. Or that you were not the victim of a sandwich trade. Or that your payment was completed successfully. Or that the winner of your blind auction really did put in the highest bid. Revelation of the data is a strong deterrent to dishonest users, because at some point their actions will become known to everyone, including law enforcement agencies. Which can only be a good thing for Web3.
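
As a final toy illustration of what “checking the winner of your blind auction” could look like once the data is revealed, here is a plain hash-commitment sketch. Again, the names and the commitment scheme are assumptions made for the example, not Obscuro’s actual mechanism.

```python
# Toy blind-auction check: commitments are public while the auction runs,
# bids and salts are revealed afterwards, and anyone can verify the winner.
# Illustrative only -- not Obscuro's mechanism.

import hashlib
import secrets


def commitment(bid: int, salt: bytes) -> str:
    """Hash commitment published while the auction is still blind."""
    return hashlib.sha256(salt + bid.to_bytes(8, "big")).hexdigest()


# During the auction, only the commitments are visible.
bids = {"alice": 120, "bob": 95, "carol": 150}
salts = {name: secrets.token_bytes(16) for name in bids}
commitments = {name: commitment(bid, salts[name]) for name, bid in bids.items()}

# After the reveal window, the bids and salts become public...
revealed = {name: (bids[name], salts[name]) for name in bids}

# ...so anyone can confirm nobody changed their bid and re-derive the winner.
for name, (bid, salt) in revealed.items():
    assert commitment(bid, salt) == commitments[name], f"{name} altered their bid"

winner = max(revealed, key=lambda name: revealed[name][0])
print(winner)  # carol -- and you can see for yourself that 150 was the highest bid
```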
