Content scrutiny — How to avoid a digital dystopia

Anupam Vashist
Published in Karma and Eggs
7 min read · May 10, 2018

An alternate approach to a cleaner internet

Consider this: the internet holds an abundance of content. News, articles, music, videos, art and more. Much of this space is filled with meaningful, informative content that we, as a community, can deem "worthy" of existing on the internet. Still, in an hour of surfing, there is a high chance of stumbling upon tens of minutes of 'viral' hogwash passed off as free speech: content that is not at all informative and, in many cases, misleading and outrageous. As a community we cannot even label it worthy of existing on the internet, yet truckloads of this useless content eat up disk space all over the globe.

Magic happens where data happens to be. Control?

Is it an epidemic? Yes. When important issues need our attention, our headspace is filled with this garbage. Our valuable time is being taken up, and nobody is talking about it. What if we could somehow filter out the scrutinised (read: censored) content through a mechanism that applies scrutiny according to the opinion of the majority of the community, a kind of democracy on the internet? If you get what I am trying to say, read further.

Abstract:

DemocratX is a game-theory and distributed-storage based approach to ease content generation, validation, distribution, rights and usage for a peer-driven digital economy. Content can be shared directly, or further curated and distributed to willing users by a second layer of distributors who, on one end, can (and should) own some rights in the content in return for easy access to the author and handling of duplicates, and on the other end, charge for usage of the content by subscription or pay-per-view.

In essence, DemocratX is a publicly validated content generation, storage and distribution framework that can be used to create an independent, forward-functioning economy around the services provided by authors and distributors.

1. History:

The internet as we know it started as a peer-to-peer network between machines. Over time, the model has shifted: certain nodes have become so important that they control a huge chunk of internet data and traffic. This has led to a monopoly of a few players over the internet, which is proving unfruitful for the evolution of the internet economy as a whole, given that the resources are owned by only a few.

The entry barrier for a new player in this economy is too high to let quality content flow, and under the pressure to sustain themselves and make money, we have witnessed a downfall in the quality of content floating around the internet, in the form of fake or irrelevant news and 'viral' content. No doubt everyone has tried hard to fix this, but how much can the strength of one outweigh the strength of the world (as Bitcoin demonstrated)?

Then there is the problem of plagiarism and content censorship: not in terms of artistic expression, but in terms of content quality. In this paper we will discuss these two problems and propose a possible solution, a collaborative, incentive-based architecture that has the potential to crank up a whole economy based on product usage.

2. System design

The architecture is a p2p network of authors, valuators and distributors, each having an access portal (similar to a wallet) and each taking on a role depending on the action being performed by the peer. As established in blockchain technology (specifically the Bitcoin protocol), each piece of content is contained in a content snippet holding the previous content's hash, the author's and valuators' signatures, and the content's address (or the actual content itself). All verified content is stored in a blockchain that keeps it in a time-stamped fashion along with associated tags and relevant information. Below we will see the additional elements we add to ensure the authenticity and quality of the content.
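The content snippet described above can be sketched as a small data structure; the field names (prev_hash, author_sig, content_addr) and the use of SHA-256 are assumptions for illustration, not part of any published DemocratX specification:

```python
# Illustrative sketch of a content snippet that chains to the previous
# snippet's hash, as described above. Field names are assumptions.
import hashlib
import json
from dataclasses import dataclass, field


@dataclass
class ContentSnippet:
    prev_hash: str            # hash of the previous content snippet
    author_sig: str           # author's signature over the content
    content_addr: str         # address of the content (e.g. an IPFS CID)
    tags: list = field(default_factory=list)
    valuator_sigs: list = field(default_factory=list)

    def snippet_hash(self) -> str:
        """Deterministic hash linking this snippet into the chain."""
        payload = json.dumps(
            [self.prev_hash, self.author_sig, self.content_addr, self.tags],
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()


genesis = ContentSnippet("0" * 64, "sig-alice", "ipfs://Qm...", ["news"])
nxt = ContentSnippet(genesis.snippet_hash(), "sig-bob", "ipfs://Qn...")
assert nxt.prev_hash == genesis.snippet_hash()
```

Each verified snippet would additionally carry the valuators' signatures collected during the voting stage described below.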

3. Actors in the system

Layer 1:

Author: the peer or peers that own the creative and selling rights to the content.

Valuator (Pv): a set of N peers who digitally sign the content and label it as fit or unfit for the network. At the end of voting, if the quality threshold (50 to 70%) is passed, the content is pushed to Layer 2 by a randomly selected peer. This layer is responsible for maintaining content quality.

Layer 2:

Distributors (D): These are the go-to nodes in the network, whose responsibilities include:

1. Acting as an easy-access gateway to the network for connected valuators (Pv) and authors.

2. Providing curated, relevant content to valuators as well as end users.

3. Checking for copyright infringement (authenticity) and duplication of content using the content ID.

4. Acting as a local showcase and selling point for content. Any revenue made by Ds trickles down to authors as per the selling rights signed.

5. Acting as full nodes in the network, holding copies of the blockchain to make sure that only the original repository survives.
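The duplicate check in responsibility 3 can be sketched with a simple content-ID registry; deriving the content ID from a SHA-256 digest is an assumption, since the article does not specify how the ID is computed:

```python
# Hypothetical duplicate/infringement check keyed on a content ID.
# SHA-256 as the ID scheme is an assumption for illustration.
import hashlib


def content_id(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()


class DistributorRegistry:
    def __init__(self):
        self._seen = {}  # content ID -> registered author

    def register(self, content: bytes, author: str) -> bool:
        """Return True if the content is new; False if it duplicates an
        already-registered item (a possible infringement)."""
        cid = content_id(content)
        if cid in self._seen:
            return False
        self._seen[cid] = author
        return True


reg = DistributorRegistry()
assert reg.register(b"original article", "alice")    # new content accepted
assert not reg.register(b"original article", "bob")  # duplicate rejected
```

A real deployment would likely use a perceptual or chunked fingerprint rather than an exact hash, so that trivially modified copies are also caught.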

Layer 3:

This is an optional yet practically necessary layer that exists outside the network. Let's call it the Aggregator layer; its role is to buy content from the Distributor layer and forward it to end users outside the network. Aggregators can charge users on a subscription or pay-per-view basis, and are completely free to use the content any way they wish, as long as they pay the distributor for it. This payout is distributed among distributors and authors as per the ownership rights of the content.

4. Stages of content publishing:

Content Production:

An author peer, before pushing content to the network, provides associated information such as the author name(s), search keywords, the percentage ownership of the content among the various authors, or any other relevant information required depending on the content type. The author then has the option to:

1. Directly push the content to the network. The content is signed with the author peer's private key and pushed to an unverified snippet cloud, containing all other unverified content snippets hovering in the network, waiting to be verified by at least N peers.

2. Otherwise, approach an access provider, who takes a percentage ownership in the content in return for visibility and services, and posts the content. The content is signed with the provider's private key and pushed to the unverified snippet cloud.

In either case, the content ends up in the cloud, waiting to be verified.
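The two publishing paths can be sketched as follows. For brevity an HMAC stands in for a real private-key signature (an assumption; a production system would use asymmetric signatures such as Ed25519), and the function names are hypothetical:

```python
# Sketch of the two publishing paths. HMAC is only a stand-in for a
# real asymmetric signature scheme.
import hashlib
import hmac


def sign(secret_key: bytes, content: bytes) -> str:
    """Stand-in for signing content with a private key."""
    return hmac.new(secret_key, content, hashlib.sha256).hexdigest()


unverified_cloud = []  # snippets waiting for at least N valuator opinions


def push_direct(author_key: bytes, content: bytes, meta: dict):
    """Path 1: the author signs and pushes the content directly."""
    unverified_cloud.append({"sig": sign(author_key, content), "meta": meta})


def push_via_provider(provider_key: bytes, content: bytes, meta: dict,
                      provider_share: float):
    """Path 2: an access provider signs and posts, taking a % ownership."""
    meta = {**meta, "provider_share": provider_share}
    unverified_cloud.append({"sig": sign(provider_key, content), "meta": meta})


push_direct(b"alice-key", b"my article", {"author": "alice"})
push_via_provider(b"prov-key", b"my song", {"author": "bob"}, 0.15)
assert len(unverified_cloud) == 2
```

Either way, the snippet sits in the unverified cloud until N valuators have signed an opinion on it.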

Content validation:

Again, the valuator peer has the option to access unverified content directly via the core API. However, this is inefficient, since the valuator will be bombarded with random content it may have no opinion about. It works better only if the valuator has the technical know-how to select content with relevant keywords or demographics.

A more efficient option is to reach out to access providers. Since the goal of any provider is to publish as much content as possible, providers will segregate the snippets by keywords and show only curated content to the valuator, with a high chance that the valuator will actually have an opinion on it.

A valuator peer Pv can connect to any distributor D present in the D list. D provides curated content for Pv to verify. Since the content in the unverified snippet is signed with D's private key, any time the content has to be accessed, it must flow through D's account via a forwarded API.
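The keyword-based curation step might look like this; the tag-overlap matching rule is an assumption, since the article only says snippets are segregated by keywords:

```python
# Minimal sketch of a distributor curating snippets for a valuator
# based on overlapping tags/keywords.
def curate(snippets, interests):
    """Return only snippets whose tags overlap the valuator's interests."""
    interests = set(interests)
    return [s for s in snippets if interests & set(s["tags"])]


cloud = [
    {"id": 1, "tags": ["politics", "news"]},
    {"id": 2, "tags": ["music"]},
    {"id": 3, "tags": ["news", "sports"]},
]
assert [s["id"] for s in curate(cloud, ["news"])] == [1, 3]
```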

Consensus on content validation:

The core driver of this system is to achieve a best-response equilibrium (a Nash equilibrium) over the three options offered to a valuator: validate, pass, or dismiss the content based on its quality. This is done by setting payoffs for each action in such a way that the collective swarm is inclined to accept or reject the content with consensus.

The default strategy is that each peer accepts content that is well within the guidelines set by the economy and rejects it otherwise. Payoffs must be set so that deviating from this strategy leads to a penalty or a lower payoff. A peer rating (the percentage of correct opinions) is also maintained, so that a repeat offender's opinion matters less and an honest mistake by a highly rated peer does not cost much.

The payoff matrix is designed in terms of:

Ri = rating of a valuator Pv

n+ = number of Pv having a positive opinion

n- = number of Pv having a negative opinion
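One plausible reading of these quantities is a rating-weighted vote; the weighting and the threshold value below are assumptions for illustration, since the article gives the definitions but not the full matrix:

```python
# Hypothetical rating-weighted consensus over valuator opinions.
def weighted_consensus(votes, threshold=0.6):
    """votes: list of (rating Ri, opinion) where opinion is +1 or -1.
    Content is accepted when the rating-weighted share of positive
    opinions passes the quality threshold (50-70% in the article)."""
    pos = sum(r for r, o in votes if o > 0)   # rating mass behind n+
    neg = sum(r for r, o in votes if o < 0)   # rating mass behind n-
    total = pos + neg
    return total > 0 and pos / total >= threshold


votes = [(0.9, +1), (0.8, +1), (0.3, -1)]  # two trusted yes, one weak no
assert weighted_consensus(votes)           # 1.7 / 2.0 = 0.85 >= 0.6
```

Weighting by Ri directly implements the stated goal that a repeat offender's opinion matters less than that of a highly rated peer.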

5. Content rights and distribution:

Content rights are set by the author in the content production phase. Since content is signed with the distributor's private key, it can only be accessed through the distributor's portal API. In the special case where the author is also the distributor, all distribution goes through the author's portal API. Either way, any user, whether an end user approaching the network directly or one coming through a Layer 3 aggregator, has to go through either the author or a distributor to get the content. This ensures content is not used outside the system without the authors' consent. Additionally, it eases payment for content by pin-pointing the value exchange.
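A payout along the recorded ownership rights could be computed as follows; the flat percentage-share model is an assumption for illustration:

```python
# Sketch of splitting a sale's revenue along fractional ownership shares
# recorded at content-production time.
def split_revenue(amount, ownership):
    """ownership: mapping of party -> fractional share, summing to 1.0.
    Returns each party's payout for a sale routed through a distributor."""
    if abs(sum(ownership.values()) - 1.0) > 1e-9:
        raise ValueError("ownership shares must sum to 1.0")
    return {party: round(amount * share, 2)
            for party, share in ownership.items()}


# Example: the author kept 70%, the distributor took 30% for its services.
payout = split_revenue(10.00, {"author": 0.70, "distributor": 0.30})
assert payout == {"author": 7.00, "distributor": 3.00}
```

When an aggregator pays a distributor, the same split would be applied one layer down to forward the author's cut.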

6. Usage

We are working on an application of this architecture as an answer to fake news and propaganda (NewX).

In this case, Layer 1 consists of independent journalists and Layer 2 of news distributors or mouthpieces. As suggested above, a journalist who gets news firsthand pushes it to the network, where other journalists verify it as Pv, whose actions are incentivized positively or negatively depending on their opinion.

Now, since news is not an opinion but a fact, only the journalists who can actually verify the news will get a positive payout. An unsure verifier cannot profitably "guess" the validity, since the negative side of the payout includes a loss of digital currency (we are proposing digital tokens as the reward) as well as permanent damage to the reputation Ri. A better strategy is to pass on a news item that a peer cannot validate himself. On a macro scale, the default strategy becomes: verify the news only if completely sure about it.

Another usage we are working on is MusX, an application centered on overhauling the music industry for good.

In this case, Layer 1 consists of musicians and Layer 2 of record labels. With a low entry barrier, distributed traffic and the use of IPFS, it empowers artists and music enthusiasts to easily spread music while making sure that its quality is never compromised.

References:

1. u.arizona.edu/~mwalker/10_GameTheory/RepeatedPrisonersDilemma.pdf

2. economics.utoronto.ca/osborne/igt/nash.pdf

3. Bitcoin whitepaper: bitcoin.org/bitcoin.pdf
