In 2019 we conducted a number of experiments on the token economics of the protocol. Moving into 2020, you can expect us to continue experimenting with and improving the economics of the protocol.
We believe that over 2018 and 2019 we have identified and tested the most effective economic primitives. In the process, we found that several of the concepts described in the initial white paper needed to be revised. Having found a solid foundation from which the token derives value, we believe the next phase is communicating our proposition to the open market.
This blog was originally posted in the December update blog:
Building a liquid market
Having a protocol token in the open market only has a function (beyond being a speculative vessel) if there is a liquid market for it. Proper price discovery will allow integrators to hold and use the token as they do with any other utility on their books. The primary objective of our token economics is a liquid market.
We are convinced that liquidity is achieved by focusing on three pillars in 2020:
1. Maximizing token exposure. Ensure that supply and demand for the token are able to find an equilibrium. It has become evident that our current exchange offering does not meet this standard; therefore an additional listing has been secured.
Meaning: In Q2 we will add a major exchange, pending certain factors.
2. Providing provable data. In traditional financial markets, very little of the action is fueled by the contents of a blog. Quarterly reports are full of words, but those words are context; in the end, it is the cold hard numbers that make the difference.
Meaning: In 2020 we will push to ensure that both supply and demand for GET can be determined purely from data present on the blockchain.
3. Simplify & Decentralize. Complex systems scale horribly. If only a select few are able to figure out what is really going on, a protocol is unlikely to be trusted for anything more than gimmick features.
Meaning: In 2020 we aim to greatly reduce the number of ‘moving parts’ in the economics of GET.
GET Scarcity & Burning policy
Mid-2019 we announced that burning would become a crucial part of the GET economics. At the moment of writing (December 2019) we are still in the process of optimizing and improving the details of the burn mechanics.
We have set the target that by 2022, 50% of the current total supply will have been burned. To ensure this, we have designed a dynamic burn mechanism: the burn ratio and GET sourcing will be algorithmically determined to hit the 2022 scarcity target, reducing the centralized distribution of the token caused by the initial minting.
The aim is that from 2022 onward the protocol will be unrestricted and decentralized. In other words, from that point we aim for GET to be assigned a governance role alongside its current utility role.
Dynamic burn ratio?
The dynamic burn ratio determines how much GET is burned per transaction. At the moment, the amount of GET burned per ticket state change can ‘feel’ rather arbitrary. By making the burn model public and using only publicly verifiable data as input, it will be possible to follow the logic of how much GET was burned and why.
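To make this concrete, here is a minimal sketch of how such a dynamic burn ratio could be computed. It assumes, purely hypothetically, that the burn still required to hit the 2022 scarcity target is spread evenly over the expected remaining ticket volume; the actual model has not been published yet, and all names and numbers below are invented for illustration.

```python
def dynamic_burn_per_ticket(current_supply: float,
                            target_supply: float,
                            expected_tickets_remaining: int) -> float:
    """Hypothetical burn schedule: spread the burn still needed to reach
    the supply target evenly over the expected remaining ticket volume."""
    remaining_burn = max(current_supply - target_supply, 0.0)
    if expected_tickets_remaining <= 0:
        return 0.0
    return remaining_burn / expected_tickets_remaining

# Made-up example: 100M supply, 50M target, 10M tickets still expected
print(dynamic_burn_per_ticket(100_000_000, 50_000_000, 10_000_000))  # 5.0 GET per ticket
```

Because every input is public, anyone can recompute the ratio and verify why a given amount of GET was burned.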
Predictable supply mechanics
We suggest that if traders and users are able to model possible future scenarios and apply these to the supply of GET (within a reasonable range of certainty), their ability to determine the spot value is drastically improved. We believe such mechanisms will prove instrumental in creating a liquid open market. In a similar manner, forex investors prefer currencies backed by central banks with consistent policy. Say what you want about maths, but there is nothing more consistent than a formula.
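As an illustration of the kind of scenario modeling this enables, the sketch below projects the GET supply month by month under a fixed burn-per-ticket assumption. All figures and parameter names are invented for illustration; they are not the protocol's actual numbers.

```python
def project_supply(initial_supply: float,
                   burn_per_ticket: float,
                   tickets_per_month: int,
                   months: int) -> list:
    """Project the token supply month by month under a constant burn rate."""
    supply = initial_supply
    projection = []
    for _ in range(months):
        supply -= burn_per_ticket * tickets_per_month
        projection.append(supply)
    return projection

# A trader modeling a scenario end-to-end from public inputs:
print(project_supply(100_000_000, 5.0, 500_000, 3))
```

With a published formula, two people feeding in the same public data will always arrive at the same supply curve.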
More details on the burn mechanism will be made public as soon as they are fully finalized.
For those less interested in the nuts and bolts of the token economics: not to worry. The high-level mechanics of the GET Protocol remain unchanged. More transactions still means more GET burned. This has not and will not change.
Ticket interoperability 2020
At the heart of the GET Protocol is the ability to make tickets truly digital. Having a fast and transparent data layer to store the relevant ticket states is a key requirement for the platform to truly scale. Why? Because without a clear way of checking, registering and propagating the state changes of a ticket, there will be no consensus about the current state of all tickets. Without a standard, it is not possible to track tickets distributed via different sales funnels. This fragmentation creates huge inefficiencies for both issuers and buyers.
More funnels, more $ale$
Digitization of tickets brings a wide array of benefits: less scalping, less fraud and better fan data. However, the client making the buy decision will always prefer the system that boosts their ability to sell more tickets. Artists might tweet with their heart, but mo$t of them $ign ticketing contract$ with their wallet.
Issuing tickets on different platforms (and even via different ticketing companies) is very common in the industry, something we witnessed ourselves during ADE.
A recap of our first Amsterdam Dance Event
Selling 11,000 tickets to people from 25 different countries for 12 events in 4 days.
For the protocol to have mass-market appeal, being able to process, track and monetize tickets from different funnels is crucial, including tickets that we would classify as ‘dumb’. This does not mean we are compromising on ticket security; we are merely allowing less secure tickets to be tracked alongside smarter tickets.
Luckily for us, there is a big downside for ticket issuers using multiple funnels with ‘dumb tickets’. After the QR codes are issued to a funnel, they cannot easily be revoked, changed or interacted with. It is like exposing your private key in public: even if you manage to delete the key quickly, you can never be sure it wasn’t stored until it is too late. Take for example this instance:
Ticketmaster denies mass ticket-scalping reports as ‘categorically untrue’
Ticketmaster is calling reports of collusion with ticket-scalpers “categorically untrue.” CBC News and the Toronto Star…
There is no way of knowing who is right or wrong in this Ticketmaster case (although I would dare to take a bet). Due to their reputation, anything they say that requires the public’s trust will not be believed. What if they were telling the truth all along? The injustice! Probably not, though.
No party can make a meaningful claim in this case. The ‘dumb QR codes’ cannot be tracked, as they are essentially naked private keys. We cannot figure out who sold the ticket or from whom it originated. This situation favors those who like shadows and doubt; it draws out bad, immoral actors. Allow me to introduce: the ticketing industry.
Back to reality: ‘disagreeing with capitalism’-as-a-service never really took hold with the mainstream. So we Dutch socialists had better not finger-wag too long at the big bad wolf called ‘driving shareholder value’. If we want to improve the way tickets are issued, we had better $peak Benjamin$.
Let me try.
More transparency, more sales
The key idea of the GET Protocol is to standardize the way we register a ticket changing state, regardless of the back-end of the ticketing company. With such standardization in place, tickets of a single event can be propagated to multiple funnels without losing control. Only the necessary changes to the fundamental state of a ticket are propagated (so no sensitive information that companies do not want to share for competitive reasons).
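As a sketch of what such a standardized record could look like, the dataclass below captures a single ticket state change and chains it to the previous one by hash, so a full history can be verified without exposing anything beyond the fundamental state. The field names and state labels are illustrative assumptions, not the published GET Protocol standard.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class TicketStateChange:
    event_id: str
    ticket_id: str
    new_state: str   # e.g. "SOLD", "RESOLD", "SCANNED" (illustrative labels)
    timestamp: int   # unix time of the state change
    prev_hash: str   # digest of the previous change, chaining the history

    def digest(self) -> str:
        """Deterministic hash of this record, usable as the next prev_hash."""
        payload = json.dumps(asdict(self), sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

# Two chained state changes for one (hypothetical) ticket:
sold = TicketStateChange("event-1", "ticket-42", "SOLD", 1577836800, "0" * 64)
scanned = TicketStateChange("event-1", "ticket-42", "SCANNED", 1577923200, sold.digest())
print(scanned.digest())
```

Any funnel can append such records without learning anything about pricing or customer data held in another funnel's back-end.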
Finally; the sound of the cash register
The GET Protocol ticket transparency add-on will allow ticket issuers to track the tickets of an event across several funnels, allowing them to optimize funnels as they go. In addition, they will be able to effectively interact with the current owners of a ticket.
Every hour, Stoolbox registers a batch of ticket state changes to the blockchain.
There is no information to be extracted from analyzing a single ticket mutation in such an IPFS batch in isolation. Only when the complete history of the graph tree is downloaded and analyzed is it possible to draw conclusions. This iterative process of crawling the IPFS batches and building a state tree is conducted by the ticket explorer.
The work on achieving this interoperability between ticket inventories is still ongoing. There are a lot of problems to solve and tools to be built. By open-sourcing the GET Protocol standard as well as all the ticket explorer code, we aim to foster an open-source developer community.
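A minimal sketch of that crawl-and-fold step is shown below. It assumes, hypothetically, that each batch is an oldest-first list of dicts with `ticket_id` and `new_state` fields; the real batch format has not yet been published.

```python
from collections import defaultdict

def build_state_tree(batches):
    """Fold hourly batches of ticket state changes into per-ticket histories.

    `batches` is assumed to be ordered oldest-first, each batch being a list
    of state-change dicts as they would be read from an IPFS batch.
    """
    histories = defaultdict(list)
    for batch in batches:
        for change in batch:
            histories[change["ticket_id"]].append(change["new_state"])
    return dict(histories)

# Two hourly batches covering two hypothetical tickets:
batches = [
    [{"ticket_id": "t1", "new_state": "SOLD"}],
    [{"ticket_id": "t1", "new_state": "RESOLD"},
     {"ticket_id": "t2", "new_state": "SOLD"}],
]
print(build_state_tree(batches))  # {'t1': ['SOLD', 'RESOLD'], 't2': ['SOLD']}
```

Only after this fold does a single mutation gain meaning: the explorer can see that `t1` was resold after its initial sale, which no individual batch reveals on its own.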
The GET Protocol Ticket Standard
The first documentation and specifications on how the GET Protocol registers events, tickets and more will be released by the end of Q1 2020. This technical specification will allow anybody with an internet connection to verify the ticket data made public by the GET Protocol.
To effectively demonstrate the type of transparency we are providing, the GET Protocol Foundation will open-source the first iteration of the ticket explorer. The open-source code base of the explorer is expected to go into production toward the end of Q1.
“Let the hashes do all the talking.”
More about GET Protocol
Any questions, or want to know more about what we do? Join our active Telegram community for any questions you might have, read our white paper, visit the website, or join the discussion on the GET Protocol Reddit. Or get yourself a smart event ticket in our sandbox environment and download the GUTS Tickets app on iOS or Android.