TokenEngineering meetup #2 — a look back

Balázs Némethi
Taqanu
Jun 1, 2018
Final Board

May 28th marked the day of the second Berlin TokenEngineering meetup, hosted by OceanProtocol. #TokenEngineering is an emerging term in the blockchain ecosystem for an engineering-oriented approach to incentive mechanism design. The term was first used by Trent McConaghy, a renowned AI researcher, entrepreneur and blockchain pioneer who, together with other blockchain technology enthusiasts, realised that with existing knowledge we should be able to set up a framework to create, to actually engineer, new incentive models using blockchain technologies.

“Get people to do things”

This post is a personal account of the meetup, as Taqanu was one of the analysed projects.

TokenEngineering meetup kickoff moments.

The meetup’s structure was simple. Trent opened the show with a quick 10-minute presentation (without slides) on how TokenEngineering came to be and what we can expect from its evolution in the future. If you are interested in learning more about the project, visit http://tokenengineering.net/building-blocks or follow the #tokenengineering hashtag on Twitter. (Meetups are already running in several countries.)

Following that, two participants were offered the stage to present the outlines and most important characteristics of their projects. Taqanu presented the architecture of a horizontally scalable, mission-critical identity system, and another, as yet unnamed project showed how prediction markets with a twist could form the basis of a global impact investment framework. (That project was presented by Elad; contact him for more details.)

Elad presenting

Following our presentations, the group of forty-ish token engineers split into two, each group joining one of the projects.

Group Taqanu at the second TokenEngineering meetup (the other group is behind the glass on the left)

The group joining Taqanu included Trent, so we could experience first-hand what he envisioned when starting this movement. He only set the framework and steered the conversation so it would not get stuck on less important details, pushing the token frameworks to be discussed along a structured method.

The overall method consisted of three key elements.

We can approach token design as optimisation design.
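Put in optimisation terms (my own paraphrase and notation, not anything written on the board), the task looks roughly like this:

```latex
% A rough restatement of "token design as optimisation design".
% d ranges over the space D of candidate token mechanics;
% both functions are informal stand-ins, not defined at the meetup.
\max_{d \in \mathcal{D}} \ \mathrm{objectives}(d)
\quad \text{subject to} \quad \mathrm{constraints}(d) \ \text{holding}
```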

  • Formulate the goal (problem) of the platform
    by answering the question: what does this platform want to incentivise?
  • Try already existing patterns
    Fix objectives and constraints: list as many as possible, then reduce them to the most important ones. Keep adding extra ones, as some only come to mind at a later stage.
    Elaborate on the basic patterns: find out whether each pattern fulfils all the predetermined objectives and constraints by quickly going through it in a mind-game-like session.
  • Alter, improve and change the token mechanics (design new mechanics)
    until the model fulfils all points.

The original three steps can be found in Trent’s “Towards a Practice of Token Engineering”, section 4.3, “Token Methodology”.
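To make the loop concrete, here is a minimal sketch of the method as an iterative search, in Python. Everything in it (the TokenModel and Constraint classes, the example mechanics and checks) is my own hypothetical illustration, not code from the meetup or from Trent’s paper.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class TokenModel:
    name: str
    features: set = field(default_factory=set)

    def variant(self, mechanic: str) -> "TokenModel":
        """Step 3: derive the next version by adding a mechanic."""
        return TokenModel(f"{self.name} + {mechanic}", self.features | {mechanic})

@dataclass
class Constraint:
    description: str
    check: Callable  # the "mind-game" walkthrough, encoded as a predicate

def evaluate(model: TokenModel, constraints: list) -> list:
    """Return the constraints the model still fails."""
    return [c.description for c in constraints if not c.check(model)]

# Step 1: the goal is fixed up front. Step 2: encode objectives/constraints
# (two illustrative ones from the board) and start from an existing pattern.
constraints = [
    Constraint("prevent spam", lambda m: "staking" in m.features),
    Constraint("incentives for services", lambda m: "block reward" in m.features),
]
model = TokenModel("no token")  # the simplest pattern, as at the meetup

# Step 3: alter and extend the mechanics until every constraint is met,
# keeping every intermediate version as a record of the model's evolution.
for mechanic in ["staking", "block reward"]:
    failures = evaluate(model, constraints)
    if not failures:
        break
    print(f"{model.name!r} fails: {failures}")
    model = model.variant(mechanic)

print(f"final: {model.name!r}, open issues: {evaluate(model, constraints)}")
```

Each printed line roughly mirrors a column of the working board: a version, the constraints it still fails, and the mechanic added in response.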

This methodical evolution of the token model makes it possible to build up a structured record of a range of slightly improved (modified) versions of the basic model.

Section of working board

Through this token engineering method we evaluated ten different token models, developed in plain sight, which gave us a thorough overview of the best direction to take. Due to time constraints we did not finish the modelling: it was getting very late, and the discussion of nuanced details turned into more in-depth analyses of the issues.

The end result is a mesmerisingly simple, straightforward table in which the advancements of the model can easily be reverse-tokenengineered. (Pun intended.)

Short recap of the 10 versions and a visual representation of how the elements were evaluated.

Before jumping into the method, we went through how the architecture works and what use cases such a scalable system can realise (the leftover drawing can be seen at the top middle of the board). This was important so that everyone less familiar with the project could reach the same level of understanding.

Following that, we wrote down the goal of the token. (It was only agreed verbally; it should have been written in the upper left corner.)

After agreeing on a general direction, we started listing the objectives and constraints that would have to be met while analysing the different token models later. (This list was extended multiple times during modelling.)

Initial version (obj — objective, c — constraint):

obj: maximise the level of each ID attestation at every level (pyramid)
c: GDPR compliance
c: attested IDs are actually useful to the individuals being verified (receiving end)
obj: minimise counterparty risk
c: prevent spam
c: (prevent) collusion
c: (have) incentives for services
c: off-chain storage paid for

additions:

c: minimise/prevent bribery of claims
c: standards aligned
soft c: over 50% of claim givers and attesters behave well
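For readers who like to see such a checklist as something executable, here is one hypothetical way to encode it, treating hard constraints as must-haves and the soft constraint as a tolerated assumption. The pass/fail values in the example are placeholders, not the group’s actual evaluations.

```python
# Hypothetical encoding of the board's checklist; placeholder data only.
HARD_CONSTRAINTS = [
    "GDPR compliance",
    "attested IDs are useful to the verified individual",
    "prevent spam",
    "prevent collusion",
    "incentives for services",
    "off-chain storage paid for",
    "minimise/prevent bribery of claims",
    "standards aligned",
]
SOFT_CONSTRAINTS = [
    "over 50% of claim givers and attesters behave well",
]

def verdict(results: dict) -> str:
    """A model is viable only if every hard constraint holds;
    soft constraints are assumptions we note, not requirements."""
    missing = [c for c in HARD_CONSTRAINTS if not results.get(c, False)]
    return "viable" if not missing else f"fails: {missing}"

# e.g. a bare-bones model like #1 ("no token") from the list below:
print(verdict({"GDPR compliance": True, "standards aligned": True}))
```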

The first model assessed was the no-token version, and from there we advanced towards more and more sophisticated models.

The 10 models discussed (right edge of the board); as you can see on the comparison image, we initially aimed for only 6 versions.

Final look
  1. No token, off-chain data storage (PII)
  2. #1 + shared state ledger, standardised claims (e.g. verifiable claims)
  3. #2 + tokens for staking claims — fisherman — challenge-reward process (OK vs KO; a toy sketch of this game follows after the list)
  4. #3 + tokens for staking attestations
  5. Each claim follows a layered TCR — stake, time, #attestations, #references, interactions, +st., +ol.
  6. Block reward for each claim (weighted), once attested, once without.
    Stake on both ends.
  7. #6 with a 3rd-party ask against collusion
  8. #6 but with web-of-trust verification against collusion, incl. advanced governance to easily identify and fix collusion
  9. # + block reward if you (the attester) store the data for 2 years
  10. # + free use of storage/cloud store for PII and how the wallet uses it
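As promised under model #3, here is a toy sketch of the fisherman-style challenge-reward game. The payoff rules and numbers are illustrative assumptions of mine, not what the group specified on the board.

```python
from dataclasses import dataclass

@dataclass
class Claim:
    claimant: str
    stake: float   # claimant's tokens at risk behind the claim
    valid: bool    # ground truth, revealed by adjudication

def challenge(claim: Claim, fisherman_stake: float) -> dict:
    """Settle a challenge: if the claim is bogus (KO), the fisherman
    wins the claimant's stake; if it holds (OK), the fisherman's
    stake is forfeited to the claimant."""
    if claim.valid:  # OK: the challenge fails
        return {"claimant": claim.stake + fisherman_stake, "fisherman": 0.0}
    # KO: the challenge succeeds
    return {"claimant": 0.0, "fisherman": fisherman_stake + claim.stake}

# Bogus claims become unprofitable once the required stake exceeds the
# expected gain from spamming, which is what the constraint list asks for.
print(challenge(Claim("alice", stake=10.0, valid=False), fisherman_stake=5.0))
# -> {'claimant': 0.0, 'fisherman': 15.0}
```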

Hopefully this summary of the second TokenEngineering session will help the community better understand the power of tokenisation and how “simply” it can be done when methodical engineering is used.

Notes:

  • The table does not reveal system difficulty (it was discussed but not evaluated on the board).
  • The table is not finished, as some questions are still open and unanswered.
  • Additional constraints and objectives were discovered but are not included, for simplicity.
  • This is only one way to develop a solution; there are other potentially good resolutions as well.
  • The system architecture was not discussed in detail in this article.

A big thank you to the group for helping the project better assess tokens, to OceanProtocol for hosting us, and a very special thanks to Angela Kreitenweis for organising the meetup and managing the logistics.

If you are interested in joining the TokenEngineering movement, come and visit http://tokenengineering.net/community, where meetups like this one and others are linked.

Join Us! :) http://tokenengineering.net/community
