How to Design a Token System

#tokenengineering #web3

Shermin Voshmgir
Token Kitchen
22 min read · Feb 28, 2020


This post has been updated on May 7, 2023 with the new chapter contents of the 3rd edition of Token Economy: Money, NFTs & DeFi. The post gives a brief introduction to the concepts of design thinking and engineering, and describes how the term engineering is being used in the context of token systems design by a growing “token engineering” community. I will analyze which questions are relevant in the design and engineering process of a new token system, and which knowledge domains are relevant to achieving this goal, depending on what type of token one aims to create.

The term “design thinking” dates back to the 1950s and became popular in the business community of the 1990s, when design thinking approaches found widespread adoption in the product design practices of startups and scaleups of the Web2 to strategically plan concepts for any new technology, product, or service. The term describes a set of creativity techniques for problem-solving of novel topics that are solution oriented, yet holistic in their approach. The process includes problem definition, ideation, solution-focused strategies, modeling, prototyping, testing, and evaluating, including iterative feedback loops, with a particular focus on user-centered or human-centered design.

With the emerging Web3, the term “engineering” is being used in the context of designing token systems by a growing “token engineering” community. The motivation behind using the word engineering (instead of design) is to do justice to the infrastructural and mission-critical nature of Web3 networks and many of their potential applications. Trent McConaghy states, “Engineering is about rigorous analysis, design, and verification of systems; all assisted by tools that reconcile theory with practice. Engineering is also a discipline of responsibility: being ethically and professionally accountable to the machines that you build, as illustrated by the Tacoma Narrows Bridge videos and iron rings.” He was probably the first person to coin the term “token engineering,” hoping that “token ecosystem design would also become a field of rigorous analysis, design, and verification. It would have tools that reconcile theory with practice. It would be guided by a sense of responsibility.”

The terms “design” and “engineering” are closely related but not the same. Rather, they complement each other. While the term “design” might be a better-known and intuitive term, carrying a more subjective, creative, and even artistic meaning, the term “engineering” tends to bring the technical aspects, the composition of inert parts to create a predictable and robust whole, to the forefront. “A design is a plan or specification for the construction of an object or system or for the implementation of an activity or process, or the result of that plan or specification in the form of a prototype, product or process.” “Engineering refers to the use of scientific principles to design and build machines, structures, and other items, including bridges, tunnels, roads, vehicles, and buildings.” Design is, therefore, a part of an engineering process. The term “engineering design” is used to describe the part of the engineering process which is open-ended and ultimately more subjective.

Similar to electrical engineering and public policy design, token engineering is about the rigorous analysis, design, and verification of systems and their underlying assumptions, assisted by tools that reconcile theory with practice. As opposed to electrical engineering, where one designs a system within which certain behavior can be expected, token engineering is much more comparable to steering national economies and public policy design, as it requires much “fuzzier” modeling techniques. With the emergence of AI and better simulation tools, we might be able to design and deploy more effective purpose-driven tokens that also factor in unknown probability distributions, unknown or adversarial behaviors of agents, potential network externalities, and “tragedy of the commons” effects incurred on other parts of society.

While the “token engineering” community points out the necessity for rigorous software engineering practices, it seems to me that in the outlined theories and the lived praxis, the focus is often on the “technical engineering” aspects of a token system. A look at the composition of team members of most blockchain/Web3/token startups reflects this techno-centricity quite well. Engineering, however, is the practice of creating a technology that ultimately always has a social goal. Looking at engineering through a purely technological lens perpetuates a reductionist view of why and how we build technology.

There seems to be a growing understanding of the need to use the term “engineering” in a broader sense when designing a token system. Web3, with its distributed ledgers and smart contracts, provides a governance layer and an economic layer for the Internet. If something goes wrong, the collateral damage is high. The crypto community witnessed the effects of a too narrow definition of engineering with the “TheDAO” exploit of 2016, the “Parity” multisignature contract exploit of 2017, and the DeFi hacks since 2020 that have been mentioned in previous chapters of the book. In all of these examples, millions were drained from one or more smart contracts in technical exploits, because the economic design and potential attack vectors of these smart contracts had not been sufficiently audited.

I therefore suggest that we explicitly distinguish between “technical engineering,” “legal engineering,” “economic engineering,” and “ethical engineering” when designing a token system.

Token Engineering Fields, Source: Token Economy: Money, NFTs & DeFi, by Shermin Voshmgir

Technical Engineering

When creating a token system, one first needs to decide whether to build one’s own blockchain network and other Web3 infrastructure, or whether the token can be deployed on an existing blockchain infrastructure. While most token projects today choose to build on top of an existing blockchain infrastructure — most dominantly the Ethereum network — some use cases for purpose-driven tokens that steer DAOs have chosen to build their own blockchain network. The advantage of building one’s own blockchain network is that one can create an infrastructure with the exact technical properties one needs. On the other hand, the overheads of development and maintenance are high, especially in light of potential security risks, and a blockchain infrastructure also depends on critical network effects. This is probably the reason why some token projects, which previously had their own blockchain infrastructure, eventually decided to migrate to a third-party blockchain network.

Application tokens vs. infrastructure tokens: “Infrastructure tokens” — often also referred to as “protocol tokens” — are tokens that either steer public blockchain networks (1st layer), their second-layer protocols such as state channels, or other Web3 protocols such as distributed file storage networks. Infrastructure tokens are purpose-driven and have a very clear role in a public blockchain network or another layer of a blockchain protocol stack, namely to incentivize the collectively maintained network and to keep it safe from attacks. Infrastructure tokens are the internal currency of the protocol’s ecosystem. They can serve as block validation incentives (such as miner rewards), be used for transaction spam prevention, and they are often also needed to pay for transaction fees in the network. If a token is built on an existing blockchain infrastructure using smart contracts or similar approaches, the token is referred to as an “application token.” These application tokens are managed by the underlying blockchain network or similar distributed ledger, as well as other Web3 networks that might be needed for the token use case (e.g. distributed meta-databases or file storage systems). The most important design choices in the engineering process of protocol tokens and application tokens are related to questions of security, scalability, level of decentralization, and privacy of the blockchain infrastructure they choose to use. The technical engineering decisions will need to consider the constraints and trade-offs of one solution versus another. One also needs to consider potential interoperability needs of the token system one wants to build, and choose the right token standards to meet the needs of the token use case.

Interoperability: Since blockchain networks cannot natively communicate with each other, various interoperability approaches are needed. If your token use case needs to transcend one blockchain ecosystem — and suitable interoperability solutions don’t exist — the choice of blockchain network might become crucial in order to minimize potential lock-out effects and maximize desirable network effects.

Security & durability aspects address the cryptoeconomic mechanisms that provide the level of security needed to make transactions in the P2P network safe from market manipulation and outside attacks, in the absence of clearing institutions. These security aspects are also relevant in the context of potential “durability” of the tokens which — in economics — refers to the ability of a currency to withstand repeated use. This means that the substrate of that currency should not easily vanish, decay, or rot. The Bitcoin token and many similar protocol tokens have so far proven to withstand time, being resilient against any type of censorship or network attack. A resilient network is expected to contribute to a “relatively” durable infrastructure upon which tokens can be reliably issued. If a blockchain network can be corrupted, repeatedly halts or defaults as a whole, one might not be able to use the issued tokens anymore. This would be the equivalent of a perishable medium of exchange that does not last over time.

Scalability aspects: The term “scalability” refers to the number of transactions that a blockchain network can settle per time unit (aka “throughput”), and the amount of transaction information that can be added to each transaction (aka “payload”). The first blockchain networks were very secure and decentralized but had little throughput and payload capability. Technical scalability solutions face a trade-off between security, decentralization, and scalability: maintaining security and a high level of decentralization while ensuring a high throughput of network transactions is an engineering challenge, and monolithic blockchain networks can only achieve two of these goals at the expense of the third. Different scalability approaches are currently being tested to resolve this challenge while reducing the trade-offs. The concept of modular blockchains, where different blockchain tasks are performed by different layers of a blockchain network, seems to be one promising approach to tackling this trade-off and providing better “broadband” capabilities for Web3. The topic of scalability is discussed in detail in another volume of the Token Economy Series titled “Web3 Infrastructure.”
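As a back-of-the-envelope illustration of the throughput dimension, the sketch below estimates transactions per second from block size, average transaction size, and block interval. The numbers are rough, Bitcoin-like assumptions for illustration, not measurements of any particular network.

```python
# Illustrative throughput estimate for a monolithic blockchain network.
# The parameters below are rough, Bitcoin-like assumptions.

block_size_bytes = 1_000_000      # ~1 MB block size
avg_tx_size_bytes = 400           # assumed average transaction size
block_interval_seconds = 600      # one block every ~10 minutes

txs_per_block = block_size_bytes // avg_tx_size_bytes
throughput_tps = txs_per_block / block_interval_seconds

print(f"~{txs_per_block} transactions per block")
print(f"~{throughput_tps:.1f} transactions per second")
```

With these assumptions the network settles roughly 2,500 transactions per block, or about 4 transactions per second, which illustrates why first-generation networks prioritizing security and decentralization offer comparatively little throughput.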

Standards: In the technical engineering process, one can choose from a growing list of standardized token contracts. The token standards used depend on the properties a token should have — such as fungibility, transferability, or expiry events. The introduction of standardized token contracts has simplified the token contract development process, as they offer a template for creating a range of token types. Today, every blockchain network offers its own set of token standards one can choose from when designing a token system.
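The scope of such a standard can be illustrated with a minimal, non-authoritative sketch: fungible-token standards such as ERC-20 specify an interface of balances, transfers, and a total supply. The class below mirrors that core interface in plain Python, purely for illustration; it is not an on-chain contract.

```python
# Minimal sketch of the interface a fungible-token standard describes:
# balances, transfers, and a fixed total supply (ERC-20-like, simplified).

class FungibleToken:
    def __init__(self, total_supply: int, issuer: str):
        self.total_supply = total_supply
        self.balances = {issuer: total_supply}  # issuer starts with everything

    def balance_of(self, account: str) -> int:
        return self.balances.get(account, 0)

    def transfer(self, sender: str, recipient: str, amount: int) -> None:
        if self.balance_of(sender) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[recipient] = self.balance_of(recipient) + amount

token = FungibleToken(total_supply=1_000, issuer="alice")
token.transfer("alice", "bob", 250)
print(token.balance_of("alice"), token.balance_of("bob"))  # 750 250
```

Because every unit is interchangeable, only aggregate balances are tracked; a non-fungible standard would instead track the owner of each individual token ID.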

Privacy aspects address the questions of what type of cryptographic primitives should be used to allow for the right privacy by design. Contrary to common belief, the Bitcoin network and similar blockchains do not provide full anonymity, but rather pseudonymity. The data that is included in a token transaction is public to anyone. This means that the Bitcoin network, and other blockchain networks that were designed with similar cryptographic primitives, have limited privacy by design. Anyone with the know-how and the means — such as access to certain data pools — can use the power of big data to correlate the metadata of other data points with the data points of a blockchain transaction. These other data points might be publicly available, or are available to certain institutions, such as banks and other financial institutions, including centralized token exchanges, or e-commerce platforms. National security agencies and other public authorities who can legally coerce access to the data collected by large institutions could also correlate one’s Bitcoin address with other data. Alternative cryptographic primitives have been deployed by newer blockchain networks and alternative Web3 protocols — such as “Monero,” “Zcash” or “Tornado Cash” — which aim to provide more privacy-preserving blockchain transactions. However, additional encryption can come at the cost of complexity and additional overhead. The topic of token privacy in general, and privacy tokens in particular, is discussed in detail in another volume of the Token Economy Series titled “Web3 Infrastructure.”
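The pseudonymity point can be made concrete with a simplified sketch: an address is typically derived by hashing a public key, so every transaction signed with the same key maps to the same public address and is therefore linkable by any observer. Real networks add versioning, checksums, and different hash constructions on top; the hashing below is a deliberate simplification.

```python
# Pseudonymity, not anonymity: an address is derived from a public key by
# hashing (simplified here). All transactions from the same key share one
# address, so an observer can link the full activity of that address.
import hashlib

def address_from_pubkey(pubkey: bytes) -> str:
    # Simplified: real networks layer encodings and checksums on the hash.
    return hashlib.sha256(pubkey).hexdigest()[:40]

pubkey = b"alice-public-key"
tx_log = [
    {"from": address_from_pubkey(pubkey), "amount": 5},
    {"from": address_from_pubkey(pubkey), "amount": 7},
]

# Both transactions carry the same address, so they are publicly linkable:
linked = tx_log[0]["from"] == tx_log[1]["from"]
print(linked)  # True
```

Once any single transaction of that address is correlated with an off-chain identity (an exchange account, a shipping address), the entire transaction history of the address is de-pseudonymized at once.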

Token Engineering Domains & Disciplines, Source: Token Economy: Money, NFTs & DeFi, by Shermin Voshmgir

Economic Engineering

Economic engineering is predominantly required in the design of more complex token systems, such as purpose-driven tokens that steer the collective action of community members of a decentralized organization through automated incentive mechanisms. Purpose-driven tokens incentivize collective action toward a common purpose. Such common purposes could be consensus in a public payment network, the stability mechanism of a stable token system, resource sharing in a file sharing network, reputation and curation in a social network, or the general reduction of CO2 emissions, etc. Purpose-driven tokens open up a myriad of new possibilities when it comes to regulating collective action over the Web3 in the absence of intermediaries. This type of reward structure for individual contributions towards a collective goal is a paradigm shift in value creation, and is the subject of another book of the Token Economy Series titled “DAOs & Purpose-Driven Tokens.”

When they are designed well, purpose-driven tokens can coordinate the actions of individuals towards a common goal, usually by making certain actions in a network easier or more attractive than others, thereby indirectly restricting certain actions by making them less attractive. The goal is to preserve everyone’s choice to do what they want, but to make the actions that align with the purpose of our desired token system more attractive. Economic and political governance models of token systems are mostly in their infancy, and still subject to a phase of real life trial and error with every new tokenization use case that emerges. The main questions that need to be answered in an economic design process are:

Purpose: The definition of a clear purpose of the token is necessary for the further design process. Having analyzed over 100 token systems together with the teams working with me, I have come to the conclusion that the more clearly the purpose of a token system is defined, the more resilient the network remains over time. My personal opinion is that a token should only have one purpose. Aligning the interests of multiple stakeholders with individual preferences through one incentive mechanism is already hard enough if one wants to achieve only one purpose or goal. Furthermore, every participant has a slightly different perception of what they understand as the purpose. With multiple purposes or goals, the complexity of the incentive design will grow accordingly, as will potential attack vectors. With more purposes involved, one would probably need a multi-token system, which additionally complicates a resilient economic design that can fulfill its purpose in the long run.

How many different token types do you need? Some token systems have multiple token types to steer collective action within the network. Examples that are explained throughout the different volumes of the Token Economy Series are the stable token system MakerDAO (DAI and MKR) or the decentralized social media network Steemit (STEEM, SP, SBD). Other token systems only have one token, such as the Bitcoin network and similar blockchain networks. It can generally be assumed that the more token types there are, the more complex the network dynamics of steering that network become, which is probably the reason why most blockchain networks have refrained from using multiple tokens to steer the collective actions of their networks.

Once the purpose and the token types deployed are defined, one can now derive the properties of the token(s), taking into account all technical, legal, or ethical constraints that could influence the feasibility and general dynamics of the token system.

Minting event: Tokens can either be issued against proof of contribution to the network’s purpose — in this case they are “minted” upon intrinsic value generation within their own economic system. Alternatively, tokens can be issued upon payments with money that flows from outside the token system. An art NFT that is paid for in EUR or USD, or a data token that is paid for in BTC, are examples of such extrinsic value creation.
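A minimal sketch of the intrinsic minting case, with hypothetical names and reward values: tokens are only created when a verified contribution to the network's purpose is recorded, so the token supply grows in lockstep with value generated inside the system.

```python
# Sketch: intrinsic minting -- tokens are created only against a verified
# contribution to the network's purpose. Names and reward are hypothetical.

ledger: dict[str, int] = {}
REWARD_PER_CONTRIBUTION = 10  # tokens minted per verified contribution

def mint_for_contribution(contributor: str, verified: bool) -> None:
    if not verified:
        raise ValueError("no proof of contribution, nothing minted")
    ledger[contributor] = ledger.get(contributor, 0) + REWARD_PER_CONTRIBUTION

mint_for_contribution("node-A", verified=True)
mint_for_contribution("node-A", verified=True)
print(ledger["node-A"])  # 20
```

In the extrinsic case, by contrast, the minting function would be triggered by an external payment event rather than by a proof of contribution.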

Expiry properties: A token can be designed to come with an inbuilt expiry date or expiry event. Access right tokens and usage tokens typically have expiry events and have a linear token flow. Currencies are typically not designed to have an expiry date or other expiry event, with the exception of some alternative currencies. However, currencies could be designed to have an expiry date by design, which could reduce token supply and would be an ideal mechanism to prevent hoarding. This could potentially reduce economic inequalities that are typical in economic systems where money has no expiry date and can be hoarded or invested against potential interest rewards.
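A hypothetical sketch of such an expiry property: a usage token that can no longer be redeemed after its built-in expiry date, which effectively removes it from circulating supply.

```python
# Sketch: a usage token with a built-in expiry date. After the expiry
# timestamp it can no longer be redeemed. Dates are illustrative.
from datetime import datetime, timedelta

class ExpiringToken:
    def __init__(self, issued_at: datetime, lifetime_days: int):
        self.expires_at = issued_at + timedelta(days=lifetime_days)

    def redeem(self, now: datetime) -> bool:
        return now <= self.expires_at  # redemption fails after expiry

issued = datetime(2023, 5, 1)
token = ExpiringToken(issued_at=issued, lifetime_days=30)
print(token.redeem(datetime(2023, 5, 15)))  # True
print(token.redeem(datetime(2023, 7, 1)))   # False
```

A currency with such a property could not be hoarded indefinitely, which is exactly the anti-hoarding mechanism the paragraph above describes.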

Transferability: Depending on the use case, tokens could be tied to a unique identity of a person or institution, and be designed to have limited transferability. Limited transferability automatically reduces the liquidity of a token, making it non-viable as a medium of exchange. Reputation tokens, for example, need to be tied to the identity of a person or organization in the network and should not be transferable. Transferable reputation tokens can be traded on the free market, making them non-indicative of personal behavior in the network. This was the case with “Steem Power” tokens in the Steemit ecosystem, which probably led to the downfall of an otherwise well-designed decentralized social network that, at a certain point of its evolution, had quite impressive traction and market value. (The use case of Steemit is analyzed in another book of the Token Economy Series titled “DAOs & Purpose-Driven Tokens.”)
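A minimal sketch of a non-transferable reputation token, with hypothetical names: reputation can only be earned by the identity it is bound to, and any transfer attempt is rejected, so the score stays indicative of that identity's behavior.

```python
# Sketch: reputation tokens bound to one identity. They can be earned
# but never transferred, so they remain indicative of behavior.

class ReputationLedger:
    def __init__(self):
        self.scores: dict[str, int] = {}

    def earn(self, identity: str, points: int) -> None:
        self.scores[identity] = self.scores.get(identity, 0) + points

    def transfer(self, sender: str, recipient: str, points: int) -> None:
        # Transferability is disabled by design.
        raise PermissionError("reputation tokens are non-transferable")

rep = ReputationLedger()
rep.earn("carol", 42)
try:
    rep.transfer("carol", "dave", 42)
except PermissionError as err:
    print(err)  # reputation tokens are non-transferable
```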

Fungibility & supply: If tokens are designed to have identical attributes, they have fungible properties and could act as a medium of exchange (aka payment token or currency) within and potentially also outside their token system. If they are designed to serve as the designated medium of exchange of a decentralized organization, the overall monetary policy of that decentralized organization — such as token supply, inflation rate, or relevant network taxes — needs to be conceptualized in the economic design of the token.
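As an illustration of such a monetary policy, the sketch below reproduces a Bitcoin-style supply schedule, in which the block reward halves every 210,000 blocks; summing the rewards across all halving eras caps the total supply at roughly 21 million tokens.

```python
# Monetary-policy sketch: a Bitcoin-style supply schedule where the block
# reward halves every 210,000 blocks, which caps the total supply.

initial_reward = 50          # tokens minted per block in the first era
blocks_per_era = 210_000     # blocks between halvings

total = 0.0
reward = float(initial_reward)
while reward >= 1e-8:        # stop once the reward rounds to zero
    total += reward * blocks_per_era
    reward /= 2

print(f"approximate maximum supply: {total:,.0f}")  # ~21,000,000
```

A token designer tunes exactly these parameters (initial reward, halving interval, and hence inflation rate over time) when conceptualizing the monetary policy of a fungible token.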

Stability: Short-term stability of value is one of the most important functions of a medium of exchange: it allows the token to serve as a unit of account and is fundamental for economic planning. As described in previous chapters, while Bitcoin introduced a groundbreaking consensus algorithm, it came with a rudimentary monetary policy that simply regulates and limits the amount of tokens minted over time. The Bitcoin protocol does not provide an economic algorithm that guarantees price stability. Depending on the type of token system designed, price stability might be desirable, especially in the case of payment tokens that are intended as a day-to-day medium of exchange. If price stability is needed, appropriate stability mechanisms either need to be designed into the main token, or else an additional stable token will be necessary. If the stability mechanism is not built into the main token, one can create one’s own stable token or use a third-party stable token. The challenge with third-party stable tokens is that one gives up economic autonomy and control over a key aspect of the token system. Building one’s own stable token means additional overheads and complexities, but potentially also comes with much more autonomy.
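A heavily simplified sketch of one common stability mechanism, loosely inspired by over-collateralized systems such as MakerDAO: stable tokens may only be minted while the locked collateral stays above a minimum ratio. The ratio and prices below are illustrative assumptions, not any protocol's actual parameters.

```python
# Sketch of an over-collateralization rule for a stable token pegged
# 1:1 to USD. The 150% minimum ratio is an illustrative assumption.

MIN_COLLATERAL_RATIO = 1.5   # e.g. 150% collateral required

def max_mintable(collateral_value_usd: float) -> float:
    """Stable tokens (1 token = 1 USD) mintable against the collateral."""
    return collateral_value_usd / MIN_COLLATERAL_RATIO

print(max_mintable(300.0))  # 200.0 stable tokens against $300 collateral
```

The buffer between the collateral value and the issued stable tokens is what absorbs price swings of the collateral asset; if the ratio falls below the minimum, real systems liquidate the position.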

Rights attached: Property rights, access rights, usage rights, and management rights are the most relevant rights to model into a token contract from an economic design perspective. While voting rights might not seem that important from an economic standpoint, they do have long-term economic implications. If voting rights are proportional to the amount of network tokens one owns, this will likely attract economically affluent stakeholders, who will be able to buy more voting power with more stake in the system — influencing the system to their advantage. Long-term power distributions need to be considered when designing the different rights attached to a token, especially voting rights, as they influence factors such as the Gini coefficient of the tokenized network, and can make network participation less attractive for smaller token holders or new participants.
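The Gini coefficient mentioned above can be computed directly from a list of token balances. The sketch below uses a standard discrete formula (0 means perfectly equal holdings; values near 1 mean highly concentrated holdings); the balances are hypothetical.

```python
# Sketch: the Gini coefficient of a token distribution, using a standard
# discrete formula. 0 = perfectly equal, near 1 = highly concentrated.

def gini(balances: list[float]) -> float:
    xs = sorted(balances)
    n = len(xs)
    cumulative = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cumulative) / (n * sum(xs)) - (n + 1) / n

equal = [100, 100, 100, 100]
concentrated = [1, 1, 1, 397]
print(round(gini(equal), 2))         # 0.0
print(round(gini(concentrated), 2))  # 0.74
```

Tracking this metric over time is one simple way to monitor whether stake-weighted voting rights are concentrating power in a tokenized network.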

The theoretical and applied know-how that will be necessary to design a token system from an economic perspective can be found in economics, network science, cyber-physical systems, and sociotechnical systems.

  • Economics deals with the studies of economic institutions, policies, and ethics, including questions of resource allocation, wealth disparities, and market dynamics in the context of the production, distribution, and consumption of goods and services.
  • Network science studies complex networks, from biological networks to classic telecommunication networks, computer networks to social networks. Methods used include mathematics, physics, computer science, and sociology.
  • Cyber-physical systems are mechanisms that are controlled or monitored by computer-based algorithms, tightly integrated with the Internet and its users. Examples include power grids and large-scale transportation systems, which both share the property that the behavior of uncontrolled human actors can create undesirable or even unsafe conditions in entirely counter-intuitive ways.
  • Sociotechnical systems: this term was first coined in the 1940s and refers to the interaction of social and technical aspects of private and public organizations and communities, online and in the real world. It refers to the studies of the complex infrastructures a society uses, such as the Internet and other communication networks, supply chains, legal systems, and human behavior. The relation can be either simple (linear cause-and-effect relationships) or complex (non-linear and hard to steer and predict cause-and-effect relationships).

Ethical Engineering

The design of token systems also requires ethical and political thinking. What type of system one wants to create is not a technological question, but a socio-economic and political one. Questions of politics, morals, and ethics will need to be answered, ideally before the design of such systems. If the founders of the token system fail to incorporate ethical questions in the design thinking process, they might create unintended “protocol bias.” The history of Web2 has shown that, eventually, all these political, moral or ethical questions will need to be resolved anyhow. However, if that is done after the fact — after a system has been created — biases are hard to reverse due to system inertia and the vested interests of existing system stakeholders (see the Cambridge Analytica scandal and the discussions about privacy, control, and social media governance that followed in its wake, and the challenges the Facebook network is facing right now). When considering political and ethical questions, it is not necessary to reinvent the wheel. Concepts such as engineering ethics are well known and can be applied to the creation of Internet-based systems, something that Silicon Valley and other big players of the Internet era have — for the most part — failed to do. In terms of token design in Web3, two of the most important ethical and political questions concern (i) the trade-off between institutional accountability (aka transparency) vs. personal privacy, and (ii) power structures in decentralized organizations:

  • Transparency vs. privacy: The trade-off between public and private interests is an age-old political discussion that has been subject to the studies of political scientists and sociologists. While the privacy rights of individual stakeholders of a token system are important, they might sometimes undermine public interest. Let’s take the case of supply chain transparency: while most consumers probably agree that they wish for more information about what happens along the supply chain of goods and services, the act of providing such a level of transparency could infringe on individual rights. Take the example of cameras installed in a factory to monitor how products are manufactured and by whom — in order to avoid child labour. The individual data that is collected can infringe on the privacy rights of the factory workers, depending on how that data is revealed via a public Web3 infrastructure. It is therefore crucial that the founders of a token system also hire a range of experts from different fields of social science, who work hand in hand with lawyers and cryptography experts of their team to account for such design questions.
  • Level of decentralization & power structures: The trade-off between decentralization, security, and scalability is a much discussed topic in blockchain networks. This trilemma raises the political question of how much decentralization is needed or wanted, depending on the use case and the values of the community. The more decentralized a blockchain network is, the lower its throughput tends to be, unless one sacrifices the security of the network. Power structures are also important when designing incentive systems and voting rights into purpose-driven tokens. If the voting rights in a decentralized organization are tied to the amount of stake one has in a system, this creates natural power imbalances that favor those with more money (in the system).
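The power imbalance of stake-weighted voting can be made concrete with a small sketch (hypothetical numbers): a single large holder can outvote all other participants combined, whereas a one-identity-one-vote scheme would weight every participant equally.

```python
# Sketch: stake-weighted voting concentrates power in large holders.
# Stakes below are hypothetical.

stakes = {"whale": 600, "a": 100, "b": 100, "c": 100, "d": 100}
total = sum(stakes.values())

# Token-weighted voting: power proportional to stake.
voting_share = {who: stake / total for who, stake in stakes.items()}
print(voting_share["whale"])  # 0.6 -> one holder controls every vote

# One-identity-one-vote as a contrast: each participant counts equally.
print(1 / len(stakes))        # 0.2
```

Mitigations discussed in governance research, such as quadratic voting or identity-bound reputation, all aim to flatten exactly this kind of distribution.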

Legal Engineering

When designing a token system, one also needs to understand the regulatory implications that the technical, economic, or ethical design might have. What might be feasible from an economic or technical perspective is not necessarily legal. One of the greatest challenges of Web3 is that the governance infrastructure of the Internet is by definition global. This is in stark contrast to how our analogue world is governed. The people on our planet are divided into more than 200 nation states and their respective jurisdictions, and this balkanized governance structure has not sufficiently caught up with the realities of the Internet — even after decades of its existence. While this is not a new problem, there are particular challenges when it comes to the decentralized nature of Web3. Web2 applications were easier to regulate, as their governing institutions were centralized (operated by companies or NGOs) with headquarters in one particular jurisdiction. In spite of this, many Web2-based organizations faced a myriad of legal challenges when trying to cater to the needs of all jurisdictions worldwide.

The regulatory aspects of token systems that are governed by a decentralized organization are a complex legal topic that could fill a book of its own — or rather several books — especially when we take into account all the jurisdictions in the world that one needs to consider. Furthermore, many regulatory institutions still need to fully grasp the potential and implications of Web3 and its tokenized applications. The lack of a clear taxonomy is another challenge one needs to master, as it makes the communication between founders of Web3 protocols and regulatory authorities harder. While a better taxonomy can be developed — as is the intention of this book — this is not always possible. The potentials of Web3 and its applications represent such a paradigm shift from previous iterations of the Internet that it is often hard to describe certain phenomena of the new world with the vocabulary of an old world: (i) Web3 is collectively governed and maintained by disparate groups of people and institutions without clear ownership structures, so who do you sanction, and where? (ii) The boundaries of what constitutes money, a financial product, and the real economy are shifting and merging with tokenization. How should regulatory bodies identify what they want or need to regulate, and which regulatory body is actually in charge, given these paradigm shifts?

Entrepreneurs are often confronted with uncertainty about how regulators might retroactively classify their token. In this interregnum, founders and developers are trying their best to break new ground in spite of regulatory uncertainties. To provide at least some certainty despite such challenges, some jurisdictions offer governmental sandboxes to enable innovation while allowing for a process of “regulatory learning.” Where such sandboxes are not offered, the anxieties on both sides of the regulatory game are considerable. The legal engineering process of tokens therefore refers to the questions of (i) how one can conceptualize a token system that is regulatory compliant, or (ii) how one needs to design a token system to avoid regulatory scrutiny and remain operational. The best token system will not last long if it is considered illegal by regulators in the relevant jurisdictions.

Legal engineering might be less challenging in “simple token systems.” The term “simple” is commonly used in the complex systems domain. In the context of token engineering, it refers to the fact that the dynamics of the business or governance models of a potential token system are well known. This is the case for tokens that represent central bank money, securities and other assets, identification and certification processes, voting rights, vouchers and coupons, entry tickets, and other access or usage rights. The respective business or governance processes of these use cases have been stress tested over decades, sometimes centuries. Potential loopholes have been closed over the years in a process of trial and error, and there are clear procedures, best practices, and regulations in place. The legal engineering process for tokenizing such use cases deals with the questions of how to make the tokenization of existing property rights, access rights, management rights, and voting rights legally compliant with local legislation. While this is a challenging task overall, it might be less complex than the legal engineering process for purpose-driven token systems. The legal engineering of decentralized organizations that are steered by purpose-driven tokens is much more challenging and potentially complex, because what needs to be regulated is in itself new and creates additional sense-making and classification requirements for authorities. In both cases, these questions are most relevant to the legal engineering process:

  • Are the project founders operating a centralized organization that is incorporated in a specific country and easily sanctioned by that country’s jurisdiction, or is the token project operated by a decentralized autonomous organization with anonymous founders and network developers or network operators who live in different jurisdictions?
  • Which transnational/national/local jurisdictions apply or need to be considered?
  • Which legal aspects need to be considered? Examples would be privacy laws, securities laws, financial market laws, foreign relations laws and sanctions lists — all of which are governed by different regulatory bodies within the same jurisdiction.
  • Given the token properties that have been conceptualized from a technical, economic and ethical perspective, one can now start to collaborate with the legal experts in the team to evaluate how to best design the token so it fulfills its goal without being sanctioned immediately by the jurisdictions that might be most important to the project’s founders.

Conclusion

In order to adequately cover the token engineering questions mentioned in this chapter, the founders of a token system need to work with a diverse team that works hand in hand and that, together, can cover the range of interdisciplinary topics that need to be addressed in all four fields of the engineering process. Having lawyers, economists, and social scientists as part of the team in addition to the technical engineers, at the executive level and below, will be paramount to developing resilient token systems. However, interdisciplinary work takes time and effort, since all four categories overlap, and the communication between the different disciplines requires some ramp-up effort. One of the biggest challenges — which might seem mundane at first — stems from the fact that the vocabulary used by different disciplines often differs when describing the same phenomena. The same word, in different disciplines, can often mean a very different thing, and lead to serious misunderstandings.

The quick and dirty approach described in the mantra “first develop, then iterate,” which often shaped the development process of Web1 and Web2 applications, does not play out well in Web3. Once the bias is baked into a protocol of a network operated by a decentralized network of stakeholders, it is hard to revert the changes without consensus of all network actors. I firmly believe that the Web3 community needs to move away from Silicon Valley “meme-based development” to an “engineering-based development” that includes all aspects of the engineering process. “Simple token systems” will probably require predominantly legal and technological engineering, while more “complex token systems” will need a good balance of all four areas.

Thanks to Michael Zargham, Claudio di Ciccio, and Sofie Schock for their input on the initial version of this blog post.

References & Further Reading

The references to the source materials used for the research of this chapter, along with a reading list for those interested in a deeper dive into the topics presented, are available on a website accessible under the following URL: https://bit.ly/3Mecwuv. Where possible, the links will be updated on a regular basis to prevent the issue of broken links.


Author of ‘Token Economy’ https://amzn.to/2W7lQ8h// Founder @tokenkitchen @blockchainhub & @crypto3conomics// Artist @kamikat.se