Part 1: Introduction
Part 2: Technical Fundamental Dynamics
Part 3: Inflationary Aspects
Part 4: Deflationary Aspects
Part 5: Numbers
Part 6: Summary
Part 7: Future
Part 1: Introduction
When Bob Dylan wrote “The Times They Are a-Changin’” in the 1960s, the world was going through a geo-political upheaval. Here we are today, and those words and that sentiment could not be more accurate. The times of change are written all over the wall. Every area of life is affected by new technologies and the paradigms that come with them.
Therefore, it is not surprising that Frederic Laloux’s book Reinventing Organizations was published mere months after Vitalik Buterin first proposed Ethereum and the smart contract idea in late 2013. To me, as an entrepreneur, anthropologist and psychologist, it looked like a synchronistic convergence of the thought streams of distributed systems engineering, economics and social organization.
Laloux is a Belgian researcher who examined companies that were governed not by top-down hierarchies but by self-organizing principles. With that, the latest culmination of novel and successful ways of social interaction had been formulated. On the digital side, J.R. Willett’s whitepaper contributed the idea of a decentralized digital entity, which was later to develop into the ERC20 standard. With Ethereum’s smart contract functionality, these ideas became fully expressed in one seamless concept.
For the first time in history, it was possible to create digital economic entities governed securely by smart contracts. The autonomous digital organization. The DAO was born, not only as a concept but as a digital organism on the Ethereum platform.
The DAO was incredibly successful and garnered worldwide attention (at least among those who were paying attention).
The DAO was a revolution. Its socio-systemic implications are nothing short of a massive phase shift at the base layer of human (economic) interaction. A promise of borderless global self-organization.
The next step of human self-organization leading to a more advanced future.
Gone seemed the times of “I tell you what to do!” Global identity, global citizenship and global services would be the hallmark of this development. Digital government models and fair automatic digital taxation and income (taxemes) distribution would lead into a new future.
It is a form of organization that could exist in the digital sphere alone. Members would secure their participation and dividends through smart contracts on the blockchain. Very early iterations of that idea were Boardroom and Slock.it’s The DAO.
The excitement about the potential of that global form of organization and collaboration carried the first fundraisers. That is why token sales were pure token sales, without bonus rounds, pre-sales or accredited investor requirements. It was seen as a sign of ethos not to have big VCs on board or to bow down to their peculiarities and wishes.
Unfortunately, as is so often the case, money talks: incentive schemes and discount structures were introduced in 2017 to mimic power dynamics known from the VC and private equity world.
At the same time, incumbent power holders started to feel slightly tickled, a pre-sensation of being threatened in their power. The sleeping giants of governments and financial institutions started to interject. This led to a still ongoing slow dance between regulators, incumbent financial institutions and crypto power holders, in which the seats of power and money are being redistributed. Unfortunately that dance, while in some ways necessary, dilutes the essence of the actual technological and organizational potential.
Be that as it may, the focus shifted towards the utility of the token in the network and how to properly align the incentives of all players at the table. The term token economics was born. It describes the balancing act of stakeholder interest alignment. Token economics is fused into the economic structure of a blockchain company and determines allocations, sizes, amounts, incentives and roles for all stakeholders. It determines how value is created, distributed and flows through the system.
The most challenging task for a company in the blockchain industry today is to align incentives within the token economy but also to tie the crypto economy to real world revenue and productization. As can be gleaned from above a fully functional DAO adapted legislation would make this task easier. We do not have a fully supportive digital legal framework for such initiatives (yet). We are in a hybrid stage where legal entities are still part of non-digital legislation but token economies have already been deployed.
In that environment, the classical approach to tying business model and token economy together is to found a non-profit foundation. The foundation raises enough money to ensure the development and survival of the technology.
The novel aspect of tokenization within a token economy is the market aspect. In order for the economy to work, the token needs exchange mechanisms and an active market. Aside from all organizational and legal principles, the token now functionally touches the logic of financial markets. It is subject to the dynamics of a fully liquid and tradeable asset. Utility drives the token price, as higher demand equals higher prices for commodities in self-regulating markets. The general idea is: if the token appreciates in price, then an organization (typically set up as a foundation) can liquidate some of its holdings to fund operations.
As a side note: This scenario creates two problems:
- The short history of the industry has shown that a company funded in such a way is not necessarily viable through its product and actual market adoption. The product has few or no users and customers. As a result the company in essence becomes a hedge fund. Augur, EOS and many others are examples of such a dynamic.
- Liquidating foundation funds at high rates is seen as a betrayal to the community who supports the company. Notably Ripple and Tezos have been heavily criticized for liquidating hundreds of millions of USD worth of their token holdings.
Tokenization of an asset is at the core of any utility question. It is one of the most important and exciting aspects of a token economy. Tokenization unlocks new business models and economic possibilities.
At Constellation we tokenize throughput which allows big data applications to pay for throughput and have access to consensus, validation and notarization functionalities. Furthermore the $DAG tokens enable the ability to attach value to information. This is the fundamental basis for a data marketplace and is essential to creating a thriving economy of big data. In the same way you use smart contracts for a business logic you use DAG to transact data value between participants in the network. If you want to exchange information it is essential to use $DAG. The Network itself is a data exchange mechanism and ties utility to the token.
Decentralized networks differ from other IT infrastructures in the way they incentivize operators and users. As part of the incentivization structure, a digital token or currency is usually employed. For users, operators, and token holders it is therefore important to understand:
- Why a token is essential to this economy
- How the token economy works in relation to the business
- What role a token plays in the token economy
- How a token moves through the token economy
- How a token economy is governed
Most token economies, with Bitcoin being the most prominent one, are designed to be deflationary. This means a fixed maximum supply of tokens. As demand for the token increases and the release of new tokens within the fixed supply decreases, value usually increases. These strong market dynamics, in addition to the value proposition and overall mechanics of the network, are what convince engineers, users, customers and token holders.
That means the overall dynamics are already deflationary by design, deflationary with regard to the overall supply. However, within that overall design the economics are still subject to:
- Liquidity releases into the network through mining (inflation of supply)
- Liquidity decreases through usage, staking and burn (deflation of supply)
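A toy model can make these two opposing forces explicit. The function name and figures below are illustrative assumptions, not protocol parameters; they simply show how mining inflates circulating supply while staking and burns deflate it.

```python
# Toy supply model (illustrative only): mining releases liquidity into the
# network, while staking and burning remove it from circulation.
def net_supply_change(mined: int, staked: int, burned: int) -> int:
    """Net change in circulating supply over some interval."""
    return mined - staked - burned

print(net_supply_change(mined=1_000, staked=300, burned=200))  # 500: net inflation
print(net_supply_change(mined=1_000, staked=900, burned=400))  # -300: net deflation
```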
At this point, it is necessary to clarify the deflationary terminology used in the digital environment in contrast to the standard economic terminology of deflation and inflation:
- In an economy, the price of goods usually increases with an increase in monetary supply (inflation). High inflation is commonly seen as bad as it effectively devalues a currency and decreases purchasing power of the participants in an affected economy.
- A surplus in products or a shortage in the money supply to purchase products is referred to as deflation. This situation forces vendors to decrease prices, as there is too much supply and not enough demand in such an economy. If there is too much inventory, companies start to cut costs and employees, which in turn exacerbates the oversupply of goods and further decreases the purchasing power in the economy.
Both excessive deflation and excessive inflation are seen as problematic by economists and central banks in the standard economic model. Both can lead into recessions. There is a lively discussion about whether a deflationary economic model can be sustainable in a real-world economy.
In this regard, Constellation does not aim to replace the entire physical economy but is limited to an economy of the digital space. Additionally, the digital deflationary economic model of Bitcoin has propelled the asset to a worldwide known “brand name” and made the asset class widely popular without centralized intervention or help. In the wake of Bitcoin, cryptocurrencies are now one of the leading topics of our time. With its novel deflationary economy as a template, Bitcoin has sparked interest in exploring alternative economic models.
In that sense, Constellation is at the forefront of a socio-techno-economic experiment. As such, Constellation has to bridge not only the gap between a vision of a future economy and today’s business dynamics, but also how to tie a deflationary digital token economy with its own unique dynamics and requirements to the existing economic business logic.
Constellation is not alone on this journey. Every blockchain project that has been funded, or is heavily based on a crypto-economic component, has to tackle both challenges.
The usual answer in the industry has been to create a foundation, raise double- or triple-digit millions and develop a technology while most of the focus is on the token price.
Actual tech adoption, integration and most importantly, business logic and monetization of said products have been neglected in many cases.
Fuel for the Postmodern Paradigm
Enterprise use cases like timestamping and notarizing workflows (supply chain, logistics, ERP-driven processes, bookkeeping, real estate, etc.) are immediately obvious to anyone looking at the advantages of blockchain technologies. They formulate easy-to-grasp “ease of use” or “ease of transaction” advantages.
While these use cases are exciting, they do not articulate a vision that screams mass adoption or solve problems at the scale of an industry that has generated more interest than any other. Constellation is building a secure communications protocol and mechanism to transfer big data. It is the infrastructure layer that enables the decentralization of data monetization and creates a network, built up by a community of node operators, that validates and governs big data. The Constellation Network will thrive through a combination of business and enterprise organizations that want to delineate responsibility and accountability of data (PII, OTIT, IT) and an open source community of developers who want to build accountability from the bottom up. This is a connected future with scale.
So, for us, defining a larger vision is fuel for the postmodern paradigm and the answer to the question:
“Why blockchain and how does it benefit my bottom line?”
At Constellation, we are building a network that is ready today for the data and throughput needs of tomorrow. As the only one of its kind, the network is able to process big data applications and data streams on-chain. We envision a data economy that is vastly different from what we are used to today. A knowledge economy where data can be securely exchanged and monetized through the network itself. The network not only as a connector but as a marketplace infrastructure that enables validation, scoring and exchange.
We have outlined a vision of a new industry and how Constellation is a new company for this new industry by bridging an economy of the future, a digital knowledge industry.
Now, we propose the workings of our digital token economic model and how we intend to link the token economy to existing business logic.
Part 2: Technical Fundamental Dynamics
On a fundamental level, a decentralized network operates nodes that come to a consensus over the state of a transaction or information set via a consensus mechanism.
Decentralized networks are not operated by a single entity but by many network participants. Therefore the network needs an incentivization mechanism so that participants contribute their resources of money and time (in the form of hardware and electricity) to the network.
We have seen that most decentralized networks since Bitcoin deploy scarcity as a primary incentive scheme. By capping the available supply or severely limiting the issuance of new supply, a token (like $BTC or $ETH) within a network becomes more scarce over time. With an increase in scarcity, value usually increases. However, in a traditional blockchain network, throughput is determined and capped by block time and block size. In such an arrangement, throughput is not directly linked to the token value.
At Constellation the following elements are essential to maintaining a thriving digital economy:
- Reputation (Meritocratic, reputation based consensus)
- Throughput (Transaction speed)
- $DAG & Rate limiting
As such, I will show how each element is related to the Constellation token economy.
Constellation has decided to develop a novel reputation-based consensus mechanism named PRO (proof of reputable observation). This consensus approach aims to be more energy efficient than traditional mining in a proof-of-work environment while allowing light node deployment on small devices. It also aims to mitigate the issues around proof of stake, with its incentives towards collusion and oligarchic elements.
(For example: if 4 out of 7 PoS nodes collude, and the code were changed in such a way that any proposal from a colluding node is accepted, the colluding majority could effectively control what gets written to the ledger.)
Reputation, instead of work or stake, is the element that determines the reliability and quality of a node’s behaviour. Within each finality window, nodes have a reputation score assigned to them. The score ranges from 0 to 1, with 0 being the ideal state. Nodes with an ideal reputation score of 0 get access to the full amount of throughput and validator (mining) rewards. Nodes with a less-than-ideal reputation score still contribute their full resources to the network, but they are granted fewer rewards and less throughput. If these nodes continue to exhibit less-than-ideal behaviour, mirrored in an increasing reputation score (1 being the maximum), they will eventually be removed from the network.
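The gating just described can be sketched in a few lines. This is a hypothetical illustration, not Constellation’s actual PRO implementation; the linear scaling function and the removal threshold are assumptions:

```python
# Hypothetical sketch of PRO-style reputation gating. Score 0 is ideal;
# a score of 1 (the maximum) leads to removal from the network.
REMOVAL_THRESHOLD = 1.0  # assumption: nodes at the maximum score are evicted

def reward_share(base_reward: float, reputation: float) -> float:
    """Scale a node's reward down as its reputation score drifts from the ideal 0."""
    if reputation >= REMOVAL_THRESHOLD:
        return 0.0  # node is removed from the network
    return base_reward * (1.0 - reputation)

print(reward_share(100.0, 0.0))   # 100.0 -- ideal node: full reward
print(reward_share(100.0, 0.25))  # 75.0  -- degraded node: reduced reward
print(reward_share(100.0, 1.0))   # 0.0   -- removed node: nothing
```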
A DAG network’s throughput is not determined by block size but by the number of nodes participating in the network. Each node provides compute resources to the network. The more nodes join the network, the more resources, in the form of throughput, are available in the network.
In the following we will see how throughput, reputation and $DAG are linked.
In the Constellation network, nodes reach consensus in grouped node clusters via gossip. Each cluster reaches consensus at a different time. The same is true for consensus between different non-overlapping clusters of nodes. This allows the network to process transactions asynchronously and is the hallmark of a DAG protocol.
However, ideally consensus should cover the state of the entire network. Therefore all “local” consensus round results converge in a snapshot that “freezes” the entire state of the network.
The convergence of the network to the snapshot is called the finality window. It allows the network to reach an overall consensus of its entire state, which is then “frozen” into a block.
Tiers are different classes of nodes that relate to the compute resources they contribute to the network. When the network launches in Q3 2019 there will only be one level of nodes (tier 1) that are called foundation level nodes. Tiers enable the protocol to adapt rewards and consensus in a weighted form into the edge and fog areas and use cases of the network. The planned release for node tiers 2 and 3 is Q1/Q2 2020.
In the context of transactions we also speak of a basic tier. This refers to the free allocation of 1 transaction per user per finality window. It stands in contrast to the paid component of network throughput and the API call model charged to enterprises.
$DAG, Rate limiting & Tiers
While $BTC is not directly linked to transaction speeds and throughput criteria, $DAG is tokenized throughput or bandwidth in the Constellation network.
As a basic functionality, we have introduced a basic tier that allows users to send single transactions, for example one-off P2P payments. Such a transaction should be free.
This introduces the concept of rate limiting: in the basic tier, each user has an allocation of 1 tx per finality window. This form of participation is similar to running a BitTorrent client without providing any resources to the network: the rate limit prevents such users from downloading much or at fast speeds, but it is enough to interact on a basic level. As such, the role of $DAG is similar to gas in Ethereum, with the difference that there is a free basic option. An individual user can still opt to pay an optional fee, which increases the chance of their transactions being accepted faster.
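A minimal sketch of such a basic-tier rate limiter, assuming a simple per-window counter (the class and the window mechanics are illustrative, not the protocol’s actual design):

```python
# Illustrative rate limiter for the basic tier: one free transaction per user
# per finality window. All names and data structures here are assumptions.
from collections import defaultdict

FREE_TX_PER_WINDOW = 1  # basic-tier allocation stated in the text

class BasicTierLimiter:
    def __init__(self):
        self.used = defaultdict(int)

    def new_finality_window(self):
        self.used.clear()  # free allowances reset each finality window

    def allow(self, user: str) -> bool:
        if self.used[user] < FREE_TX_PER_WINDOW:
            self.used[user] += 1
            return True
        return False  # further transactions need a fee or extra throughput

limiter = BasicTierLimiter()
print(limiter.allow("alice"))  # True: first free tx
print(limiter.allow("alice"))  # False: allocation used up
limiter.new_finality_window()
print(limiter.allow("alice"))  # True: new window, fresh allowance
```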
In order to process big data applications or data streams, or to connect devices, the rate limit needs to be increased by contributing one’s own resources in the form of $DAG or throughput. Nodes that validate transactions lend their throughput to throughput consumers for a fee. Simplified: the fee lifts the rate limit for the user and the validator node earns that fee.
While the network itself is continuously scalable, the throughput within a finality window is fixed (newly joined nodes only participate in consensus from the next finality window onwards). This means the supply of throughput is stable and finite within each finality window. When throughput needs increase within one or several finality windows, the price for the same throughput allocation increases. In that way, throughput and $DAG are linked by the market dynamics of supply and demand.
In the fully built out network one could imagine purchasing throughput directly in the node or wallet UI with a credit card. The allocation is then used up over a certain amount of time and needs to be replenished. This entails a user’s decision in accordance to what they want to do on the network and what their needs are. Would they purchase at spot market price or leverage the anticipation of future price/throughput projections? In that way users could manage their throughput needs in a similar way a cloud service subscription works. The difference is that tokenized throughput opens the field for a throughput commodity and futures market.
Part 3: Inflationary Aspects
As mentioned earlier, within the general deflationary bounds of a decentralized network, there are also inflationary mechanisms at work that influence the perceived value of the token in the network. This is important, as $DAG is a tokenized measure of throughput or, in other words, the financial state channel of the network that allows different state channels to communicate with each other. As such, $DAG enables the “cross border” data economy of the future. In this context, a border can be any system boundary (e.g. individual, company, market vertical, industry, region, country, etc.).
Validator Rewards (Mining)
In order to incentivize nodes to contribute their resources to the network, the protocol rewards their contribution with $DAG tokens. The supply of these tokens is 1.6B DAG.
50% of all available tokens will be distributed within the first period of 2.5 years. In a step-function manner, the supply halves for each successive 2.5-year period.
The rewards are located in the specifically assigned validator rewards wallet.
Wallet address : 0x0EeF872B21cf4cfF3d793731CaEE6512211458F4
Rewards introduce the liquidity increase into the network over a 10-year period and function like Bitcoin’s, with a rewards halving every 2.5 years. This means early contributors, who take on the resource risk of contributing to the network, get higher rewards than latecomers with less resource risk.
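Under the stated parameters (a 1.6B DAG rewards pool, 50% released in the first 2.5-year period, halving each subsequent period), the implied release schedule can be computed as a quick sketch. Note that a pure halving series over four periods sums to 1.5B DAG, leaving a residual tail beyond the 10-year horizon; the exact handling of that tail is not specified here:

```python
# Back-of-the-envelope emission schedule implied by the text: 1.6B DAG pool,
# 50% in the first 2.5-year period, halving each successive period.
TOTAL_POOL = 1_600_000_000

def emissions(periods: int = 4):  # 4 periods of 2.5 years = 10 years
    out, release = [], TOTAL_POOL // 2
    for _ in range(periods):
        out.append(release)
        release //= 2
    return out

schedule = emissions()
print(schedule)       # [800000000, 400000000, 200000000, 100000000]
print(sum(schedule))  # 1500000000 -- the residual 100M tail extends past 10 years
```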
The early nodes that join the network are called foundation level nodes. They all have the minimum hardware requirements necessary to operate a stable network. As the network scales Constellation will add more node tiers that are designed to enable lightweight devices with lesser hardware requirements to join the open network.
Rewards in $DAG are distributed by the protocol at the end of each finality window (snapshot) in relation to the node tier and reputation score of a node. In an ideal scenario a node performs consensus with the reputation score 0.
We plan to start the network with up to 100 nodes. Once network stability has been established, we plan to scale the network up to 1,000 nodes. All nodes will need to stake 250,000 DAG in order to participate in consensus. (More details in chapter 4.3.)
The total available pool of rewards is determined by the finality window (snapshot time). Snapshot time may increase or decrease with the number of nodes in the network. Thus rewards may vary in accordance with snapshot time. In addition, factors like less than perfect reputation scores or earned transaction fees from throughput purchases can influence the actual payout amount.
The amount of rewards can be approximately calculated for each node by dividing the rewards available for each snapshot by the number of nodes active in the network.
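The approximation in the paragraph above is a single division. The figures used here are hypothetical, purely to illustrate how rewards dilute as the node count grows:

```python
# Approximation from the text: per-node reward is roughly the rewards available
# per snapshot divided by the number of active nodes. Figures are hypothetical.
def per_node_reward(rewards_per_snapshot: float, active_nodes: int) -> float:
    return rewards_per_snapshot / active_nodes

print(per_node_reward(10_000, 100))   # 100.0 DAG per node per snapshot
print(per_node_reward(10_000, 1000))  # 10.0 -- rewards dilute as the network grows
```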
All assumptions below do not constitute an offering or a legally binding contract. Their purpose here is to show how the number of nodes active in the network relates to DAG rewards over a given time window. As DAG is traded on exchanges, a hypothetical assumption about value at certain points in time can be made.
Part 4: Deflationary Aspects
Deflation is determined by
- Utility through network usage and in-house products
- Node staking & Token burn
4.1 Network Usage
Not strictly deflationary, but following the dynamics of supply and demand, high-load network times increase the price of $DAG and bind more tokens than low-load times.
As outlined before, each user and node has an allocation of throughput in any given finality window.
Once a user or node operator wants to process data in the network in a meaningful way, additional throughput will be needed. The price of throughput in the network is determined automatically by the protocol through an auction mechanism that is very similar to what Ethereum deploys with its gas.
Since the throughput in a given finality window is finite the highest bidder gets the throughput allocation. All others need to wait until the next finality window opens. In that manner prices are directly tied to network usage. High load times increase price, while low load times decrease price. The fully realized version of the utility of data and the exchange of knowledge across the network is our version of the Knowledge Graph. A network of interwoven state channels, forming a database of referential knowledge. The Knowledge Graph ties directly into the Constellation Vision of open data marketplaces and the knowledge economy.
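The auction logic described at the start of this section can be sketched as a highest-bidder-first allocation over a finite per-window capacity. The data shapes and figures are assumptions for illustration, not the protocol’s actual mechanism:

```python
# Sketch of the described auction: throughput within a finality window is finite,
# so the highest bids win allocations and the rest wait for the next window.
def allocate_throughput(bids, capacity):
    """bids: list of (user, requested_units, price_per_unit) tuples."""
    winners, remaining = [], capacity
    for user, units, price in sorted(bids, key=lambda b: b[2], reverse=True):
        take = min(units, remaining)
        if take == 0:
            break  # everyone else waits for the next finality window
        winners.append((user, take, price))
        remaining -= take
    return winners

bids = [("app_a", 50, 0.02), ("app_b", 80, 0.05), ("app_c", 40, 0.03)]
print(allocate_throughput(bids, 100))  # [('app_b', 80, 0.05), ('app_c', 20, 0.03)]
```

Under high load the capacity fills at higher prices; under low load even cheap bids clear, mirroring the price dynamics described above.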
4.2 In-house Initiatives
Data products are already a trillion dollar economy. Success stories like Cloudera and Databricks show the massive demand for big data related products and services. IBM’s acquisition spree in the cloud, stream and data providing market as well as the notable RedHat acquisition, paint a picture of an even more data centric future.
Data products like address data, location data, investor tax filings, weather sensors, price feeds, movement data, customer preference data, AI training data, analytics etc. are in high demand. Demand will increase with the growth of IoT devices and sensors that generate and contribute to the data products of tomorrow. In that sense Constellation not only allows for the secure connection, traceability and interoperability of devices but also enables their data monetization.
As outlined in the vision paper, we are pursuing the enterprise offering with Spore Technologies. At the intersection of the end-user and enterprise networks stands the Knowledge Graph, which offers further utilization and productization value.
In regards to the token economy, Spore Technologies enables Constellation to:
- charge for hosted services (which may mean creating on-premise private DAG networks, or purchasing $DAG and creating state channels on the public network on behalf of our clients)
- charge on a per event/message, API call, or per node basis
- purchase DAG for extra throughput on the network when mission-critical data needs to be validated and given an audit trail
- act as a custodian: we foresee that Constellation Network Inc. will purchase DAG on exchanges on behalf of our clients
The Knowledge Graph
Constellation’s unique feature is that it allows for direct on-chain data processing. No other blockchain infrastructure is able to do that. Ethereum as a world computer falls short in that regard, as it cannot process any sizeable amount of metadata.
For the data and knowledge economies of the future this is an essential feature. In an in-house project, Constellation aims to pre-seed the Knowledge Graph in order to drive network adoption. Additionally the Knowledge Graph enables users to monetize data streams and set up data products themselves.
The Knowledge Graph is the graph network come alive and “knowing”. With the Knowledge Graph, a data marketplace emerges as data sources are fed into the network for validation and notarization. It allows participants to exchange data and assess data quality without violating existing privacy regulations. Different sources and feeds of data can be scored and ranked against each other.
The Knowledge Graph drives value in two ways for all network participants:
- Direct productization and monetization through data products (APIs, Oracles, data)
- Network value and token value through utilization
Example: Address data needs to be accurate and up to date. There are several service providers supplying address data sets. However, data quality differs between providers. One cannot determine the quality before making the purchase, and the provider cannot freely publish data to enable pre-purchase insights without breaking privacy regulations. There is also no way to determine which provider has the most accurate data, or if and how the data quality changes over time.
As each use case and data stream is on-boarded onto the Constellation network the Knowledge Graph grows and becomes more valuable. Each (additional) datastream that feeds into the Knowledge Graph represents ongoing throughput needs in the Constellation network. In this way, Constellation will not only be utilizing the network bandwidth towards a productized outcome, but also incentivizes engagement in the token economy where throughput needs are accessible by contributing network resources or purchasing throughput allocations with $DAG. Increased throughput demands will increase the value of throughput with its tokenized representation $DAG.
4.3 Staking in Nodes
Validator rewards contribute to the liquid token supply in the ecosystem, while staking for nodes decreases liquidity in the ecosystem.
In order to run a foundation level node a minimum of 250,000 DAG is needed. This feature is designed to incentivize early adopters and token holders to utilize their tokens in order to contribute their resources to the network.
The network will grow in its first iteration after mainnet launch to 100 nodes, locking up liquidity of 25,000,000 DAG. We project that in its fully built-out state, staking in nodes may bind up to 250,000,000 DAG.
$DAG tokens are staked in the node’s wallet. The protocol checks if the required minimum amount is present in the wallet before allowing the node to participate in the consensus round.
Constellation treats staking in nodes differently than other projects: staking enables access to the network but does not influence consensus. Thus the reputation-based consensus remains truly meritocratic and avoids the issues associated with giving the highest bidder more influence in the network.
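The two properties above, staking as an access gate and consensus weight independent of stake, can be sketched as follows (names and the weight function are illustrative assumptions):

```python
# Minimal sketch of the eligibility check described above: the stake gates
# access to consensus but does not weight it.
MIN_STAKE = 250_000  # DAG required to run a foundation level node

def can_join_consensus(wallet_balance: int) -> bool:
    return wallet_balance >= MIN_STAKE

def consensus_weight(wallet_balance: int) -> int:
    # Unlike proof of stake, extra DAG beyond the minimum buys no extra influence.
    return 1 if can_join_consensus(wallet_balance) else 0

print(can_join_consensus(250_000))  # True
print(consensus_weight(5_000_000))  # 1 -- same weight as a minimally staked node
print(consensus_weight(100_000))    # 0 -- below the minimum, excluded
```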
4.4 Enterprises & Token Burn
Constellation’s enterprise product Spore is used to bring the technology to enterprises and to improve and further their existing business models.
Enterprises will need a fully developed “cross border” network as described in the Constellation vision. New network- and data-based business models will require interoperability between networks and across the confines of a system, a legal framework, a region or a country.
However, in the beginning, use cases are focused on more isolated tasks until the data economy emerges as envisioned. From our experience, enterprises do not want to convert to or engage with the financial aspects of a cryptocurrency at this point. The challenge, as outlined in our vision paper, is to marry the enterprise and crypto economic models with each other.
Therefore, Constellation charges enterprise clients in a SaaS-model fashion with a throughput guarantee that is equivalent to an SLA.
In order to connect their throughput needs to the Constellation network, Constellation will:
- host the nodes for the enterprise client
- burn 10,000,000 $DAG for each partnership that is onboarded onto the network, regardless of contract size or current market value of DAG.
The tokens will be sent to the burn address at 0x0704201907042019070420190704201907042019
- the first round of commitment is an allocation of 4.04% of the total supply, with 150,000,000 DAG tokens corresponding to the first 15 onboarded partners
- announce partners and burned allocations in a quarterly report. Should standing NDAs restrict public mention, we will still issue an anonymized report.
- decrease the supply from the foundation holdings in the initial stages
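A quick sanity check of the figures in this list: 15 partners at 10M DAG each gives 150M DAG, and if that is 4.04% of the total supply, the implied total supply is roughly 3.71B DAG. The arithmetic below is just that check, not a protocol specification:

```python
# Sanity check of the burn figures stated above.
BURN_PER_PARTNER = 10_000_000
PARTNERS = 15
STATED_SHARE = 0.0404  # 4.04% of total supply

total_burn = BURN_PER_PARTNER * PARTNERS
implied_total_supply = total_burn / STATED_SHARE
print(total_burn)                    # 150000000
print(round(implied_total_supply))   # 3712871287 -- roughly 3.71B DAG
```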
Once the model has proven successful and enterprise adoption and partner onboarding onto the network are established, we will consider different bonding mechanisms and token purchases on behalf of our partners on open exchanges at market rates. As a future vision, one could imagine a Stripe-like integration to purchase throughput directly in the wallet interface, while node operators could offer throughput or data products directly for sale in the node interface.
Part 5: Numbers
The initial token allocations were considered by many to be too heavily skewed towards founders and the foundation. As we believe in open source and community principles, the entire founder allocation (288,000,000 DAG in total) has been burnt to reflect the long-term commitment and community approach in the token model. Additionally, Constellation had already doubled the pool of validator rewards from 800,000,000 to 1,600,000,000 DAG in late 2018. The intention of both steps was to increase community engagement, attract network participants, validators and developers, and show long-term commitment on the founders’ side.
These steps resulted in a relative change for each allocation category favoring community members and token holders.
The shifts mentioned are also reflected in the wallet structure. We have transferred all tokens to their respective new wallets, and the allocations are listed below. We have informed CoinMarketCap about the transfer. The foundation has access to all wallets listed (except the burn wallet), and the vesting periods are noted.
Apart from vesting and the predetermined release schedule for the validator rewards, the tokens are not liquid. The foundation tokens are not locked up; however, the foundation does not intend to release tokens from the wallets labelled “Foundation” into circulating supply in the near future. Exceptions are releases related to liquidity provisioning for new exchange listings.
Part 6: Summary
Constellation enables large scale:
- Schema validation for sensor and data pipelines
- Secure connectivity of devices, sensors, data pipelines etc.
- Data stream processing enabling Big Data and ML applications
- Data marketplaces and knowledge economy through the Knowledge Graph
- Cross-border data exchange and standardization (a border can be a user, a company, a region, an industry, a market, a country or an entire continent; wherever there is a membrane that needs to be permeable in order to transmit, share, exchange and secure data beyond a silo or system barrier)
Token economy is regulated by:
- A protocol engineered with a capped total supply
- Regulation of supply and demand by network utility and usage.
- “New” liquidity enters the network through validator rewards
- The first 15 enterprise partnerships will each remove 10,000,000 DAG from the network's liquid supply.
- Node validators stake $DAG in order to be able to participate in the network and consensus.
- Consensus is not influenced by the amount of DAG in a node's wallet once the minimum requirements for participation are fulfilled.
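The last two bullets describe a threshold rule: a node must stake a minimum amount of DAG to participate in consensus, but holding more than the minimum confers no additional weight. A minimal sketch of that rule, where `MIN_STAKE` is a hypothetical placeholder (the actual minimum is not stated in this document):

```python
# Threshold participation rule described above: staking at or above the
# minimum grants eligibility, but extra DAG adds no consensus influence.
# MIN_STAKE is a hypothetical value; the real minimum is not given here.
MIN_STAKE = 250_000

def can_participate(staked_dag: int) -> bool:
    """A node joins consensus only if it meets the minimum stake."""
    return staked_dag >= MIN_STAKE

def consensus_weight(staked_dag: int) -> int:
    """Every eligible node carries equal weight, regardless of stake size."""
    return 1 if can_participate(staked_dag) else 0

# A node staking 100x the minimum has exactly the same weight as one
# staking the bare minimum:
assert consensus_weight(MIN_STAKE) == consensus_weight(MIN_STAKE * 100)
```

This flat-weight design distinguishes the model from stake-weighted proof-of-stake schemes, where influence scales with holdings.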
Part 7: Future
The blockchain industry is still comparatively young. The bleeding edge and best practices are changing quickly, especially in light of new or differently enforced regulations on a global basis. In that sense the ICO model has, with some regulatory help, rendered itself obsolete within a very short period of time. The follow-up IEO model does not attract the same amounts of capital as the ICO mania phase of 2017. It remains to be seen how upcoming token offerings will be structured, as the fingerprints of large financial and corporate institutions are already clearly visible in a market where individual contributors seem to play a diminished role.
As another example, token economic standards for structuring a token offering have evolved since Constellation began developing the protocol. From simple investor and pre-sale allocation economics, we saw a move towards more utility-focused models. Now, Bonding Curves and Continuous Organizations are the new frontier for economic models on the horizon.
Constellation has responded to these developments with changes in the token economic model. In that spirit, our model represents the best solution within the parameters given. At the same time we are constantly seeking to improve our economic model and some of the parameters outlined here may change in the future. Ultimately the goal is to ensure and increase the support among community, customers and developers alike while being regulatory compliant.
For some, that may be perceived as volatility; for us, it reflects the reality of operating in a real-world marketplace and regulatory environment. Adaptation and evolution of Constellation alongside the maturing industry is inevitable and should be seen as an indication of health and progress.
Tokenized throughput needs sources and sinks to work. One could argue that the ERC20 standard saved the day for Ethereum in this regard. For Constellation it opens up very interesting roles and possibilities for node validators, but also for customers using the Knowledge Graph and the consensus available via API in the future.
We have formulated an attractive vision of a knowledge economy that harnesses big data use cases, connected devices and direct on-chain data processing.
We have fused enterprise economics with crypto economics in a unique way by burning supply for each onboarded partner. Validators, customers and token holders each have compelling reasons to support and engage in the ecosystem.
Going forward, we see additional interesting applications and token economic implications in the knowledge economy. In this sense, we will examine how crypto data commodities could work within the Constellation ecosystem and how to implement them.