Bancor’s Smart Tokens vs Token Bonding Curves

Simon de la Rouviere
15 min read · Nov 13, 2018

I’ve seen a few times in social media that people think I took Bancor’s Smart Tokens idea and called it something else: Bonding Curves.

In actuality, I believe these designs were independently developed, arriving at similar conclusions from different angles at around the same time. This post is firstly for clarification and secondly, for personal reasons: the desire to document this history.

A few months back, I thought it would be meaningful to show how the designs changed since I started blogging on token economics in 2014.

I got great feedback when I asked on Twitter about penning down this journey. I already have a (tentative) name for it: A Corpus & Collection of Commentaries on Creativity & Curation Markets (I’m a sucker for alliteration).

This post is a meaningful place to summarise parts of this research towards that e-book.


Bancor’s Smart Token design was birthed in August 2016 (according to them) and publicly announced on 14 February 2017.

The name ‘Bonding Curves’ was coined in September 2017 by the Zap Oracles team (not me) and described part of a larger set of designs from ‘Curation Markets’ that I’d been working on. I’ve been publishing and writing about continuous token models in various forms since 2014. The eventual continuous token model iteration that followed the same model as Bancor’s Smart Tokens was designed & published at the beginning of March 2017 (first called a ‘Bonded Curation Community’). I had not heard of Bancor by that point (I only became aware of it in May 2017, and only much later realised it was the same thing after researching their protocol in depth).

We have the same goal of seeing a multi-token world for various new networks of value. It is why I started designing and researching token economics myself back in 2014. Currently, Bancor is mostly used to provide automated liquidity between existing tokens. They have an amazing new website live. Go check it out!

Market Making

Going back further & broader, these designs likely get subsumed into the research and design of automated market makers from the traditional financial world: from the Logarithmic Market Scoring Rule used in prediction markets to the Constant Product Formula used in Uniswap’s new decentralized exchange. Here is Vitalik talking about this a few years ago.

I am not deeply familiar with academic designs of different market makers, and it is likely that these ideas were developed much earlier under different names. If you are aware of similar designs from more traditional financial literature, please share!

There’s a great story on how a medical researcher re-invented integration:

Notably: prediction markets do have a continuous token model. In a binary prediction market (yes and no), with one currency (say DAI), you buy two “outcome” tokens. The reserve/pot is eventually disbursed/forked to the outcome winners, upon which they “burn” their outcome token to get access to the pot/reserve.
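That flow can be sketched as a toy model (hypothetical class and method names, a deliberately simplified resolution flow, not any real prediction-market contract):

```python
class BinaryPredictionMarket:
    """Toy binary market: 1 DAI mints one YES and one NO outcome token.
    After resolution, each winning token is burned for 1 DAI from the pot."""

    def __init__(self):
        self.pot = 0.0       # DAI held in reserve
        self.outcome = None  # set to "YES" or "NO" at resolution

    def buy_pair(self, dai):
        # Minting is continuous: anyone can deposit DAI at any time.
        self.pot += dai
        return {"YES": dai, "NO": dai}

    def resolve(self, outcome):
        self.outcome = outcome

    def redeem(self, token, amount):
        # Burn winning tokens to claim a proportional share of the pot;
        # losing tokens redeem nothing.
        if token != self.outcome:
            return 0.0
        self.pot -= amount
        return amount
```

If every winning token is burned, the pot is fully disbursed, mirroring the burn-to-claim mechanic described above.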

Bonding Curves/Smart Tokens focus on having some form of value (say ETH) be used to buy (and thus mint) a new token (say SimonCoin). The ETH is kept in a reserve pool. You can then burn/destroy SimonCoin to get back some ETH, depending on the amount of ETH in the reserve. The price thus follows an algorithmic curve (in and out). If all SimonCoin is burned, there is no ETH left in the reserve.
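A minimal sketch of those mechanics, assuming a simple linear price curve (the class, slope value, and token names are illustrative, not any project’s actual code):

```python
class LinearBondingCurve:
    """Buy/burn mechanics with price(s) = slope * s. The cost to mint is the
    area under the curve, so the reserve always holds exactly enough ETH to
    buy back every token in circulation."""

    def __init__(self, slope=0.001):
        self.slope = slope      # price increase per token in circulation
        self.supply = 0.0       # SimonCoin in circulation
        self.reserve = 0.0      # ETH held by the contract

    def _area(self, s):
        # ETH needed to mint the first s tokens: integral of slope*x = slope*s^2/2
        return self.slope * s * s / 2

    def buy(self, eth_in):
        """Deposit ETH into the reserve, mint tokens along the curve."""
        new_supply = (2 * (self._area(self.supply) + eth_in) / self.slope) ** 0.5
        minted = new_supply - self.supply
        self.supply = new_supply
        self.reserve += eth_in
        return minted

    def sell(self, tokens):
        """Burn tokens, withdraw the corresponding ETH from the reserve."""
        new_supply = self.supply - tokens
        eth_out = self._area(self.supply) - self._area(new_supply)
        self.supply = new_supply
        self.reserve -= eth_out
        return eth_out
```

Burning the entire supply drains the reserve to zero, exactly as described above.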

The main difference is that Bancor’s codebase works primarily by only setting a reserve ratio (which influences the shape of the curve).
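Bancor’s published purchase and sale formulas derive everything from that one reserve-ratio parameter; sketched here in Python with assumed variable names:

```python
def bancor_buy(supply, reserve, ratio, deposit):
    """Tokens minted for a deposit of reserve currency.
    ratio is the reserve ratio in (0, 1]: ratio = 1 gives a constant price,
    smaller ratios give steeper curves."""
    return supply * ((1 + deposit / reserve) ** ratio - 1)

def bancor_sell(supply, reserve, ratio, tokens):
    """Reserve currency returned for burning tokens (inverse of the above)."""
    return reserve * (1 - (1 - tokens / supply) ** (1 / ratio))
```

Buying and then immediately burning the minted tokens returns exactly the deposit, i.e. a single curve with no spread between entry and exit.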

A Creative History

My own timeline morphs from working on ways to mint many new tokens, to thinking about various continuous token models & then finally arriving at bonding curves. I added the notable dates from Bancor’s own history for context to this timeline.

2013–2018: From Dogecoin to Bonding Curves.

My interest came from a desire to automatically mint new networks of value for people (and agents) to co-create value together.

The first penny dropped when Dogecoin came along: in reducing the barriers to entry for minting new currencies, my thinking was that we’d eventually see currencies tied simply to their memetic networks. Dogecoin is proof of that.

Late 2013.

I first wrote about how I thought that in 10–20 years, we would all have a cryptocurrency tied to the network effect of our memetic impact in the world. It would fit a gigantic long tail and would empower many people in new ways.

Jan — Dec 2014

Back then, Mastercoin had recently been announced and Ethereum hadn’t been announced yet. I had to get quite creative and designed a decentralized altcoin scheme on Bitcoin using donation-mining. It’s quite a Rube Goldberg setup: !msg/bitcoinx/mUb86IOeXdU/7pvmbFSPU3UJ.

Notably it emphasised why I was working on solving the technical problems:

“Extrapolating upon this: it means that we will see currencies being minted for anything and everything. Not all of them have to have million-dollar+ networks. We can have personal coins: an investment in my network effect. We can have city-coins: an investment in the network of a city, etc. We can even go smaller: an investment in news, articles. Our imagination is the only limit. As long these coins float with each other (and it’s very easy to do), holding one coin over the other is simple a personal statement of where you want your value to reside.”

During the rest of this year, I created my own cryptocurrency (The Cypherfunks), and worked on Counterparty/Dogeparty. The goal was still to keep testing the ideas, technically & philosophically.

October 2015.

Because I did my master’s degree on information overload, I started seeing the value of using fees to increase novelty in online spaces. This was the basis for combining tokenization with information sharing: to increase novelty, which eventually led to Curation Markets.

Maximizing Novelty: The Potential Use of Blockchains in the Design of Sustainable Online Communities:

This article led me to meet the amazing folks at COALA Blockchain Workshops.

November 2015.

During this period, I wanted the technical specification to improve and started working with the community to design the ERC20 token standard on Ethereum. I presented a talk on “Tokens” at devcon1, speaking about the potential specifications, and I was one of the top 3 contributors to the discussion on GitHub during that time.

I’m really, really proud to have been part of this group of people who made this happen: arguably Ethereum’s first breakout success.

March 2016.

Probably the first time I started thinking about the possibility of automatically issuing tokens: continuous token models.

“A simple model on Ethereum would be as follows: investors can continuously join the organisation from the outside by allowing them to invest Ether in exchange for new shares. This issuance model follows an exponential decay. Each new transaction to invest, reduces the amount you get in exchange.

Invest 10 ether, get 10 shares.

New persons invest 10 ether, get 5 shares.”


It fit a curve, but there was no way to redeem the funds after the organisation received them.

August 2016

In continuing with these themes, I wrote an article (I never published) about affordance & opportunity cost. I was trying to formalise some thoughts on why mass tokenization is valuable.

In it, I detail an idea called “Network Bonds”. You mint a bond that pays you in coupons. This was similar in design to the above. The price only ever rose towards a ceiling; it never went down. It would have use in different scenarios. This contained the first mention of potentially using the funds to go into a communal reserve.

“ETH is exchanged for the bonds and goes into the community pot. Actions coupon are spent like Reddit and attention bonds to upvote content in this community. It has a simple algorithm to weight the ranking of the content.”

Bancor Birthed (but not announced): Aug 2016.

In a talk at EDCON in February 2017, Eyal mentions that Bancor Protocol was birthed during August 2016. I didn’t find any earlier public mention of Bancor. This is a great talk (now that I also understand the language better)!

September 2016

During this period, I was talking extensively to Meher Roy & Maciej Olpinski: we called it the attention economy, figuring out how to use novel signals to curate information. Meher called his designs “joint attention networks” (or JANEs).

Here’s an excellent talk from Meher on this topic:

Also worth listening to Maciej’s ideas on content & attention:

These are still SO relevant.

Perhaps one of my favourite conversations was had during Devcon2 in Shanghai with Maciej & others, which led me to write and publish this article on the flights back to South Africa. I could combine my master’s research on information overload with my thoughts around continuous token models.

I called it: ‘Hashtag Markets’.

Notably, the issuance system only mentioned that the cost should decay *somehow*.

“To issue a coupon, one pays with another coupon such as ETH, for example, to mint a coupon based on the rate the protocol determines. This coupon is then dispensed for actions related to that topic. The cost of the coupon changes depending on interest in the coupon.”

The ETH used to buy a coupon could go to anyone. The belief was:

“You might wonder to whom do the funds go to when buying these coupons? For each coupon bought, the buyer can choose where the funds go. It’s an additional signalling and coordination tool. The default would be for most topics to simply burn ETH. Burning implies a proof of sacrifice to be part of that “in group”. However, in some networks, having that ETH sent to someone or a group makes that network more valuable.”

October/November 2016

After that, the next iteration was called “Meme Markets”, a continuation of “Hashtag Markets”. The goal was to get a detailed protocol design down that would allow tokens to be minted around any topic or meme.

Meme Markets.

I spent quite a few nights furiously scribbling in my notebook. I had variations of designs with names like “Burn Drifting” & “Momentum Grinding”. It was a lot of fun.

Notably: it was similar to previous designs, except the price was now determined only by how many tokens were in circulation. Tokens would be spent for services (being removed from supply). The ETH, still, would not be sent to a pool: it would be paid & disbursed to anyone you chose. This was the hard part: making it sybil resistant. The design included mitigations around it.

“If more people are congregating around a meme (creating & dispensing the coupons for actions), then cost of the action coupons increase algorithmically. If novelty is *not* being produced, the cost starts to decrease (algorithmically).”
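One way to sketch the pricing rule in that quote (the exact algorithm was left open at the time, so the function name, parameters, and constants below are entirely illustrative):

```python
import math

def action_coupon_cost(base_cost, recent_actions, idle_time,
                       growth=0.1, decay_rate=0.05):
    """Cost of an action coupon for a meme: rises with recent activity
    (people congregating), and decays exponentially while no novelty is
    produced. All parameter values here are made up, not from any spec."""
    congregation_premium = 1 + growth * recent_actions
    idleness_discount = math.exp(-decay_rate * idle_time)
    return base_cost * congregation_premium * idleness_discount
```

A busy meme (many recent actions, no idle time) prices above the base cost; a dormant one decays below it.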

9 Feb 2017

As the ICO wave started to rise, I thought it would be meaningful to publish more thoughts, talking about continuous token models in general. This was quite a popular post at the time.

It had a similar design to meme markets. I just used different wording & phrasing to describe some of the ideas.

“The idea is that instead of pre-selling tokens during a launch phase, the tokens are minted as needed through various means. The tokens are then dispensed for services rendered in the network.”

14 Feb 2017.

Bancor is publicly announced, including its designs for Smart Tokens.

“In the Summer of 2016, we started working on Bancor with the goal of creating a hierarchical monetary system (where one digital token holds other tokens in its reserve) to build a new type of standard for cryptocurrencies that would lay the foundation for a decentralized global exchange. One that is autonomous, has no spread, no counterparty risk and provides continuous liquidity for any asset. One that enables the long-tail of currencies as the Internet did for content.”

They presented their work at EDCON. I had not heard of them at this point in time.

10 March 2017

It converges! This is the first time I introduced a reserve as an option for the ETH. I had become frustrated with the meme market designs: you had to dispense the token for the price to come down, and there were still a lot of issues around how to effectively disburse the funds (without somehow introducing sybil attacks).

In working with my team at Ujo, I came upon an iteration that then fit the same continuous token model used in Bancor. It was part of a design called a “Bonded Curation Community”.

“All ETH that is used to mint the tokens are kept as a communal deposit.”

“One can leave at any point by taking a proportional percentage of the deposit pool with you.”

You can still see that the goal was around using this token model for information curation. It wasn’t so much about the token model itself, but how it was used in the context I hoped to see it.

May 2017

At this time period, I became aware of Bancor.

Admittedly, I read the whitepaper during this period, but unfortunately found it quite hard to read & digest. I came from another angle/discipline, and thus the terminology & language were harder to understand: words like reserve ratio, smart tokens, relayers, etc. I wasn’t sure if it was about creating new tokens or providing new forms of liquidity amongst existing tokens. Turns out: it was both.

Based on feedback on my previous articles, however, I planned to change the name of my designs once again, and called the new iteration ‘Curation Markets’ (from ‘Bonded Curation Community’). I published it for Token Summit in NYC. During that conference, I briefly met the Bancor team.

At this point, the underlying token minting system was the same as Smart Tokens (with one reserve currency). A difference, however, was that my starting point was to have different price algorithms for buying and selling: not the same price curve (the reserve ratio in Bancor). This felt necessary so that those who committed to participate in curating would not immediately leave. They would only be able to exit with a profitable reward IF they actually produced value and attracted more participants to the curation community.
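That split can be sketched with two linear curves (the slope coefficients are made up for illustration): the sell curve sits below the buy curve, so an early buyer only exits at a profit once the community has grown enough.

```python
def buy_price(supply, slope=0.001):
    # Price to mint the next token at the current supply.
    return slope * supply

def sell_price(supply, slope=0.0005):
    # Price received for burning a token: a deliberately lower curve,
    # so exiting immediately after buying always loses money.
    return slope * supply

# A curator who buys at supply 1000 pays ~1.0 ETH per token; with these
# coefficients the sell curve only reaches that price once supply has
# doubled, i.e. once enough new participants have been attracted.
entry = buy_price(1000)
breakeven_supply = entry / 0.0005
```

The gap between the two curves also leaves a surplus in the reserve, which is one knob such a design could use to reward long-term curators.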

Admittedly, I still didn’t understand Bancor in depth due to confusion of what it was trying to accomplish. I still came from the angle of building a meme or curation market. The token model was the way to accomplish that, not the end goal. I didn’t use a specific name or word for it, except calling the whole protocol design: Curation Markets.

“- A token that can be minted at any time (continuous) according to a price set by the smart contract.
- This price gets more expensive as more tokens are in circulation.
- The amount paid for the token is kept in a communal deposit.
- At any point in time, a token can be withdrawn (“burned”) from the active supply, and a proportional part of the communal deposit can be taken with.
- The tokens are used to bond it to curators per sub-topic, who then curate information with their proportional backing.”

You can see the last point: sub-topic staking towards a curator was a core part of it. :)

During this time, I also became aware that Bancor’s codebase could be used for curation markets when a colleague, Goncalo, mentioned it in the repo I created:

September — November 2017

Curation Markets started picking up as I shared the ideas. There’s always been a giant community of people who have shared ideas/feedback over the years. I remember at some point listing 30+ people (in the meme markets whitepaper) who were kind enough to give feedback. :)

At this point, the Zap Oracles team announced their project, taking inspiration from Curation Markets and distilling the core token model down. For a while, seeing people adopt only the minting/burning part of it, I wondered if there was a better name for just the continuous token model, without the other parts related to a curation market.

Luckily, they called it something else: Bonding Curves. I loved it. Made a lot more sense to me: just focusing on the continuous token model within it.

Notably, there’s no mention of Bancor’s Smart Tokens in their whitepaper, which leads me to assume Zap arrived at Bonding Curves from Curation Markets, not from Bancor’s Smart Tokens. So, in some sense, they applied their own iteration based on the ideas they found in the wild.

“Zap is introducing the economic mechanism of bonding curves for the first time into the smart contract ecosystem. No economic device like them have been released into the marketplace so far, though they have been inspired in part by Simon de la Rouviere’s writings on curation markets.”

In some articles & whitepapers, “Curation Markets” is also used to refer to Bonding Curves. For example, Ocean Protocol’s first whitepapers didn’t use the term “Bonding Curves” to describe the usage of this continuous token model in their project.

During this time, Token Curated Registries also became a thing. I wrote about it and realised: this could also be called a Curation Market.

Since this time period, my goal has been to (once again) change the language. Curation Markets denotes to me: tokenized, crypto-economic curation games to produce novel signal in information markets; Bonding Curves: the continuous token model underpinning some Curation Market designs.

Conclusion & Thanks:

Without knowing how the Bancor team came up with Smart Tokens (since there’s less of a public trail), it seems that the ideas were co-invented at around the same time. If their designs were solidified in August 2016 (as they state), then their specific continuous token model iteration happened before I finalised the same design from my own research & work on continuous token designs. I published the same continuous token model (used in ‘Bonded Curation Communities’) a few weeks after Bancor was announced in February 2017 (without being aware of it), and it wasn’t called something specific until a few months later (whereupon the name, bonding curves, was given by a different team: Zap).

As always, I enjoy designing, creating and iterating in public. Names and meaning also change over time and it’s always been valuable to get that feedback. I learned this the hard way when people thought “Meme Markets” were *JUST* about dank meme trading. Changing the language over the years helped find the product/market fit I wanted. In the same way, I still think that “cryptocurrencies” doesn’t and won’t ever encompass everything a blockchain is useful for. Naming matters.

Additionally, in jotting down some of this history, I realised how grateful I am that so many people helped give feedback over the years. Thanks to the early cohort of 30–40 people who gave feedback on continuous token model designs over the years (from earliest crazy talk in 2013 to today, in 2018, where many share my articles).

Thanks also to others who have run with the ideas of curation markets and bonding curves. Notably: many of the bonding curve implementations use the audited Bancor code for simple curve designs. Thanks to the Bancor team for that!

It’s still day one. Let’s show the world the value of continuous token models. There are so many exciting projects coming! Kudos to the Bancor team for endeavouring along.

P.S. I reached out to the Bancor team for comments, but haven’t heard back.

P.P.S. While you are here: we’re a community of 400+ people discussing curation markets and other interesting new token economics & ideas in the Curation Markets gitter. Come say hi!

Interested in receiving blog posts from me directly? Sign up for my newsletter or follow me on Twitter.