Tokenization? Negative, Ghost Rider.

By Rob Seger

Tokenization. It’s a horrible idea. But don’t take my word for it, yet.

Token a What?

Tokenization is a strategy for creating a record of something that exists or happened “off chain” and recording it “on chain.” Creating a record that I sold you a car, for example, and storing this record in the bitcoin blockchain is a tokenization strategy. In this way, that record is protected by the blockchain because it will exist publicly and forever.

Broadly, there are two flavors of tokenization: asset (adjective/noun) and transaction (adverb/verb). The example of my selling you a car could be done as either.¹ With an asset tokenization strategy, a token representing ownership (adjective) of that car could be created on the blockchain and then transferred from me to you. A transactional tokenization strategy might be to simply record the ID number of the legal contract we signed in the blockchain for all of posterity to see.

This turd done been polished!

And there are two coats to the polish. We begin with a woefully naive assumption about what can and cannot go wrong in this system and spitshine that up with some willfully misleading language. We’ll peel those off in reverse until it becomes obvious that anyone peddling this turd is either a charlatan or an idi.. — person who hasn’t really thought this through.

The Language of Tokenization

A record of: A small pile of bits we both agree mean something. Generally this relationship is saved somewhere so we don’t forget it 20 years down the road.

Off chain: Anything you can hit, or otherwise change, without needing to mine a block of the blockchain.

On chain: Those things which cannot be changed without mining one or more blocks of the blockchain.

Protected by the blockchain²: That small pile of bits is in the comments section of a transaction in a valid block.

More plainly, the asset tokenization strategy from above could be stated thusly³: You and I agree that the bits 01101 represent ownership of my car. We agree that if I write “01101 Transfer From <my address> to <your address>” in a block’s comments section, you now own the car. I do that. We also jot “01101 means Rob’s car” on a sticky note and stick it to the side of my desk so we’ll remember. Fine, fine. We probably write it in a database somewhere.⁴

The transactional tokenization strategy from above could be similarly translated. You buy the car from me the normal way. We agree that 01101 represents the contract we just signed. One of us writes 01101 in a comment section. We scribble “01101 is this contract” on a sticky note that we attach to the contract before stuffing them both into a drawer.
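If it helps to see the whole contraption in one place, here's a toy sketch of both strategies in Python. Every name here is invented for illustration; there is no real blockchain API in sight, just the comment string and the sticky note:

```python
# The "sticky note" -- the off-chain lookup both parties depend on.
sticky_note = {"01101": "Rob's car"}

# Asset tokenization: ownership moves by writing a comment on chain.
def transfer_comment(token, seller, buyer):
    # This string is what actually lands in the block's comment section.
    return f"{token} Transfer From <{seller}> to <{buyer}>"

# Transactional tokenization: the comment just points at an off-chain contract.
contract_note = {"01101": "signed car-sale contract, bottom drawer"}

print(transfer_comment("01101", "my address", "your address"))
# -> 01101 Transfer From <my address> to <your address>
```

Note what carries the actual meaning in both cases: the dictionary, not the chain. Lose the dictionary and the comment is just bits.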

Fan, Sh!t. Sh!t, Fan.

One of the things advocates of the blockchain are quick to assert is that the blockchain is unlike any other storage medium ever. Unlike any other option, things written to it will remain unchanged for all time.⁵ For our purposes here I want to accept that, fully and totally. Unlike any other option, things written to the blockchain will always be easy to find and never change.

What happens when that sticky note, that database lookup, is corrupted? Or, worse, that contract is burned? Let’s explore, shall we?

Take, for example, the transactional tokenization strategy used to address the throughput issues of public blockchains: you create a record (token) of hundreds of transactions done off chain and record it on chain for security purposes.

Creating the token is a simple matter. There are a number of strategies, but let's just assume a simple hash of the transactions. In reality, all we really need is to be able to independently recreate the token if, and only if, we have all of the original transactions. Hashing just happens to be a very reliable way to do that.
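A minimal sketch of that token creation, assuming nothing fancier than SHA-256 run over the batch in order (the function name is mine, not any standard):

```python
import hashlib

def make_token(transactions):
    """Hash a batch of off-chain transactions into a single token.

    Anyone holding all of the originals -- and only they -- can
    independently recreate this token and compare it to the one
    stored on chain.
    """
    h = hashlib.sha256()
    for tx in transactions:
        h.update(tx)
    return h.hexdigest()

batch = [b"alice pays bob 5", b"bob pays carol 3", b"carol pays dan 1"]
token = make_token(batch)          # this hex string goes on chain
assert make_token(batch) == token  # independently reproducible
```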

As a somewhat, mostly -ish, reformed hacker, all I can see in this system is how much damage could be done by changing a single bit. In a normal system, a single transaction would be affected because a single byte would be different due to that small bit rot. Worst case, that invalidates the transaction. But in a system ‘protected’ by the blockchain?

That single bit change will completely change the hash one gets when attempting to verify the off chain transactions. In other words, the affected transaction, and every other transaction which occurred during that same tokenization period, will no longer be protected by the blockchain. They will, in fact, be invalidated by it. Bit of a knife in the back, really.
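You can watch the knife go in with a few lines of Python. Same toy hashing sketch as before; the bit rot is simulated by flipping a single bit in one transaction:

```python
import hashlib

def make_token(transactions):
    # Toy token: SHA-256 over the batch, in order.
    h = hashlib.sha256()
    for tx in transactions:
        h.update(tx)
    return h.hexdigest()

batch = [b"alice pays bob 5", b"bob pays carol 3", b"carol pays dan 1"]
on_chain_token = make_token(batch)

# Bit rot: flip the lowest bit of one byte in one transaction.
corrupted = list(batch)
damaged = bytearray(corrupted[1])
damaged[0] ^= 0b00000001
corrupted[1] = bytes(damaged)

# Verification now fails for the entire batch, not just the one record.
assert make_token(corrupted) != on_chain_token
```

One bit, and every transaction in the batch is now unverifiable against the chain. That is the avalanche property of hashes working exactly as designed, just not in your favor.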

The regulatory and other legal implications of discovering an invalid transaction vary from industry to industry but for the financial industry in particular, the ensuing legal obligations tend to have cascading and truly burdensome consequences. A single invalid transaction can cost tens to hundreds of millions of dollars to rectify.

Now imagine multiplying that burden by ten minutes' worth of transactions being invalidated simultaneously. It's impossible to say exactly how many transactions would be affected but if we assume a very small average of one transaction per second (86,400 per day), the burden for a single bit change would be the legal obligations of 600 invalid records. Or, tens to hundreds of billions of dollars.

But that’s not even the best part! Amazon touts its storage reliability at a truly impressive 99.999999999%. Meaning, they expect a single bit of every 12GB stored with them to rot (be corrupted) every year. Normally, that’s a laughably small level of corruption. Tokenization, however, means that Amazon is expecting you to find at least 10 minutes of transactions invalid, every year.⁶
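The arithmetic behind those claims, using the assumptions from footnote 6 (one transaction per second, 380 bytes each), is easy enough to check:

```python
# "Eleven nines" durability read as: one bit of rot per 1e11 bits per year.
failure_rate = 1 - 0.99999999999          # ~1e-11 per bit per year
bits_per_flip = 1 / failure_rate          # ~1e11 bits
gb_per_flip = bits_per_flip / 8 / 1e9     # ~12.5 GB per expected flip

seconds_per_year = 365 * 24 * 60 * 60
yearly_bytes = seconds_per_year * 380     # ~12 GB of transactions per year
expected_flips = yearly_bytes * 8 * failure_rate  # ~1 flip per year

# Each flip invalidates a full ten-minute batch: 600 transactions.
```

So a year of transactions at this very modest rate is roughly one expected-flip's worth of data, and each flip takes a whole ten-minute batch down with it.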

Sadly, tokenization really does represent the worst of all worlds. The system of systems has been significantly complicated by adding an entirely new attack surface (the token generation and lookup). And an already existing vulnerability (bit-level alteration/corruption) has been multiplied to absurdity. With no practical positive implications outside the hype of being “protected by the blockchain,” it’s hard to imagine a worse architectural choice.

Building a solution around tokenization is tantamount to building a house of cards, with snowflakes. Don’t. You’ll thank me later.


1. I posit that, because anything of import requires both a verb and a noun, either flavor of tokenization can be applied to any situation. This is neither useful, nor important, but I’m not the one still reading this note, am I?

2. May I just say that this is one of the most beautifully deceptive phrases in the blockchain space. Kudos to whomever coined it.

3. Accurately.

4. I feel like my disdain for the practical difference between a database and a sticky note here isn’t popping, you know? I’m sorry, I’ll try harder in the future.

5. Someone really should compile a list of all the technologies to have claimed this. Quick, fetch me an intern!

6. Assuming only one transaction per second and that a transaction takes 380 bytes or more to record.


Rob Seger is CTO and Co-Founder of Manifold Technology. Rob has two decades of experience focusing on security, network and cryptographic exploitation. Rob began his career in the government before becoming CTO of Morta Security, later acquired by Palo Alto Networks.