Reducing Matching Engine Market Power with Decentralization

How Blockchain Based Data Markets Level the Playing Field

By Stephen Fiser (with lots of help from Liam Kovatch)


In an article for the MIT Cryptoeconomics Lab, Cathy Barrera, PhD, discusses the impact that Blockchain technology could have on the market power of tech companies.

From her article:

Market power arises when users or customers have few comparable alternative options for sources of the good or service being provided. This gives the seller the ability to raise prices, or in the case of some internet giants to charge transaction fees, compile and sell user data, all as a condition for giving users access to the platform.

I heard Dr. Barrera give an interview elsewhere in which she discussed the three elements of any transaction:

  1. By some process, a buyer and seller are matched (or find each other)
  2. The buyer and seller agree to transact at a particular price
  3. The buyer must submit a payment, and the seller must deliver the agreed upon goods or services

Market power can be seen and evaluated at each step. For example, banks and credit card companies currently hold a great deal of market power with regard to payment processing (step 3 — which cryptocurrencies obviously aim to disrupt).

Likewise, in any kind of marketplace, there can be large companies that maintain market power for steps 1 and 2 on the basis of data control. Collecting and controlling the data empowers them to create a walled garden where they are the sole “matching engine”.

When you order a car in the Uber app, only Uber drivers can see that request and be matched to you, even if a Lyft driver is 10 minutes closer. When you swipe through people on Tinder, you’re only seeing pictures of Tinder users.

This is a bit obvious, because it’s how everything in the digital world currently operates. What would be the alternative?

What if when you ordered a car, that information was transmitted to a decentralized network of drivers — all using different apps?

For a moment, set aside the fact that no market incumbent is going to share their data, and let’s also set aside the nuanced discussion around data privacy. Let’s examine the mechanics and come back to the business considerations.

Some History

0x pioneered the idea of maker-taker transactions on the Blockchain. The idea is essentially to have a “maker” create an order (offer to trade some tokens) off-chain and sign it with their private key. This lets us prove that they are in fact the author of the order. Then, at some later point, a “taker” can execute the transaction on-chain. This means there is only one Blockchain transaction instead of two.
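The maker-taker flow can be sketched in a few lines of Python. One caveat: real 0x orders are hashed per EIP-712 and signed with ECDSA over secp256k1; the HMAC below is a standard-library stand-in for that signature, and all field names are illustrative.

```python
import hashlib
import hmac
import json

def sign_order(order: dict, private_key: bytes) -> dict:
    """Hash the order's canonical JSON off-chain and attach a signature.

    Stand-in scheme: an HMAC plays the role of the maker's ECDSA
    signature so the sketch runs on the standard library alone.
    """
    payload = json.dumps(order, sort_keys=True).encode()
    order_hash = hashlib.sha256(payload).hexdigest()
    signature = hmac.new(private_key, payload, hashlib.sha256).hexdigest()
    return {**order, "orderHash": order_hash, "signature": signature}

def verify_order(signed: dict, private_key: bytes) -> bool:
    """Recompute the signature to confirm the maker authored the order."""
    order = {k: v for k, v in signed.items()
             if k not in ("orderHash", "signature")}
    payload = json.dumps(order, sort_keys=True).encode()
    expected = hmac.new(private_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["signature"])

# The maker signs off-chain; anyone can later check authorship.
maker_key = b"maker-secret-key"
order = {"makerToken": "ZRX", "takerToken": "WETH",
         "makerAmount": 100, "takerAmount": 1}
signed = sign_order(order, maker_key)
print(verify_order(signed, maker_key))  # True: the maker authored this order
```

The key property is that the signature travels with the order, so a taker who receives it from any channel can check authorship before spending a single on-chain transaction to fill it.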

This improves performance in two ways. First, it cuts the number of on-chain transactions in half (a 2X improvement) for orders that are filled. Second, not every order that is made is taken; since orders are made off-chain, unfilled orders never touch the chain at all, so every market using this model sees an improvement beyond 2X.
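The arithmetic behind those two effects can be checked directly. This is a sketch; the fill rate is a free parameter, not a figure from any real market.

```python
def speedup(orders_made: int, fill_rate: float) -> float:
    """Speedup of the maker-taker model over a fully on-chain market."""
    # Fully on-chain: one tx to post each order, one more per filled order.
    onchain_model = orders_made + orders_made * fill_rate
    # Maker-taker: posting happens off-chain, so only fills touch the chain.
    maker_taker_model = orders_made * fill_rate
    return onchain_model / maker_taker_model

print(speedup(1000, 1.0))  # 2.0 — every order filled: exactly the 2X baseline
print(speedup(1000, 0.5))  # 3.0 — half of orders filled: savings exceed 2X
```

The lower the fill rate, the larger the advantage, since unfilled off-chain orders cost the chain nothing.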

This led to the idea of a “relayer” — someone who tells people about orders. This is typically a website (like Radar Relay) that keeps an order book. You can place an order on their website, and they will show your order to other website visitors who may choose to fulfill it.

In this context, both steps 2 and 3 of the transaction are decentralized, but step 1 is centralized because the relayer controls the data. Projects like Paradigm aim to decentralize step 1 (relayers) with their Order Stream network. In “Global Liquidity, Motivated”, Paradigm CEO Liam Kovatch argues:

Global liquidity can create liquidity pools larger than that of any single exchange, and promote a new level of efficiency and liquidity within exchange markets.

In effect, this is an argument for reducing the market power of any particular relayer. This is achieved by developing a decentralized network of validator nodes. Validator nodes look at the contents and signature of orders, vote on their validity, and broadcast valid orders over the web to anyone listening.
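A validator node's job can be sketched roughly as follows. The validity predicate and the simple-majority rule here are deliberate simplifications; real Order Stream validators verify cryptographic signatures and maker balances, and their consensus mechanism is more involved.

```python
from dataclasses import dataclass

@dataclass
class ValidatorNode:
    """Toy validator: inspects an order and casts a validity vote."""
    name: str

    def vote(self, order: dict) -> bool:
        # Stand-in check: a signature is present and amounts are positive.
        return (
            "signature" in order
            and order.get("makerAmount", 0) > 0
            and order.get("takerAmount", 0) > 0
        )

def broadcast_if_valid(order: dict, validators: list, subscribers: list) -> bool:
    """Broadcast the order to all listeners if a majority of votes pass."""
    votes = [v.vote(order) for v in validators]
    if sum(votes) > len(votes) / 2:      # simple majority of validators
        for deliver in subscribers:       # push to every listening app
            deliver(order)
        return True
    return False

received = []
validators = [ValidatorNode("v1"), ValidatorNode("v2"), ValidatorNode("v3")]
order = {"signature": "0xabc", "makerAmount": 100, "takerAmount": 1}
broadcast_if_valid(order, validators, [received.append])
print(len(received))  # 1 — the order passed the vote and reached the subscriber
```

The point of the structure is that no single node decides what counts as a valid order, and anyone can subscribe to the resulting stream.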

Decentralized Realtime Data Markets

Now that we have a bit of background, we can apply this concept to broader topics. Consider the following workflow:

  1. User submits a request for a ride from point A to point B into an app
  2. The app they are using broadcasts the request to something akin to the Paradigm Order Stream
  3. Many different listening apps receive the request, and try to match the request with a driver
  4. The user is presented with a list of options in the app they are using
  5. They choose a ride and make a payment

(Depending on the context and model, the data could be private and only accessible via purchase, or it could be completely public.)
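The five steps above can be sketched as a tiny publish-subscribe loop. Everything here is hypothetical: `OrderStream` stands in for something like Paradigm's Order Stream, and the app names and offer fields are made up for illustration.

```python
class OrderStream:
    """Minimal stand-in for a shared, decentralized request stream."""

    def __init__(self):
        self.subscribers = []  # driver-side apps listening for ride requests

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def broadcast(self, request):
        # Steps 2-3: every listening app sees the request and may return offers.
        offers = []
        for handler in self.subscribers:
            offers.extend(handler(request))
        return offers

stream = OrderStream()
# Two hypothetical driver apps, each with its own pool of drivers.
stream.subscribe(lambda req: [{"app": "RideCo", "eta_min": 4, "price": 12.50}])
stream.subscribe(lambda req: [{"app": "DriveNow", "eta_min": 9, "price": 10.00}])

# Step 1: a rider app broadcasts a request; step 4: it ranks the offers.
offers = stream.broadcast({"from": "A", "to": "B"})
best = min(offers, key=lambda o: o["eta_min"])
print(best["app"])  # RideCo — the closest driver wins, whichever app they use
```

Notice that the rider's app never needed its own driver pool; the stream supplied the other side of the market.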

In a data-sharing model like this, you’d expect to see fewer giant tech companies and more small- to medium-sized competitors. This is because any new company can jump into the market with one side of its liquidity provided immediately. Much of the market power shifts to the network itself and away from any single company.

One of the biggest challenges that marketplace startups face is that they need to acquire both sides of the market as users. This is a classic chicken-and-egg problem: if there are no riders, how do you get drivers? If there are no drivers, how do you get riders?

With a decentralized data stream, if you want to build an app for drivers, you could immediately tap into a stream of requests for rides that your users could fulfill — your app could be useful from day one for even just one user.

The Road Ahead

The remaining concern that needs to be addressed is how to incentivize companies to give away their data. I think it’s safe to say that big tech companies simply won’t do this. If this ever takes off, it will likely be in the domain of startups.

That said, an incentive model needs to be developed that rewards companies enough to join a network and, once they have reaped the benefits, discourages them from leaving. This could occur naturally if requests from external apps come to make up a significant portion of a company’s transactions: a new entrant benefits from that volume immediately, and many would likely become dependent on it over time.

Liam also points out:

The key for this to work is a two sided incentive structure focused on encouraging data sourcing (from makers) and data demand (from takers). In both cases these parties can be considered “originators”.

What incentives do these entities have? Well, signing orders allows us to include “relayer fees”, which can easily be thought of as “originator fees”. Entities creating taker demand can also define affiliate fees for users of their interface. Together these two fee schedules can combine to create two-sided network effects around a decentralized real time data market.
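That two-sided fee structure can be sketched as a simple settlement calculation. The basis-point values and field names below are illustrative only, not drawn from any protocol’s actual fee schedule.

```python
def settle_fees(trade_value: float,
                originator_fee_bps: int,
                affiliate_fee_bps: int) -> dict:
    """Split fees on a filled order between the maker-side originator
    (who sourced the order) and the taker-side interface (the affiliate).
    Fees are quoted in basis points (1 bps = 0.01%)."""
    originator_fee = trade_value * originator_fee_bps / 10_000
    affiliate_fee = trade_value * affiliate_fee_bps / 10_000
    return {
        "originator": originator_fee,
        "affiliate": affiliate_fee,
        "taker_pays": trade_value + originator_fee + affiliate_fee,
    }

print(settle_fees(1_000.0, 10, 5))
# {'originator': 1.0, 'affiliate': 0.5, 'taker_pays': 1001.5}
```

Sourcing data earns the originator fee and bringing taker demand earns the affiliate fee, so both sides of the market are paid for the liquidity they contribute.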

Finally, there could also be additional layers like token-curated registry (TCR) systems, where stakeholders can vote people in and out of the network — so if someone isn’t playing by the rules, they can get kicked out.

My thinking will be evolving on this over time, and I’ll likely be writing more about it.