Layer 2 Playgrounds
L2s are evolving quickly and are now very easy to try. Here’s a short product guide to the L2s available for wide adoption.
- Why L1 → L2?
- Product dimensions
- Dimension 1: What’s an L2?
- Dimension 2: Validation Method
- Dimension 3: Market Specialization
- Dimension 4: Ecosystem Progress
- Important Limitations and Risks 🚧
- Further Reading, Endnotes, Disclaimers
- ⭐️ More: Super Quick L2 Quickstart ⭐️
This post is about layer-two (L2) scaling, with a focus on Ethereum. I write for the beginner-to-intermediate blockchain enthusiast—someone with experience transacting on Ethereum or another main chain, but still less aware of L2 and its progress.
The focus of this post is to describe L2 solutions as kinds of products. In another post, I’ve argued that thinking in these terms is helpful. Product-thinking can help to sharpen scrutiny and consumer caution. Products compete. They have favored or disfavored features. Thinking in these terms could encourage careful weighing of potential costs and benefits. Product assessment and comparison can happen when we search for virtually anything, from a fresh pen to a new laptop. We sort, save, scrutinize and filter. Why not structure our decisions like this for all crypto products?
As a product class, the second layer (L2) is exciting. To start, you have to choose which L2 to send your L1 (Ethereum) assets to. This is called “bridging.” With the recent release of some L2 solutions, many thousands of participants on Ethereum have bridged funds to some chosen L2 (see illustration below). Bridging is like entering a world running in parallel with the main chain and its offerings. When you transact, it can feel like blockchain teleportation — transactions fly with little delay or cost.
With The Merge coming soon to Ethereum, and questions about scaling at the forefront, let’s start with why L2s are an important new product category. It’s a familiar story, but an important one to revisit.
Why L1 → L2?
On a particular weekend in early May 2022, someone sat down at their computer excited to transact on Ethereum. They may have planned to trade some tokens or pick up a mint. Maybe they took a sip of warm coffee, opened up their browser and plugged in their shiny hardware wallet, ready to go. They set up Uniswap for a simple trade and clicked “Swap.” They signed the transaction with those mildly audible clicks of the wallet’s little buttons. Done. But when checking the chain on Etherscan, they gasped: their swap of $282.60 cost $183.57 in fees.
This is a familiar story to those calloused by fees. But on that weekend in May it was historically bad. Someone paid over $4,000 in fees to secure a Purrnelopes NFT (price: $1,000). Another spent over $400 in fees just to send $30 in ether. The base fee went above 8,000 gwei, the highest ever. Something was awry. The cause, of course, was a certain NFT mint.
That weekend served as further evidence that proof-of-work blockchains can face serious scaling challenges. It is not unique to Ethereum. A famous example in Bitcoin’s history was when Erik Voorhees popularized Satoshi Dice, a gambling game on chain. Launched a decade ago, Satoshi Dice was a popular game for many, but other Bitcoiners were up in arms about it. It choked the network, leading to high fees and slow transactions. Such events in Bitcoin’s history inspired its famous block-size war and the development of Lightning Network.
What can protocol architects do about this?
Some have designed new protocols and chains that overcome these challenges using new consensus mechanisms. They must, because scaling is in tension with the goals of decentralization and security: the famous trilemma. Blockchains have historically maximized decentralization and security at the expense of scaling. So another strategy is to use that hard-won security as a basis for new development. Why not build on top of the L1, creating a “second layer” (L2) that is less constrained by these scaling issues while leveraging the security of the L1?
This post is about this emerging ecosystem of L2 solutions, focusing on Ethereum. Like my prior post about stablecoins, I will portray L2s as products that help users scale their on-chain activity. Let’s think of them as having particular product dimensions. These dimensions can organize how we think about L2 scaling and the products available for us to try.
Dimension 1: What’s an L2?
The first dimension often discussed for L2 solutions is the definition of L2 itself, because it has particular technical and security implications. An L2 is often defined as a ledger that is directly tied to the security of the L1 ledger. It helps to think through the consequences of this definition in practical, functional terms. On Lex Fridman’s podcast, Vitalik Buterin offers an example for a type of L2 called a rollup¹:
There is an important difference in security models between rollups and side chains which is basically that rollups inherit from the security of Ethereum so if I have coins inside of Loopring or Optimism or Arbitrum or ZkSync then even if everyone else in the world who is participating in these ecosystems hates me and wants to steal my money I can still personally make sure that I can get my money out.
So the first distinction in product terms is that a “proper” L2 has cryptographic assurances that protect your assets. These assurances are tied directly to the main L1 chain (here, Ethereum). As a product, this gives L2 platforms the feel of still working within Ethereum’s wider ecosystem.
But if the technology and its user interfaces and communities are evolving, what other aspects of L2s should we consider?
There’s a great new resource to help with these questions. The L2BEAT research team has created a rich resource with real-time data updates on L2s for Ethereum. There are detailed explainers on this site, including information about each L2. From their FAQs:
We had to draw the line somewhere and — in the current version — we define L2 as a chain that fully or partially derives its security from L1 Ethereum so that users do not have to rely on the honesty of L2 validators for the security of their funds. This is in line with the current view of ethereum.org on what layer 2 scaling is.
Dimension 2: Validation Method
Under this definition of an L2, there are several methods for securing assets while tethering to L1. The most common at the time of this writing is the “rollup” approach. Rollups are fairly easy to understand intuitively: an L2 combines transactions into batches by “rolling them up.” Batching saves data and speeds things up. The speedup can be especially dramatic because of technical developments in zero-knowledge proofs. Because these proofs allow us to prove (or falsify) the validity of a whole batch of rolled-up transactions at once, there can be great gains in data-storage efficiency and speed.
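To see why rolling up saves cost, here is a toy back-of-the-envelope sketch. Every gas number below is invented purely for illustration (real costs depend on the rollup’s compression and the L1 fee market); only the shape of the arithmetic matters: a fixed per-batch overhead is amortized across many cheap, compressed transactions.

```python
# Toy illustration (not a real protocol): why batching ("rolling up")
# transactions saves cost. All gas numbers below are made-up assumptions.

COST_PER_TX_ON_L1 = 21_000   # hypothetical gas to post one tx directly on L1
BATCH_OVERHEAD = 100_000     # hypothetical fixed gas to post one rollup batch
COST_PER_TX_IN_BATCH = 2_000 # hypothetical gas per compressed tx inside a batch

def l1_cost(n_txs: int) -> int:
    """Total gas if every transaction is posted to L1 individually."""
    return n_txs * COST_PER_TX_ON_L1

def rollup_cost(n_txs: int) -> int:
    """Total gas if the transactions are rolled up into a single batch."""
    return BATCH_OVERHEAD + n_txs * COST_PER_TX_IN_BATCH

for n in (10, 100, 1000):
    savings = l1_cost(n) / rollup_cost(n)
    print(f"{n} txs: {savings:.1f}x cheaper when batched")
```

The larger the batch, the more the fixed overhead is diluted, which is the intuition behind the data savings the rollup approach promises.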
There are two main ways to validate a rollup², and they may seem very different in what they offer as products. One is called an “optimistic rollup,” and the second a “ZK rollup.”
On an L2 that uses optimistic rollups, transactions wait on the L2 for a period of time (the challenge period). There is a process for adjudicating any potentially bad behavior, by allowing participants on the L2 to prove that one or more changes were made fraudulently. In this optimistic scheme, if a batch of transactions has survived the challenge period, then it is indirectly “validated.” The second type, the “ZK rollup,” validates a bundle of transactions as they are being rolled up. This has the benefit of confirming their soundness much sooner, but it can be more computationally expensive for the L2. Vitalik Buterin has a great summary of these two validation types³.
By volume, the two largest L2s at the time of this writing, Arbitrum and Optimism, use optimistic rollups. This means that when you transact on their chain, a final L1 commitment of the data will not be completed until the challenge period has passed (7 days). There are exceptions to this rule, because tools can be devised that skirt the wait period for some things (like bridging). The next two largest use the “ZK rollup” method: dYdX and Loopring. In these L2s, commitments are carried out immediately (though, in practice, it can still take minutes or hours to bundle the transactions and commit their associated data to the L1 ledger).
A possibly unpopular take on all this: in neither case does the wait really deter users while they conduct business on the L2. The challenge period only concerns commitment to L1. From the “perspective” of the L2, for both types of rollup, your transactions seem to be processed almost instantly, and for a fraction of L1 gas costs.
But if you are coding an app that will communicate between L1 and L2, this dimension may be more important. With an optimistic rollup, a contract on L2 has to wait the 7-day challenge period before L1 gets full confirmation of its updates. But if you’re a regular user on an optimistic L2 without such technical considerations, mostly you don’t notice any of this. Even when seeking to deposit or withdraw from an optimistic rollup, there are now third-party services that help get around the challenge period.
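As a toy model of that developer-facing consideration, here is a sketch of the challenge-window logic an optimistic rollup implies. This is illustrative only — it is not any real rollup’s API, and the 7-day figure simply mirrors the challenge period mentioned above: a posted batch only becomes final on L1 once the window elapses without a successful fraud challenge.

```python
# Illustrative model (not a real rollup API) of optimistic finality:
# a batch posted to L1 is final only after an unchallenged waiting period.

from dataclasses import dataclass

CHALLENGE_PERIOD_DAYS = 7  # matches the challenge period discussed in the text

@dataclass
class Batch:
    posted_on_day: int        # day the batch's data was posted to L1
    challenged: bool = False  # set True if a fraud proof succeeds

    def is_final(self, today: int) -> bool:
        """Final once the challenge window has passed without a challenge."""
        if self.challenged:
            return False
        return today - self.posted_on_day >= CHALLENGE_PERIOD_DAYS

batch = Batch(posted_on_day=0)
print(batch.is_final(today=3))  # still inside the window
print(batch.is_final(today=7))  # window elapsed without challenge
```

A ZK rollup collapses this waiting logic: the validity proof posted with the batch plays the role of an already-won challenge, so `is_final` would not need a time check at all.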
So the next two dimensions may be more important from the perspective of a consumer assessing these products.
Dimension 3: Market Specialization
Consumer psychology suggests that one of the earliest parts of product assessment is problem recognition. This involves specifying why a given product is of interest and what particular goals will be pursued with it. Being explicit matters. In the case of L2s, I’ve experimented with several, and an immediate question you ask as a potential user is what kind of transactions you want to make on that layer.
There are two general types. Borrowing again from the great L2BEAT, there are second layers building universal functionality, and second layers seeking specialization.
- Universal: Several chains, including the two largest, Arbitrum and Optimism, aim for universal coverage—full EVM equivalence, like a second Ethereum that is faster and cheaper and stays anchored to its L1. If you are releasing an NFT project or a token or any specialized smart-contract system, you can consider universal L2s as a potential platform for easy deployment. These platforms have elaborate “ecosystem” sections on their websites: NFT marketplaces, Uniswap for the second layer, DAOs, oracles and so on (see Arbitrum’s and Optimism’s).
- Specialized: Interestingly, the next largest L2s, dYdX and Loopring, use ZK rollups and are more specialized. They both focus on serving as an exchange. Their platforms are geared toward particular kinds of transactions, which they make faster and cheaper; in the world of DeFi, fast settlement is key. Immutable X is an example of moving NFTs to a validium L2 (basically a ZK rollup, but with its data off chain²). Immutable now hosts several large projects, including the famous Gods Unchained card game.
When a second layer is designed for market specialization, you might find its ease of use and penetration in that market segment most desirable. Answering that initial question about why you’d like to engage the second layer will help consumers identify candidate products. If you are interested in DeFi activities, the range of best options may be different than if you’re looking to trade some NFTs.
Dimension 4: Ecosystem Progress
The second layers mentioned above have evolved significantly since they launched. Just recently, Optimism made a major project announcement, and some ZK rollup platforms have announced significant advances in the cryptographic tools for their platforms.
One major issue is the progress of these systems and how many users they onboard. Two perennial principles are at work here: preferential attachment and network effects. The more people who use L2s (and the more often they use them), the more desirable the platforms will be perceived to be, and the faster they can continue to grow. Assessing L2s on this dimension depends on some basic raw ingredients of ecosystem progress:
i. Volume on the platform
ii. Unique users on the platform
A consumer assessing these products has to balance the size of a given user base against the specific functionality they’re looking to achieve. A finely tuned platform for a specific purpose (like an NFT platform) is not useful if no one else is on it. L2BEAT again can offer some idea of the activity on these chains (though imperfectly: by value locked). Here’s a great Dune dashboard by gm365 with lots of related metrics for four L2s.
But even if there is little volume, we can ask how promising a platform is based on a number of other basic features of software products. These can shape the likelihood that the platform will attract other customers:
iii. UX for bridging, transacting and withdrawing
iv. General UX of the platform and hosted apps
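The filtering the last few paragraphs describe — first narrow candidates by the market segment you identified under Dimension 3, then rank survivors by an activity metric from Dimension 4 — can be sketched mechanically. The L2 names and user counts below are entirely hypothetical placeholders, not real data.

```python
# Hypothetical data, for illustration only: a consumer-style shortlist that
# filters candidate L2s by market segment, then ranks by unique users.

candidates = [
    {"name": "Universal-A", "segment": "universal", "unique_users": 250_000},
    {"name": "Universal-B", "segment": "universal", "unique_users": 120_000},
    {"name": "Exchange-C",  "segment": "exchange",  "unique_users": 60_000},
    {"name": "NFT-D",       "segment": "nft",       "unique_users": 15_000},
]

def shortlist(candidates: list[dict], segment: str) -> list[dict]:
    """Keep only L2s serving the desired segment, most active first."""
    matches = [c for c in candidates if c["segment"] == segment]
    return sorted(matches, key=lambda c: c["unique_users"], reverse=True)

for c in shortlist(candidates, "universal"):
    print(c["name"], c["unique_users"])
```

In practice you would feed in live numbers from a source like L2BEAT or a Dune dashboard rather than hand-written entries, but the decision structure is the same.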
The UX issue with L2s is not only about moving your assets across the bridge. Independent teams are building new apps and tools for these L2s, and those have UX considerations of their own. Some apps have been deploying on many second layers; for example, the tofuNFT marketplace can be used to transact on many chains and second layers. Any such app will have its own UX feel, and we may end up choosing an L2 for a subset of them.
Each L2, and each application, further shrinks the user base and liquidity from the L1. This is a major limitation as we move forward in this ecosystem. The technical challenges to bridging and UX are key, but just as important is the extent of activity and the number of unique users on an L2 platform engaging with hosted products. This kind of bootstrapping is an ingredient that depends in great part on us—to take the dive into L2.
Important Limitations and Risks
The above are the primary product dimensions: security of L2, validation type, market specialization, and ecosystem progress. You can highlight the positives of various offerings, sort and sift, and make a product selection. But there are critical limitations and risks that should be held in mind, too. Some of these pertain to the entire L2 ecosystem itself. Here are a few important issues these products are sorting out.
Impressively, L2BEAT lists potential risks for each and every L2 they track. They do so on a tab that lets you compare the roster of L2 options.
Here’s a typical listing of risks for second layers that use the optimistic rollup method: at the moment, we have to assume there is at least one honest, capable validator, that the operator is not engaging in MEV practices, and so on. These are reasonable assumptions, but as with all things in crypto, we prefer not to have to make them.
The very recent Nomad bridge drain illustrates the risks of bridges. It may have been caused by a contract update, and it permitted a drain of the bridge contract’s entire pool of assets (about $190,000,000). Centralization is another serious concern: some of these products remain relatively centralized. For example, some of the major L2s run a single sequencer (validator), with decentralization upgrades only planned for the future. And the Axie Ronin bridge was compromised by an attack engineered on the signers of its multisig contract.
These risks are therefore significant. The most secure location for any asset is on L1. In some ways, the L2 can be regarded as a playground in which you carry out bouts of faster, cheaper engagement with an ecosystem that interests you; this may be especially true for DeFi, since DeFi assets and ERC20 tokens (for example) can cross bridges. Then, when ready for longer-term holding, you return your assets to L1. This can be more difficult for NFTs: the “native” NFTs on L2s reside there, though fractionalized and wrapped NFTs may allow this kind of participation on L2 as well.
Further Reading, Endnotes, Disclaimers
- A strong recommendation: start with L2BEAT. Their FAQ is punchy and perfect for getting into a bit more detail than presented here. Also check out the associated L2Fees for information about transaction costs.
- ⭐️ Here’s a super quick L2 quickstart. ⭐️
- Georgios Konstantopoulos has an amazing technical introduction to optimistic rollups @ Paradigm (see also the rollup breakdown here).
- Vitalik Buterin has a definitive guide to rollups and relating them to other L2 methods.
- Galaxy Research has an elaborate and clear summary of the L2 ecosystem here. Other great ones on L2s and rollups: Quantstamp, Amber Group, Matter Labs, Immutable X, Delphi Digital
- Nazar Ilamanov has a nice technical summary of L2 (Optimism focus)
- Ethereum.org’s scaling article is very clear and includes excellent video pointers.
- The situation is more complicated than this though. If everyone using and supporting a given L2 hates you, then they might cease to support the L2 and close down all its user interfaces. For most people, engaging the L2 directly would be almost impossible without special training. Vitalik goes into some detail in that podcast about these tradeoffs (for example, side chains may have more freedom to fashion their protocol for usability and so on).
- Another feature of these validation types is whether data about these transactions is stored on the chain or off (known as the dimension of data availability). I will ignore this distinction for the purpose of this illustration, but there are excellent summaries of the distinction on L2BEAT, along with a great new post illustrating the complex space of rollups at Delphi by Jon Charbonneau.
- There has recently been some discussion about Vitalik’s estimates and the current progress in optimizing L2. See discussion in this great thread from Sanjay Shah.
This article is only to share my opinion and shouldn’t be regarded as advice about anything at all except maybe fun (but maybe not even that, at least not for everyone). I’m on Twitter. I was not compensated for this survey. I wrote it for fun. I hope you found it useful.