The layer-2 (L2) scaling solutions ecosystem is booming. Can crypto achieve scalability without sacrificing decentralization?
Congestion and high transaction fees on established blockchains like Ethereum (ETH) and Bitcoin (BTC) have created a need for additional solutions to handle the increased demand. L2 networks such as Arbitrum (ARB), Optimism (OP), and Polygon (MATIC) emerged as an attempt to enhance transaction capacity while keeping operations smooth and orderly.
In short, layer-2 solutions are additional protocols or frameworks constructed on existing blockchains to improve scalability and transaction throughput. They come in various forms, such as rollups, state channels, and sidechains.
They alleviate the computational load on the main chain by offloading it to a secondary layer while ideally ensuring security and decentralization.
Optimistic rollups, like Arbitrum and Optimism, take a trust-but-verify stance, treating transactions as valid unless a challenge proves otherwise.
Zero-knowledge rollups, like zkSync, perform calculations away from the main chain and then submit proof that everything checks out.
These solutions accomplish scaling by processing thousands of transactions off-chain and then bundling them into a single transaction on the main chain. This action effectively diverts the transactional load onto their parallel network, easing congestion on the mainnet.
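The bundling step can be sketched in a few lines of Python: many off-chain transactions are hashed and folded into a single Merkle root, and only that compact commitment lands on the main chain. This is an illustrative toy, not any rollup's actual batch encoding; the transaction fields and hashing scheme here are assumptions.

```python
import hashlib
import json

def tx_hash(tx: dict) -> str:
    """Deterministic hash of a single transaction (toy encoding)."""
    return hashlib.sha256(json.dumps(tx, sort_keys=True).encode()).hexdigest()

def merkle_root(hashes: list[str]) -> str:
    """Fold a list of transaction hashes into one Merkle root."""
    if not hashes:
        return hashlib.sha256(b"").hexdigest()
    level = list(hashes)
    while len(level) > 1:
        if len(level) % 2:                 # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [
            hashlib.sha256((a + b).encode()).hexdigest()
            for a, b in zip(level[::2], level[1::2])
        ]
    return level[0]

# Thousands of off-chain transactions...
txs = [{"from": f"user{i}", "to": "dex", "value": i} for i in range(1000)]

# ...are committed to the main chain as one compact batch.
batch = {"count": len(txs), "root": merkle_root([tx_hash(t) for t in txs])}
```

One 64-character commitment stands in for a thousand transactions, which is where the fee savings come from.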
Yet, prominent figures, including Ethereum’s co-creator, Vitalik Buterin, have recently voiced concerns about centralization and censorship in L2 solutions.
Pseudonymous blockchain researcher Andy recently took to X, stating that decentralization had been sidelined for “immediate feedback loops, accessibility, and user acquisition.”
In their opinion, the current L2 stack differs significantly from the idealized version promoted by its backers.
The growing conundrum
As the demand for blockchain scalability intensifies, many layer-2 solutions have sprung forth, offering varied approaches to tackle the scalability, security, and speed trilemma.
According to data from layer-2 analytics platform L2Beat, there are currently 37 active layer-2 projects, measured by user numbers, transaction activity, and total value locked (TVL). Another 36 are upcoming, and 11 projects have been archived.
Analysts estimate that by the end of the year, there could be more than 100 and even as many as a thousand L2s to address Ethereum’s scalability issues.
Yet, as the ecosystem expands, concerns arise about increasing centralization within these solutions. It’s a paradox: seeking to decentralize but inadvertently embracing centralization.
This concern goes beyond philosophy; it may challenge what makes blockchain robust, transparent, and resistant to censorship.
The L2 solutions offer scalability while potentially compromising the core principles of decentralization. Is this sacrifice necessary, or can we strike a balance that preserves this delicate equilibrium?
Navigating the sequencer dilemma
A key component of these L2 networks is the sequencer, which bundles user transactions and sends them to Ethereum.
Sequencers verify, arrange, and compress transactions into a package that can be transported to the layer-1 chain. For this service, they receive a small portion of the fees collected from users.
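The sequencer's verify-order-compress loop, and the fee cut that makes running one profitable, can be sketched as follows. Everything here is hypothetical — the `Tx` fields, the 10% cut, and the fee-priority ordering are illustrative assumptions, not any network's real parameters.

```python
from dataclasses import dataclass

@dataclass
class Tx:
    sender: str
    nonce: int
    fee: int              # fee offered by the user (toy units)

SEQUENCER_CUT = 0.1       # hypothetical 10% share of collected fees

def sequence(mempool: list[Tx]) -> tuple[list[Tx], float]:
    """Verify, order, and batch pending transactions; return the batch
    plus the sequencer's revenue from it."""
    valid = [tx for tx in mempool if tx.fee > 0]        # trivial validity check
    ordered = sorted(valid, key=lambda tx: -tx.fee)     # fee-priority ordering
    revenue = sum(tx.fee for tx in ordered) * SEQUENCER_CUT
    return ordered, revenue
```

Note that the ordering step is exactly where the discretion lies: a single operator choosing `sorted(...)` one way or another is the front-running and censorship surface critics point to below.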
The technology plays an important role in the functioning of L2s, making them faster, less expensive, and more user-friendly.
Critics argue that today’s sequencers are usually run by centralized entities, representing potential failure points and vectors for transaction censorship. There have also been suggestions that the profitable nature of running sequencers may inadvertently discourage decentralization.
Speaking to crypto.news, Kelsey McGuire, Chief Growth Officer at EVM-based smart contract platform Shardeum, opined that the centralization of some layer-2 platforms could lead to an increased reliance on specific validators and sequencers, creating a scenario where a handful of participants wield disproportionate influence over the network.
Such a scenario could even create rifts in the crypto community between those willing to sacrifice a level of decentralization and those who see themselves as decentralization purists.
In her opinion, sequencers control transaction ordering, creating concerns around front-running and censorship. McGuire suggested that relying exclusively on such sequencers could lead to an industry where only a few entities hold significant influence, undermining decentralization across the board.
“L2s that do care about decentralization should continue to focus on finding ways to ensure that all the power and influence doesn’t sit within the hands of just a few entities.”
Kelsey McGuire, Chief Growth Officer, Shardeum
A recent Binance report also highlighted the risks the current centralized sequencer systems pose, including the potential abuse of transaction order control and the possibility of economic harm to users. For instance, the entire L2 is impacted if a centralized sequencer fails.
Some L2s also lack fraud proofs, although others, including the popular Optimism rollup, are currently developing such systems.
Fraud proofs are layer-1 algorithms that validate the accuracy of layer-2 transactions. Many rollup networks “borrow” Ethereum’s security through these fraud proofs, enabling Ethereum validators to verify that an L2 network is functioning correctly.
Some analysts have suggested that without fraud proofs, L2 networks are essentially asking users to trust the operators' own security measures instead of Ethereum's.
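The core idea behind a fraud proof — any challenger re-executing a batch with the same deterministic state-transition function and flagging a mismatch — can be sketched in miniature. This is a toy model of the concept, not Optimism's actual fault-proof protocol; the balance-transfer state machine is an assumption for illustration.

```python
def apply_batch(state: dict, batch: list) -> dict:
    """Deterministic state-transition function shared by the L2 sequencer
    and any L1 verifier; balances move only when funds are sufficient."""
    new = dict(state)
    for sender, receiver, amount in batch:
        if new.get(sender, 0) >= amount:
            new[sender] = new.get(sender, 0) - amount
            new[receiver] = new.get(receiver, 0) + amount
    return new

def is_fraudulent(pre_state: dict, batch: list, claimed_post: dict) -> bool:
    """A challenger recomputes the batch; any mismatch proves fraud."""
    return apply_batch(pre_state, batch) != claimed_post

pre = {"alice": 100, "bob": 0}
transfers = [("alice", "bob", 60)]
honest_claim = {"alice": 40, "bob": 60}
fraud_claim = {"alice": 40, "bob": 1_000_000}   # sequencer mints bob extra funds
```

Because the transition function is deterministic, a single honest challenger is enough to catch a dishonest sequencer — which is why the absence of such proofs pushes the trust back onto the operator.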
Other L2s also lack what experts describe as an “escape hatch” for users to transfer their funds back to Ethereum if a sequencer fails. Without this, there’s a risk of users losing their funds if something goes wrong.
Ethereum’s centralization issues extend beyond L2 centralization. Its transition to the proof-of-stake (PoS) consensus mechanism created new centralization headaches for the network.
Under PoS, validators are chosen in proportion to the amount of ETH they have staked. This has led to hyper-scale staking platforms such as Lido, which currently accounts for as much as 20% of Ethereum's total value locked (TVL) through its liquid staking derivative, stETH.
Lido also operates one in every three Ethereum validators, leading many to question the excessive dependency on such centralized staking platforms, which ultimately contradicts the Ethereum community’s ethos of decentralization.
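The concentration effect follows directly from stake-weighted selection: a pool's share of proposer slots tracks its share of stake. The sketch below uses made-up numbers chosen only to mirror the rough one-in-three ratio cited above — they are not Lido's real figures.

```python
import random

# Hypothetical distribution: one large pool vs. many 32-ETH solo stakers.
stakes = {"big_pool": 320_000}                              # one-third of all stake
stakes.update({f"solo_{i}": 32 for i in range(20_000)})     # 640,000 ETH of solo stake

def sample_proposers(stakes: dict, n: int, seed: int = 42) -> list[str]:
    """Draw n proposer slots; selection probability is proportional to stake,
    as in proof-of-stake leader election."""
    rng = random.Random(seed)
    validators = list(stakes)
    weights = [stakes[v] for v in validators]
    return rng.choices(validators, weights=weights, k=n)

picks = sample_proposers(stakes, 10_000)
pool_share = picks.count("big_pool") / len(picks)   # roughly the pool's stake share
```

With a third of the stake, the pool proposes roughly a third of all blocks — no protocol rule is broken, yet one operator sits behind every third validator slot.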
The solutions in place
Several solutions are being proposed to address these centralization issues, chief among them shared sequencers and direct decentralized sequencing.
Shared sequencers are networks serving multiple L2s, promoting interoperability and composability. In contrast, direct decentralized sequencing allows each L2 to have its own set of sequencers, allowing for more customization and control.
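The interoperability argument for shared sequencing comes from the single global ordering: because one sequence spans several rollups, transactions on different L2s can be given consistent relative positions. A toy sketch of that multiplexing (hypothetical fields and fee-priority ordering, not any shared sequencer's real design):

```python
from collections import defaultdict

def shared_sequence(mempool: list[dict]) -> dict:
    """One sequencer orders transactions for several rollups in a single
    global sequence, then splits per-rollup batches out of it."""
    ordered = sorted(mempool, key=lambda tx: tx["fee"], reverse=True)
    per_rollup = defaultdict(list)
    for pos, tx in enumerate(ordered):
        # Each rollup's batch remembers its slots in the shared ordering.
        per_rollup[tx["rollup"]].append({**tx, "global_pos": pos})
    return dict(per_rollup)

mempool = [
    {"rollup": "A", "fee": 5},
    {"rollup": "B", "fee": 9},
    {"rollup": "A", "fee": 7},
]
batches = shared_sequence(mempool)
```

The shared `global_pos` is what makes cross-rollup composability conceivable; under direct decentralized sequencing, each L2 would run this loop independently with its own validator set.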
There are reports that Coinbase and other rollup platforms plan to adopt decentralized sequencers, even as fears abound that large-scale implementation of the technology may compromise speed and security.
Projects like Espresso and Radius are currently developing shared sequencing solutions, each with unique features in their respective architectures.
McGuire, who believes sharing is caring, at least as far as decentralization is concerned, thinks the shared sequencer route may be the best way forward in the L2 space. She feels that a number of the challenges facing L2s could have been avoided had such solutions been baked into the underlying L1s from the start.
In his post on the Ethereum Magicians forum, Vitalik Buterin introduced a tiered framework, ranging from stage zero through stage two, to systematically evaluate the level of decentralization inherent in various L2 networks.
This framework acknowledges the practical necessity for nascent L2s to temporarily employ certain centralized mechanisms—akin to “training wheels”—that ensure a secure testing phase and a controlled public roll-out before full decentralization is achieved.
As the crypto community grapples with the centralization problem, the future remains uncertain yet hopeful. Innovators actively address these concerns, exploring novel architectures that balance efficiency with decentralization.
The road ahead involves iterative solutions and learning from the successes and pitfalls of existing L2 frameworks.
The conversation is dynamic, evolving alongside the blockchain landscape. The challenge is clear: to forge a path where scalability doesn’t compromise the decentralized ethos.
The community could collaboratively shape the future, steering toward solutions that align with the core principles of blockchain technology.
In the grand narrative of blockchain scaling, the centralization subplot is a critical chapter that will undoubtedly shape the destiny of decentralized networks. The question remains: can we scale without compromising the soul of crypto?