2020 Focus Preview: How Can Public Chains Solve Scalability?
Written by: Wilson Withiam, research analyst at cryptocurrency data platform Messari
Translated by: Zhan Juan
Source: Chain News
This article is a paid report from the Messari Pro research platform; Chain News was authorized to translate and publish it.
Bitcoin is slow by design.
Blockchains were not necessarily built for speed.
Satoshi Nakamoto put security and decentralization first when designing Bitcoin, while acknowledging that the system would come under pressure when competing with centralized rivals such as mainstream credit card networks. In fact, Bitcoin's 10-minute block time and small blocks are precisely what allow it to maintain a global, leaderless network consensus: they reduce the frequency of unintentional ledger forks and keep node operating costs and storage requirements low.
Bitcoin is slow: over the past year it has processed fewer than 4 transactions per second (tps). Nor can Bitcoin easily expand its throughput, as the multi-year governance struggle over the "block size" made plain. That struggle culminated in 2017, when Bitcoin Cash became the network's first major hard fork. By keeping a hard cap on block size, Bitcoin has let the "digital gold" narrative dominate ever since.
Nowadays, most investors seem content with Bitcoin as a low-throughput system, valuing its strong settlement guarantees for high-value transactions more than its potential as a global payment network. The reason is obvious: the equation of exchange shows that as velocity (the rate at which funds change hands) increases, it puts downward pressure on the asset's price.
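For reference, a minimal formulation of the equation of exchange referred to above, in standard monetary notation (not reproduced from the original report):

```latex
% M = monetary base, V = velocity, P = price level of goods transacted,
% Q = quantity of goods transacted. Rearranging shows that, holding
% P and Q fixed, a higher velocity V supports a smaller monetary base M.
\[
  M \cdot V = P \cdot Q
  \qquad\Longrightarrow\qquad
  M = \frac{P \cdot Q}{V}
\]
```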
What about Ethereum?
Similar claims are now emerging around Ethereum. The self-styled "world computer" faces a difficult choice between focusing on high-value transaction settlement and serving decentralized applications that need high throughput. In practice, Ethereum is now mostly associated with decentralized finance (DeFi).
This transformation was probably inevitable. Ethereum became extremely popular in 2017, and its weak processing capacity (roughly 7.5 tps on average) could not keep up with the rapid growth in transaction volume. The persistent backlog of unconfirmed transactions drove fees sharply higher and further delayed settlement, making it difficult for Ethereum, as a platform, to meet the needs of games and exchanges.
The most famous example is the CryptoKitties crisis. This on-chain collectibles game became so popular after its launch in late 2017 that it quickly brought Ethereum to an embarrassing standstill. To cope with this, blockchain-based games and gaming applications later moved to other smart contract platforms with higher performance and friendlier fees, such as EOS or Tron. Even Dapper Labs, the studio behind CryptoKitties, raised $11 million to build a competing network aimed at removing the transaction bottlenecks that plagued Ethereum.
When Bitcoin ran into scaling problems, it evolved from a payment network into a settlement network.
Facing the same challenge, Ethereum has evolved from a world computer into DeFi.
The next generation of blockchain projects has not overlooked the obvious: high-throughput crypto applications still lack a home platform, which opens the door for new projects. But who can solve issues such as user experience and incentive alignment? There are two plausible near-term answers. One is convincing customers that unregulated financial and data services deserve more consideration than existing solutions (behavioral change); the other is providing an environment that beats existing systems on performance (technical progress).
Changing consumer behavior takes time and education to deliver results. In a world of instant gratification and limited attention, that approach is inefficient. As a result, most developers and investors have chosen to attack the technical problem and build scalable blockchain networks.
Some high-throughput options
If you analyze recent financing rounds, you will find that scalability is the preferred strategy for winning users. In our sample, investors have poured more than $2.4 billion into Ethereum alternatives that emphasize superior performance. Yet 75% of these "super chains" have not launched, and those that are live have not delivered on their performance promises.
Major layer 1 high-performance blockchain projects
The investment boom also extends to "layer 2 solutions." These projects offload some computation and data storage from the underlying network, improving performance in terms of transactions per second and node operating costs. Investors in layer 2 solutions are betting on the vertical expansion of blockchain networks rather than horizontal expansion.
Layer 2 scalability solutions
Despite the massive capital injection, it remains hard to name a truly scalable blockchain solution. The lingering question: with all this money invested in blockchain scaling research and development, why has no dominant solution emerged?
Larger scale, more questions
Scaling a network that runs on decentralized infrastructure and a community of node operators is very difficult, because decentralization, security, and scalability are deeply entangled. This is sometimes called the scalability trilemma (the "impossible triangle"): a blockchain system cannot have all three characteristics at the same time.
All existing network projects make trade-offs. Bitcoin and Ethereum chose to prioritize decentralization (low node operating costs) and security (high attack costs) over scalability (high transactions per second). Alternative projects, such as Binance Chain, optimize performance by restricting the number of participating nodes and sacrificing decentralization. Others have chosen to partially sacrifice network security.
Facing the scalability trilemma, different projects make different choices
Since computing performance is both a driver and a leading indicator of adoption, any serious attempt to solve this trilemma reveals which strategies are likely to scale a blockchain network. Many past projects overlooked how the parameters that affect a network's ability to scale relate to one another. Beyond transactions per second, the parameters worth considering include the number of validators, transaction cost, and time to finality.
Early attempts at scalability were mostly clumsy, tinkering with one parameter while ignoring its effect on the others.
Bitcoin Cash and its BSV fork changed Bitcoin's code to increase the block size so that each block can hold more transactions. The tps gain comes at the expense of decentralization, because the "upgrade" raises data storage requirements for node operators. Both Bitcoin Cash and BSV advertise low transaction fees, but low fees also reduce miner revenue, posing a long-term threat to the security of the network.
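As a rough illustration of that trade-off, here is a back-of-envelope sketch in Python. The average transaction size and the block sizes are assumed values for illustration, not figures from the report: throughput scales linearly with block size, and so does the storage burden on every full node.

```python
# Back-of-envelope throughput estimate: tps rises linearly with block
# size, and so does the amount of chain data every node must store.
AVG_TX_SIZE_BYTES = 400        # assumed average transaction size
BLOCK_TIME_SECONDS = 600       # Bitcoin-style 10-minute blocks
SECONDS_PER_YEAR = 365 * 24 * 3600

def estimate(block_size_mb: float):
    """Return (approx. tps, approx. chain growth in GB per year)."""
    txs_per_block = block_size_mb * 1_000_000 / AVG_TX_SIZE_BYTES
    tps = txs_per_block / BLOCK_TIME_SECONDS
    blocks_per_year = SECONDS_PER_YEAR / BLOCK_TIME_SECONDS
    growth_gb_per_year = block_size_mb * blocks_per_year / 1000
    return tps, growth_gb_per_year

for size_mb in (1, 32, 128):   # roughly BTC-like, BCH-like, BSV-like blocks
    tps, growth = estimate(size_mb)
    print(f"{size_mb:>4} MB blocks -> ~{tps:,.0f} tps, ~{growth:,.0f} GB/year of chain data")
```

The 1 MB case lands near the ~4 tps figure cited earlier; the larger blocks buy throughput at the cost of hundreds or thousands of gigabytes of annual storage per node.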
Another way to control operator costs is to strictly limit the number of validator nodes. This approach is common in delegated proof-of-stake (DPoS) networks such as EOS and Tron, where a selected group of nodes holds all the voting power. Because the validator set is small, the network can agree on transaction ordering and verification quickly, but the system gives up decentralization and censorship resistance.
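A minimal sketch of the DPoS pattern described above, with made-up voters, stakes, and a tiny validator count for illustration (real networks such as EOS or Tron use larger sets and far more elaborate election rules):

```python
# Toy DPoS election: stake-weighted votes pick a small, fixed validator
# set that then produces blocks in round-robin. Agreement is fast, but
# block production is concentrated in a handful of operators.
from collections import Counter
from itertools import cycle, islice

VALIDATOR_SLOTS = 3                       # illustrative; real networks use more

votes = {                                 # voter -> (candidate, stake)
    "v1": ("alice", 500), "v2": ("bob", 300), "v3": ("carol", 250),
    "v4": ("dave", 100),  "v5": ("alice", 400), "v6": ("erin", 50),
}

tally = Counter()
for candidate, stake in votes.values():
    tally[candidate] += stake

validators = [name for name, _ in tally.most_common(VALIDATOR_SLOTS)]
print("elected validators:", validators)

# Block production rotates through the elected set only.
print("next 9 block producers:", list(islice(cycle(validators), 9)))
```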
Looking for more promising solutions
More promising layer 1 solutions recognize that the scalability parameters listed above need to be expanded together. These approaches involve greater technical complexity and therefore require longer research and development. The experiments include sharding, the separation of time and state, and interoperability.
Projects such as Ethereum 2.0, Zilliqa, and NEAR are pursuing sharding to address scalability. The sharding model divides the network into groups called shards, and nodes only verify the transactions on their own shard. This "divide and conquer" approach lets each part of the chain process its portion of network transactions in parallel, improving performance.
Each node is also responsible for only a small portion of the network's total transactions, which keeps operator costs down and preserves decentralization. Despite these benefits, sharding is an extremely hard engineering problem, and whether it can keep running under heavy load remains an open question. Zilliqa's mainnet claims to run a live sharded architecture, but the real test of sharding will come during the final phase of Ethereum's Serenity rollout.
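For intuition, a minimal sketch of the "divide and conquer" idea, assuming a toy network that routes transactions to shards by the sender's address (illustrative only; real sharding designs must also handle cross-shard communication, data availability, and validator assignment):

```python
# Toy sharding router: each shard's validators process only the
# transactions routed to that shard, so shards can execute in parallel.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

NUM_SHARDS = 4

def shard_of(address: str) -> int:
    """Deterministically map a hex account address to a shard."""
    return int(address, 16) % NUM_SHARDS

def process_shard(shard_id, txs):
    # A real validator would execute and verify only these transactions.
    return f"shard {shard_id}: verified {len(txs)} txs"

txs = [{"from": f"0x{i:040x}", "value": i} for i in range(1000)]

buckets = defaultdict(list)
for tx in txs:
    buckets[shard_of(tx["from"])].append(tx)

# Each shard's batch is independent, so the work can run in parallel.
with ThreadPoolExecutor(max_workers=NUM_SHARDS) as pool:
    for line in pool.map(lambda item: process_shard(*item), buckets.items()):
        print(line)
```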
Solana, by contrast, abandoned sharding in favor of separating time from blockchain state. Its network embeds a mechanism called Proof of History (PoH) that keeps transaction ordering synchronized in real time. Because PoH removes the need to reach consensus on transaction ordering, nodes can process transactions as soon as they arrive, eliminating the associated delay. Like sharding, however, PoH has no precedent and is still in development. Solana's team has also found it difficult to reach its expected tps ceiling (roughly 40,000-50,000) in internal testing, and as Solana moves from an internal test environment to a public testnet, it will face even greater challenges.
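To make the "separation of time" idea concrete, here is a heavily simplified PoH-style sketch (an illustration of the sequential-hash-chain concept, not Solana's implementation): each tick hashes the previous output, and mixing an event into the chain anchors it at an unforgeable position in the ordering.

```python
# Simplified Proof-of-History-style sequence: a chain of SHA-256 hashes
# acts as a verifiable clock; events mixed into the chain receive an
# unforgeable position in the ordering that anyone can replay and check.
import hashlib

def tick(state: bytes) -> bytes:
    """Advance the 'clock' by one sequential hash."""
    return hashlib.sha256(state).digest()

def record(state: bytes, event: bytes) -> bytes:
    """Mix an event into the sequence, anchoring it at this position."""
    return hashlib.sha256(state + event).digest()

state = b"genesis"
log = []                                  # (index, event or None, state)
for i in range(1, 6):
    state = tick(state)                   # empty ticks just prove time passed
    log.append((i, None, state))

state = record(state, b"tx: alice -> bob, 5")
log.append((6, b"tx: alice -> bob, 5", state))

# Any observer can replay the hashes to verify the order of events.
for index, event, digest in log:
    print(index, event, digest.hex()[:16])
```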
Another frontier in scalability is interoperability. Acting as intermediaries, interoperability solutions move information between different layer 1 networks. Such cross-network connections also let chains use one another and outsource performance or security requirements. In a sense, today's Ethereum could hand off some processing to a higher-performance chain and scale through "parallel computing," while the network receiving those jobs benefits from Ethereum's security, token liquidity, and user base. Cross-chain collaboration is an exciting idea, and projects like Cosmos and Polkadot are pursuing it, but whether interoperability can actually solve the scalability problem remains unproven.
All of these attempts to scale the base layer involve extremely complex engineering. The difficulty lies in coordinating the many parameters of a blockchain design (security, decentralization, transaction cost, and so on) at large scale. This daunting challenge has pushed some projects toward more practical alternatives.
Finding answers to the above questions
Layer 2 solutions sit on top of a blockchain network, usually connected to the underlying platform through a two-way peg or a dedicated smart contract. These two-way channels give the base layer an outlet to offload some computational responsibility and relieve transaction congestion. Think of these layer 2 networks as pressure-relief valves for high-demand, low-throughput blockchains.
The layer 1 / layer 2 relationship is mutually reinforcing: the base layer gains performance, while most layer 2 solutions rely on the underlying network for security and dispute resolution. This dynamic allows greater flexibility in layer 2 design, letting teams trade security (and sometimes trust minimization) for scalability without sacrificing the decentralization of the base layer.
The Lightning Network is the most prominent example: a payment channel system built on the Bitcoin blockchain that lets users transact in a low-cost, high-throughput environment. Users can periodically record Lightning Network transactions on the Bitcoin network for settlement instead of submitting every transaction on-chain, which reduces Bitcoin's overall workload. Although the Lightning Network and similar designs (such as state channels) are exciting in theory, their scope is relatively narrow: payment and state channels contribute little to smart contract execution or more complex operations. As a result, some projects have begun exploring other layer 2 approaches, namely sidechains and rollups.
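A minimal sketch of the payment-channel idea (an illustrative toy, not the Lightning protocol, which also involves funding transactions, signed commitment transactions, and penalty mechanisms): the two parties update balances off-chain many times, and only the final state needs to be settled on-chain.

```python
# Toy payment channel: many off-chain balance updates, one on-chain
# settlement. Only the opening deposit and the closing balances ever
# need to touch the base layer.
class PaymentChannel:
    def __init__(self, deposit_a: int, deposit_b: int):
        self.balances = {"alice": deposit_a, "bob": deposit_b}
        self.offchain_updates = 0

    def pay(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        self.balances[sender] -= amount
        self.balances[receiver] += amount
        self.offchain_updates += 1        # signed off-chain, never broadcast

    def settle(self) -> dict:
        # In practice the final state is broadcast as a single on-chain tx.
        return dict(self.balances)

channel = PaymentChannel(deposit_a=1000, deposit_b=1000)
for _ in range(500):                      # 1000 micro-payments, zero on-chain txs
    channel.pay("alice", "bob", 2)
    channel.pay("bob", "alice", 1)
print(channel.offchain_updates, "off-chain updates ->", channel.settle())
```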
A sidechain is an independent network, usually with its own consensus layer, connected to a base layer protocol through a two-way peg. Because it carries none of the layer 1 design burden, a sidechain can support features beyond the capabilities of its base layer, including but not limited to scalability and interoperability, without relying on layer 1 storage. Despite these purported benefits, such branch networks demand more trust and coordination among participants, which is a harder sell for believers in the crypto industry.
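A minimal sketch of the two-way peg mentioned above, showing only the lock-and-mint accounting (a simplified illustration; real pegs rely on SPV proofs, federations, or validator sets to authorize each step, which is exactly where the extra trust requirement comes in):

```python
# Toy two-way peg: tokens are locked on the base layer and an equal
# amount is minted on the sidechain; burning on the sidechain releases
# the locked tokens back on the base layer.
class TwoWayPeg:
    def __init__(self):
        self.locked_on_base = 0        # held by the peg contract on layer 1
        self.minted_on_side = 0        # circulating supply on the sidechain

    def deposit(self, amount: int) -> None:
        """Lock on the base layer, mint on the sidechain."""
        self.locked_on_base += amount
        self.minted_on_side += amount

    def withdraw(self, amount: int) -> None:
        """Burn on the sidechain, release on the base layer."""
        if amount > self.minted_on_side:
            raise ValueError("cannot burn more than was minted")
        self.minted_on_side -= amount
        self.locked_on_base -= amount

peg = TwoWayPeg()
peg.deposit(50)
peg.withdraw(20)
print(peg.locked_on_base, peg.minted_on_side)   # 30 30 -- the peg stays balanced
```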
This trust requirement may explain why sidechains see little demand. Loom Network is trying to force adoption by connecting to multiple layer 1 networks (currently Ethereum and Tron, with Binance Chain coming soon), hoping that one of these efforts strikes gold. Even the more seasoned sidechain projects, such as POA Network's xDai chain, rank far below base layer applications in activity and daily active users. A more convincing argument for sidechains is that if market demand for interoperability surges, it will push users toward the layer 2 solutions that are already live, since layer 1 interoperability protocols are still under development.
Rollups, on the other hand, have attracted broad attention from crypto elites, including Vitalik Buterin. A rollup is a smart contract protocol on Ethereum that manages cross-layer transfers between a sidechain-like network and the layer 1 network. Unlike the two-way peg used by most sidechains, a rollup contract packages the state changes of thousands of transactions into a single block before publishing it to the base layer. This provides a more economical alternative to on-chain data storage and computation.
Rollups come in two flavors: ZK rollups and optimistic rollups. At a high level, a ZK rollup uses zk-SNARK cryptographic magic to automatically verify every new block of transactions. An optimistic rollup bypasses that mathematical roadblock (SNARKs are costly and difficult to implement); instead, operators and users can inspect the published state and roll back a block if it proves invalid. Both approaches are still early in development, and two kinds of challenges must be overcome before they become viable options: one is security (a rollup is built around a contract, which makes a very attractive honeypot for attackers), and the other is implementation, particularly the time required to reach finality.
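A minimal sketch of the batching idea common to both rollup flavors (illustrative only; it omits the proofs themselves, namely validity proofs for ZK rollups and fraud proofs with challenge periods for optimistic rollups, as well as the compressed transaction data posted as calldata): many transactions execute off-chain, and only a compact commitment to the resulting state lands on the base layer.

```python
# Toy rollup batching: execute a batch of transfers off-chain, then
# publish one compact commitment (transaction count + state root)
# instead of one on-chain transaction per transfer.
import hashlib
import json

def state_root(balances: dict) -> str:
    """Stand-in for a Merkle root over the rollup state."""
    payload = json.dumps(balances, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def execute_batch(balances: dict, txs: list) -> dict:
    new = dict(balances)
    for tx in txs:
        if new.get(tx["from"], 0) >= tx["amount"]:
            new[tx["from"]] -= tx["amount"]
            new[tx["to"]] = new.get(tx["to"], 0) + tx["amount"]
    return new

balances = {"alice": 2000, "bob": 0}
txs = [{"from": "alice", "to": "bob", "amount": 1} for _ in range(1000)]

balances = execute_batch(balances, txs)
commitment = {"tx_count": len(txs), "new_state_root": state_root(balances)}
print(commitment)   # in this toy, the only data that must land on layer 1
```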
The grim reality facing scalability solutions
It's still too early.
Each of the new technologies described above needs time to be developed, debugged, and improved. The grim reality is that the best technology for blockchain scalability may not have the last laugh. As Dan Zuller put it, "There are also social and economic factors that determine who is the ultimate winner(s)."
This view clearly favors networks like Bitcoin and Ethereum, which hold obvious advantages in utility, transaction volume, and developer and stakeholder communities. The network effects behind these factors will continue to accumulate value and demand, fostering more robust and mature blockchains. Scaling solutions built on the more popular base layers will therefore be easier to adopt and will find greater demand.
The remaining question: where is Ethereum headed? If the uphill battle to scale the network keeps getting delayed, will users and projects migrate en masse to competing chains? In the short term, probably not. Ethereum has built a deep enough moat through DeFi and its diverse stakeholder community to avoid losing significant ground. The emergence of layer 2 solutions, especially rollup contracts, can relieve short-term performance concerns and provide enough of a buffer until the Ethereum 2.0 mainnet is ready.
"High-performance" blockchain projects have not been adopted, which means that scalability is not a decisive feature to attract users. It's just a marketing term, a valid term for raising valuations for pre-product period financing. But it does bring added value to mainstream blockchains; scalability solutions expand the capabilities of existing verticals (such as DeFi), enable applications to address a wider range of use cases, and attract a wider user base.