How to understand ZK Rollup, the layer 2 solution with on-chain data availability?
Author's note: ZK Rollup is not a new proposal. It was put forward by Barry Whitehat about a year ago, and around the same time Vitalik published a fairly complete write-up on the Ethereum Research forum; it is now being developed by Matter Labs. After studying zk-SNARKs, I only found time to look at it recently. Besides ZK Rollup, this post also takes a quick look at Optimistic Rollup, recently introduced by Plasma Group.
When ZK Rollup was first proposed, it was defined as a layer 2 solution. At the beginning of the year it was published under the name Plasma Ignis, presumably because Plasma was very popular last year, with new proposals and steady progress, and this was also positioned as a layer 2 solution, so the developers borrowed the Plasma name. But the technique is completely different from the spirit of Plasma, and after protests from the community it was renamed back to Rollup (see the developer's statement), so searching for 'Plasma Ignis' now turns up nothing. Recently Rollup has even been called a semi-layer 2 solution: a little bit layer 2, but not quite layer 2… XD
A one-line explanation of ZK Rollup: it is the layer 2 solution that keeps its data on chain.
Before diving into ZK Rollup, let me explain what is problematic about the original layer 2 designs. Take Plasma as an example: for fairness, the Plasma chain only puts the hash of each Plasma block on the Ethereum main chain (see the Plasma articles for more background). In other words, hundreds or thousands of transactions are executed off chain, and only a few dozen bytes are finally committed on chain. This is the whole spirit of off-chain trading, but it is also the most troublesome part of the design: data availability.
Because the data is not on chain, an extra set of game rules is needed when someone wants to exit. In Plasma this is the challenge period: since the chain holds no data, exits rely on evidence provided by the side-chain participants, and anyone holding the data can challenge an invalid exit. As a result everyone has to keep a certain amount of data, whereas interacting with the main chain only requires installing a wallet, with no need to download block data, so the user experience is very different. Another problem with the challenge period is that users have to stay online; if they miss the challenge window, a fraudulent transaction goes through by default (because Plasma relies on fraud proofs rather than validity proofs). In short, the data availability problem leads to two requirements:
- Users need to be online frequently
- Users need to download a certain amount of data
both of which make the user experience much worse (of course, Plasma designs have improved a lot since then).
How do you put the data on chain without making it too large?
First, the overall architecture. As in Plasma, there is a smart contract that guarantees fairness, and a relayer (called the operator in Plasma) that helps send transactions to the smart contract. Besides sending the batched transactions, the relayer must also generate a SNARK proof and submit it along with them for verification.
The smart contract can be thought of as an ERC20 contract: each participant's account is recorded in the contract. The difference is that a standard ERC20 transfer is verified by the Ethereum protocol itself and therefore cannot be batched (it is just a standard Ethereum transaction), whereas in Rollup many transfers are packed into one standard transaction. To the Ethereum system it is a single transaction, and the validity of the packed transfers is verified by the smart contract.
Inside the smart contract, two Merkle trees are used for bookkeeping. One tree records addresses, so an address can be represented just by its index in the tree (an unregistered index holds the value 0), shrinking an address from the original 20 bytes down to only 3 bytes. The other tree records each account's balance and nonce.
– The address Merkle tree –
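To make the bookkeeping concrete, here is a minimal Python sketch of the two-tree idea. It is only an illustration under my own assumptions (SHA-256, plain lists as leaves, made-up encodings); the actual contract uses its own tree layout and hash function.

```python
import hashlib

def merkle_root(leaves: list[bytes]) -> bytes:
    """Tiny illustrative Merkle root; not the hash used in the real circuit."""
    layer = [hashlib.sha256(leaf).digest() for leaf in leaves]
    while len(layer) > 1:
        if len(layer) % 2:          # duplicate last node if the layer is odd
            layer.append(layer[-1])
        layer = [hashlib.sha256(layer[i] + layer[i + 1]).digest()
                 for i in range(0, len(layer), 2)]
    return layer[0]

# Tree 1: index -> address (a 3-byte index replaces the 20-byte address).
addresses = [bytes(20), bytes(20)]       # unregistered slots stay zero
addresses[1] = bytes.fromhex("ab" * 20)  # register an address at index 1

# Tree 2: index -> (balance, nonce) for each account.
accounts = [(0, 0), (1_000_000, 0)]      # balance in 10^-6 units, nonce

address_root = merkle_root(addresses)
balance_root = merkle_root([bal.to_bytes(8, "big") + nonce.to_bytes(4, "big")
                            for bal, nonce in accounts])
```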
Below is the transaction data format (this is from the initial proposal; later implementations make each transaction even smaller):
Because an index is used to represent an address, only 3 bytes are needed (2^24 addresses), and the Value field is denominated in units of 10^-6, so a transaction can be represented in only 15 bytes. Storing such a transaction costs only 892 gas (the Value field is 6 bytes, but the article assumes most transactions will only use 4 of them, so the calculation is 13 bytes * 68 + 2 bytes * 4 = 892). A normal ether transfer needs 21K gas, so the throughput can be increased dramatically (hence the title of Vitalik's article, "On-chain scaling to potentially ~500 tx/sec through mass tx validation").
– https://vitalik.ca/general/2019/08/28/hybrid_layer_2.html –
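For reference, a quick Python check of the calldata arithmetic above, using the pre-Istanbul costs of 68 gas per non-zero calldata byte and 4 gas per zero byte (the 13 / 2 byte split is the article's assumption):

```python
# Calldata cost of one packed Rollup transfer (pre-Istanbul gas schedule).
GAS_PER_NONZERO_BYTE = 68
GAS_PER_ZERO_BYTE = 4

nonzero_bytes = 13   # the article assumes 13 of the 15 bytes are non-zero
zero_bytes = 2       # the remaining 2 value bytes are zero for most transfers

calldata_gas = nonzero_bytes * GAS_PER_NONZERO_BYTE + zero_bytes * GAS_PER_ZERO_BYTE
print(calldata_gas)              # 892 gas per packed transfer
print(21_000 / calldata_gas)     # ~23x cheaper than a plain 21K-gas transfer
```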
Why does the throughput go up? This is also a good opportunity to understand where transaction throughput comes from.
The current gas limit of an Ethereum block is about 8M and the block time is about 15 seconds, so for plain ether transfers the throughput is roughly
8M / 21K / 15 ~= 25 tps
So the current throughput bottleneck is really a gas problem. Lowering the gas cost of a transfer or raising the block gas limit would help (but would also cause bloat). ZK Rollup instead increases throughput by shrinking the amount of data per transaction. Let's see how fast it can go with ZK Rollup:
(8M - 600K (zk-SNARK verification) - 50K (other estimated contract gas)) / 892 / 15 ~= 550 tps
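As a quick sanity check, a minimal Python sketch reproducing both back-of-the-envelope numbers above (the 8M gas limit, 15-second block time, and 600K / 50K overheads are the article's assumptions):

```python
# Back-of-the-envelope throughput estimates using the article's assumptions.
BLOCK_GAS_LIMIT = 8_000_000   # ~8M gas per block
BLOCK_TIME = 15               # ~15 seconds per block

# Plain ether transfers: 21K gas each.
plain_tps = BLOCK_GAS_LIMIT / 21_000 / BLOCK_TIME
print(f"plain transfers: ~{plain_tps:.0f} tps")   # ~25 tps

# ZK Rollup: per-batch overhead for SNARK verification and contract logic,
# then 892 gas of calldata per packed transfer.
usable_gas = BLOCK_GAS_LIMIT - 600_000 - 50_000
rollup_tps = usable_gas / 892 / BLOCK_TIME
print(f"ZK Rollup: ~{rollup_tps:.0f} tps")        # ~550 tps
```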
This ~550 tps figure is where the title of Vitalik's article, "On-chain scaling to potentially ~500 tx/sec", comes from. In practice it is not quite that ideal: in Barry's implementation it is only about 268 tps, because every asset update emits an event, which costs extra gas. On the other hand, that design is friendlier for applications.
With the data on chain and verified by zk-SNARKs, everything posted on chain is already proven valid, so the layer 2 problems of challenge games and data downloads disappear. It also means relayers do not need to be trusted: they cannot misbehave, and at worst they can refuse to include your transaction.
But things are not quite that rosy…
Many people treat zk-SNARKs as a panacea, as if they solve everything, but they are not that perfect. Besides the trusted setup (editor's note: the initial parameters that must be generated in a trusted way), the biggest problem is that proving requires a lot of computing power. In the figures Barry provides, a relayer machine with 8 GB of RAM plus a 20 GB swap on disk can only produce about 20 tx/sec, far short of the expected 500+ or even 200+ tps. So the biggest open question for this solution is how to handle the proving cost.
Parallel operation!
Matter Labs uses a multi-relayer model plus parallel proving. The multi-relayer model looks a lot like a small blockchain: it uses DPoS (Delegated Proof of Stake) to randomly pick block producers, and the selected producers collect transactions, generate proofs, and publish them on chain. This avoids centralization: if one relayer is maliciously attacked, the network keeps running. It also paves the way for parallel computation. Because generating a zero-knowledge proof is very time-consuming, Matter Labs proposes a two-stage commit-then-verify flow on top of the multi-relayer model: a relayer first puts the transaction data on chain, and the proof is uploaded and verified in a later stage, so proofs can be generated in parallel (as shown below). Together with some data optimizations, their test results reach about 1600 tps.
-https://medium.com/matter-labs/introducing-matter-testnet-502fab5a6f17-
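To make the two-stage flow concrete, here is a minimal, hypothetical Python sketch of the idea; the names commit_block and verify_block are my own, not Matter Labs' actual interface. Data goes on chain first, and a block only becomes final once its proof is verified later.

```python
from dataclasses import dataclass, field

@dataclass
class RollupBlock:
    tx_data: bytes          # packed transfers posted on chain in stage 1
    verified: bool = False  # becomes True once the SNARK proof is checked

def snark_verify(proof: bytes, public_input: bytes) -> bool:
    # Placeholder for the on-chain zk-SNARK verifier.
    return True

@dataclass
class RollupContract:
    blocks: list = field(default_factory=list)

    def commit_block(self, tx_data: bytes) -> int:
        """Stage 1: a relayer posts the raw transaction data (cheap and fast)."""
        self.blocks.append(RollupBlock(tx_data))
        return len(self.blocks) - 1

    def verify_block(self, index: int, proof: bytes) -> None:
        """Stage 2: the proof arrives later; only then is the block final."""
        if not snark_verify(proof, self.blocks[index].tx_data):
            raise ValueError("invalid proof, block rejected")
        self.blocks[index].verified = True
```

Because stage 2 is decoupled from stage 1, several relayers can be proving different committed blocks at the same time, which is where the parallel speed-up comes from.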
Delay…
It sounds wonderful, but because a transaction is now split into two stages, several blocks pass between the moment it is sent and the moment it is verified, which is longer than simply posting everything at once. How much delay users will accept is unknown for now. It is a trade-off: fees go down and throughput goes up, but latency increases. We will only find out once it goes live.
At the beginning of this year, Vitalik shared an advanced version of ZK Rollup, ZK ZK Rollup, at an offline meetup in Taipei. Those who are interested can refer to this article, which records the talk in great detail.
Plasma & Optimistic Rollup
The design of Optimistic Rollup is closely related to Plasma, so here I only briefly describe how it differs.
Karl (see the note below) proposed Optimistic Rollup last month, building on the design of ZK Rollup. Conceptually it also puts the data on chain, but it does not use zk-SNARKs for verification, because the goal is to support more general applications. The main difference in the data format is that the from field is replaced by the user's signature (65 bytes). Because the data is larger, it predictably costs more gas, so throughput is lower than ZK Rollup. And since zk-SNARKs are not used for verification, a verification game is needed to validate the data; I will not go into the details here, and may write a detailed introduction to Plasma / Optimistic Rollup another time.
In rough estimates, the throughput is about 100 tps, and switching the signature scheme to BLS raises it to about 450 tps. After the hard fork in October, calldata gas costs will drop and the estimated throughput reaches 400/2000 tps. (Wish: I hope someone will write up the details of the October hard fork XD)
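A rough Python sketch of why the October fork helps: EIP-2028 cuts the calldata cost of non-zero bytes from 68 to 16 gas, so a calldata-dominated rollup's throughput scales roughly by that ratio. The 100 and 450 tps baselines are the article's estimates; treating cost as pure calldata is my simplifying assumption.

```python
# Rough scaling of calldata-bound throughput after the calldata repricing
# (EIP-2028: 68 -> 16 gas per non-zero calldata byte).
OLD_NONZERO_GAS = 68
NEW_NONZERO_GAS = 16
scale = OLD_NONZERO_GAS / NEW_NONZERO_GAS   # ~4.25x

for name, baseline_tps in [("ECDSA signatures", 100), ("BLS signatures", 450)]:
    # Assumes throughput is dominated by calldata cost, so it scales linearly.
    print(f"{name}: ~{baseline_tps} tps -> ~{baseline_tps * scale:.0f} tps")
# Roughly matches the article's 400 / 2000 tps estimates after the fork.
```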
Note: Chinese media articles describe Karl as one of Casper's core researchers, but from the very beginning I have known him as someone vigorously promoting Plasma; his blog and Twitter are all about Plasma. I am not sure of his exact role at Plasma Group, but I think of him as the leader of Plasma Group.
If there are any errors in this article, or if you hold a different view, please let me know.
References:
On-chain scaling to potentially ~500 tx/sec through mass tx validation
Introducing Matter Testnet
Optimistic Rollup
(This article is from EthFans; reprinting without the author's permission is strictly forbidden.)