Explainer | Bandwidth and the Blockchain: How Developers Minimize Overhead
A blockchain records the entire network's activity on a single shared ledger. Each time a new block is mined, the ledger transitions to a new state that replaces the previous one. The consensus mechanism ensures that this state is recognized by most of the community, and in a well-designed system the incentive mechanism guarantees immutability: as long as you wait long enough, a state that has become history cannot be tampered with. Blockchains brought us programmable money and captured the imagination of many people.
Bitcoin and Ethereum are the two leaders of the blockchain world. Both ledger technologies are highly popular and have strong development momentum, yet people are generally pessimistic about their scalability. Why is that, and what can be done to improve it?
Background
The blockchain itself is also known as Layer 1. It acts as the global source of truth for ownership of all digital assets on the network, and full nodes track the current state of the ledger. To keep this state decentralized and permissionless, small miners must also be able to participate in validation and contribute to the blockchain, even though the system resources and bandwidth available to them are far less than those of large miners.
To remain trustless and censorship-resistant, a blockchain cannot be controlled by a single entity or small group. Developers want participants with very different levels of resources to be able to take part, avoiding concentration of power and single points of control. This is why Bitcoin's block size limit has not been raised: the cap restrains the growth of the system's resource requirements and allows more people to participate.
Note that parties can also transfer funds through local agreements without requiring global updates. This is what we usually call a Layer 2 scheme, or off-chain transactions, and it has a profound impact on scaling transaction capacity (though it does not necessarily expand the user base). The idea is that transactions between parties can be aggregated before being committed to the chain. In some designs, such a local protocol can stay open for months or longer between the time it is opened and the time its state is settled.
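The aggregation idea can be sketched with a toy example. The names and structure below are hypothetical and are not any real payment-channel protocol; the point is only that many local payments collapse into a single on-chain settlement of the final balances.

```python
# Toy sketch (hypothetical): two parties exchange many payments off-chain;
# only the final net balances are broadcast to the blockchain.

def settle_channel(deposits, payments):
    """deposits: {party: amount}; payments: list of (sender, receiver, amount).
    Returns the final balances, the only state that goes on-chain."""
    balances = dict(deposits)
    for sender, receiver, amount in payments:
        if balances[sender] < amount:
            raise ValueError("insufficient channel balance")
        balances[sender] -= amount
        balances[receiver] += amount
    return balances

# 1000 off-chain payments collapse into a single on-chain settlement.
payments = [("alice", "bob", 1)] * 600 + [("bob", "alice", 1)] * 400
final = settle_channel({"alice": 1000, "bob": 1000}, payments)
# final -> {"alice": 800, "bob": 1200}
```

However many payments flow back and forth, the blockchain only ever sees the opening deposits and the closing balances.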
Engineers are very optimistic about Layer 2 solutions, believing they can greatly improve blockchain scalability, which is decisive for whether cryptocurrencies can serve large user bases. But however impactful a Layer 2 solution may be, all it can do is multiply the throughput of the underlying blockchain; it cannot deliver infinite scalability. The focus is therefore on optimizing the network to minimize the resources required per transaction.
Problem
The decentralized Bitcoin blockchain is a globally shared broadcast medium – quite possibly the least efficient means of communication humans have ever designed.
— Greg Maxwell
The fundamental problem with a Layer 1 blockchain is that every copy across the network must be kept in sync with the ledger. Each full node stores a copy of the blockchain; every copy is identical and is independently verified by its node.
Each newly mined block must be broadcast to the other nodes. Latency is especially critical for mining nodes, because they need to start mining on top of the newest block immediately or their work is wasted. To validate the latest block, a miner needs to know which transactions it includes and must have a copy of all of them.
For now, Bitcoin's software uses a naive approach to broadcasting transactions and blocks. Nodes relay a transaction to their peers as soon as they receive it, and a newly created block is propagated in full regardless of whether the recipient already holds some of its data. For miners who need to follow new blocks in real time, such an inefficient process is hard to tolerate.
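The redundancy of this "flooding" relay is easy to see in a toy simulation. This is a minimal sketch of the general idea, not Bitcoin's actual wire protocol: every node forwards a message to all of its peers, so most nodes end up receiving the same data several times.

```python
# Minimal sketch of naive flooding relay (illustrative only).
from collections import deque

def flood(graph, origin):
    """graph: {node: [peers]}. Returns how many copies each node receives."""
    heard = {node: 0 for node in graph}
    seen = {origin}
    queue = deque([origin])
    while queue:
        node = queue.popleft()
        for peer in graph[node]:
            heard[peer] += 1          # peer receives (another) copy
            if peer not in seen:      # but forwards only the first one
                seen.add(peer)
                queue.append(peer)
    return heard

# Fully connected 4-node network: every node hears the same message 3 times.
g = {n: [p for p in "abcd" if p != n] for n in "abcd"}
counts = flood(g, "a")
# counts -> {"a": 3, "b": 3, "c": 3, "d": 3}
```

In a densely connected network, almost all of those copies after the first are wasted bandwidth, which is exactly the overhead the remedies below target.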
Remedy
Bitcoin's inefficiency is well known. Over the years, efforts to improve network efficiency have focused on reducing bandwidth costs and speeding up communication. Former Bitcoin maintainer Gavin Andresen published an O(1) block propagation roadmap in 2014. The roadmap has since been heavily revised, but the basic ideas have not changed.
- FIBRE network diagram (source: bitcoinfibre.org) -
Relay networks
Miners already actively use block relay networks such as FIBRE and Falcon. These relay networks use low-latency, high-bandwidth connections, but they have drawbacks, such as a high degree of centralization and the large amount of bandwidth consumed to minimize latency. They also do nothing to reduce the bandwidth required to run non-mining nodes, which must come down before engineers can scale the system.
Compact blocks
Transferring a block in full when the receiver already holds the transactions it contains is far from optimal. Cryptographer Greg Maxwell pointed out that most of the transactions in a newly mined block are already known to the recipient: they are very likely sitting in the recipient's mempool (the set of received but unconfirmed transactions). Maxwell studied the issue and drafted a proposal in December 2015.
Core developer Matt Corallo refined the proposal based on Greg's research and formally proposed BIP152 in early 2016, defining the concept of compact blocks. Under the compact block protocol, the sender no longer transmits the complete block data, but only the block header, shortened transaction IDs, and the transactions the receiver is known not to have. The change was designed to save bandwidth, but it also has the effect of reducing latency.
Erlay
Bitcoin's flooding-style broadcast is clearly not the best way to propagate transactions through the network. A node may receive multiple copies of the same transaction, and must announce each transaction to every node it is connected to. This consumes a great deal of bandwidth and creates unnecessary overhead. It also discourages nodes from establishing more connections, because every additional peer increases the bandwidth required per transaction.
The fewer connections a node maintains, the less bandwidth it consumes. But this carries a danger: users become more susceptible to eclipse attacks. Ethan Heilman, a founder of TumbleBit, published a research paper on eclipse attacks in 2015. The basic idea is related to the Sybil attack, in which an attacker impersonates many different entities to deceive a victim. In an eclipse attack, the attacker monopolizes all of the victim node's external connections. As the victim's only source of information, the attacker can forge the current state by crafting special blocks. Such an attack is expensive, but if the attacker can convince the victim that it has received a large amount of bitcoin while actually sending the money elsewhere, the victim may be fooled into believing the payment has arrived and willingly ship goods.
Erlay can achieve a better and more robust network. Its researchers found that, after optimization, a node connected to 32 peers uses 75% less bandwidth than it would with the current software.
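A back-of-envelope comparison shows why replacing per-link announcements with set reconciliation helps. All the numbers below are illustrative assumptions, not figures from the Erlay paper; the point is only the scaling behavior: flooding cost grows with the number of connections, while reconciliation cost tracks the difference between peers' transaction sets.

```python
# Back-of-envelope sketch (illustrative numbers, not measured values).
INV_BYTES = 36          # rough size of one transaction announcement

def flooding_cost(txs_per_sec, connections):
    # Flooding: each transaction is announced on roughly every link.
    return txs_per_sec * connections * INV_BYTES

def reconciliation_cost(txs_per_sec, diff_fraction=0.1, overhead=2):
    # Set reconciliation: cost scales with the symmetric difference between
    # peers' mempools (diff_fraction and overhead are assumed constants),
    # largely independent of how many peers the node maintains.
    return txs_per_sec * diff_fraction * INV_BYTES * overhead

flood_8 = flooding_cost(7, 8)     # 8 connections
flood_32 = flooding_cost(7, 32)   # 4x connections -> 4x announcement cost
recon_32 = reconciliation_cost(7) # roughly flat as connections grow
```

Under flooding, quadrupling a node's connections quadruples its announcement bandwidth; under reconciliation the cost stays roughly flat, which is why Erlay lets nodes maintain many more connections (improving eclipse resistance) without a bandwidth penalty.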
Where do we go from here?
With mining nodes using relay networks and ordinary nodes using the compact block protocol, block broadcasting can be greatly optimized. Researchers are still working to reduce overhead and streamline the process as much as possible. Only once developers judge bandwidth consumption low enough at current throughput will raising throughput come onto the agenda.
Further reading
- Greg Maxwell's 2017 presentation on improving block broadcasting
Original link: https://medium.com/scalar-capital/bandwidth-and-the-blockchain-2ad35c57dbdf ; author: Jordan Clifford; translation & proofreading: Min Min & A sword
This article was translated and republished by EthFans with the author's authorization.