BitTorrent creator disputes "Vitalik reviews 16 issues in the crypto industry"
Compiled by: Sharing Finance Neo
On November 22nd, Ethereum founder Vitalik Buterin published a post entitled "Hard Problems in Cryptocurrency: Five Years Later".
The post revisits a review of the cryptocurrency ecosystem that Vitalik published five years ago. The original list contained 16 problems, including scalability, timestamping, arbitrary proof of computation, code obfuscation, ASIC resistance, PoS, proof of storage, stable-value assets, reputation systems, Sybil resistance, decentralized success metrics, and more.
Five years on, Vitalik returns to the old list, examining what progress the blockchain space has made and where it still falls short, and assessing the current state of each of the 16 problems. He also puts forward a new list of challenges for 2019.
Vitalik's article has attracted widespread attention in the community. For the technical geeks who follow blockchain, the underlying technology of the crypto world matters far more than the market. However, some people have found it hard to agree with Vitalik's opinions.
Yesterday, Bram Cohen, the creator of the BitTorrent peer-to-peer protocol, responded to Vitalik's views on the current state of cryptocurrency technology, saying, "Many of Vitalik's views on the so-called 'cryptocurrency puzzles' are wrong."
He believes sharding is not the only option for scaling blockchains. Bram Cohen stated, "Ethereum's proposals for sharding break this point further. They essentially require miners to hold all shards... but that is no longer sharding; it merely further redefines what a full node is." In addition, Bram Cohen also commented on timestamping, arbitrary proof of computation, code obfuscation, and more.
The following are Vitalik's views and Bram Cohen's related comments:
Cryptographic problems
1. Blockchain scalability
Regarding the scalability problem the crypto field has long faced, Vitalik believes scalability is a technical problem on which great theoretical progress has been made.
Vitalik went on to discuss sharding, the scaling approach Ethereum has chosen, and praised the technology. Five years ago, he said, few people were thinking about sharding; now, sharding designs are commonplace. Besides Ethereum 2.0, there are OmniLedger, LazyLedger, Zilliqa, and research papers that seem to come out every month. In his view, further progress here is incremental. Fundamentally, we already have a number of techniques that allow groups of validators to securely reach consensus on more data than any single validator can process, and that allow clients to indirectly verify the full validity and availability of blocks even under 51% attack conditions.
Vitalik called random sampling, fraud proofs, proofs of custody, and data availability proofs "probably the most important" of these technologies.
On "blockchain scalability," Bram Cohen responded that he (Vitalik) talks about sharding as the only option for on-chain scaling. It is of course not the only option: payment channel networks are in many ways more compelling and are becoming a reality.
This is a common take, mainly because, frankly, more people understand (or think they understand) sharding than payment channels, and people like to think they are smart, so they advocate the solutions they understand.
The truth about sharding is that it fundamentally weakens the security of the system, because you trust a subset of all peers to guarantee the integrity of each shard, in exchange for a very small improvement in scalability. How small? Well…
When you run the numbers, a factor of 3 is perfectly feasible, a factor of 10 is unreasonable, and a factor of 100 is a complete joke. Even in the best case, the complexity buys very little benefit.
Ethereum's proposals for sharding break this point further. They essentially require miners to hold all shards. That is… not sharding; it just further redefines the "full node" so that fewer nodes qualify than before.
Value cannot automatically move between shards, which may be inevitable given the messiness of the EVM's semantics, but it creates a huge incompatibility in exchange for a very small benefit.
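(Note: Bram Cohen's "trusting a subset of all peers" argument can be made concrete with a small illustrative calculation. The sketch below is ours, not either author's, and the committee sizes and dishonesty rate p are hypothetical: it computes the probability that a randomly sampled shard committee ends up with a dishonest majority.)

```python
# Illustrative sketch (hypothetical parameters): probability that a
# randomly sampled shard committee has a dishonest majority, assuming
# each sampled validator is independently dishonest with probability p.
from math import comb

def committee_failure_prob(n, p):
    """P(strictly more than half of an n-member committee is dishonest)."""
    majority = n // 2 + 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(majority, n + 1))

for n in (16, 64, 256):
    print(n, committee_failure_prob(n, p=1/3))
```

Larger committees drive the failure probability down exponentially, which is why sampled-committee designs lean on large validator sets; the cost Bram Cohen points to is that each shard is still guarded by only that subset.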
2. Timestamping
Vitalik rates the state of timestamping as "some progress."
Vitalik said that the recent "network-adjusted timestamps" proposal attempts to improve the status quo by allowing clients to reach consensus on time without needing to know the current time locally with high accuracy; however, this has not been tested. In general, timestamping is not at the forefront of current research challenges. Perhaps once PoS chains (including Ethereum 2.0 and others) come online as real-time systems, we will see how important the problem really is.
The timestamping problem assumes that the clocks of all legitimate users follow a normal distribution around some "real" time with a standard deviation of 20 seconds, and that no two nodes differ by more than 20 seconds. The solution is allowed to rely on an existing notion of "N nodes"; in practice, this would be enforced via proof of stake or non-Sybil tokens. The system should continuously provide a time that is within 120 seconds (or less if possible) of the internal clocks of more than 99% of honestly participating nodes. External systems may end up relying on this system; hence, regardless of incentives, it should remain secure against attackers controlling no more than 25% of the nodes.
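(Note: the sketch below is our illustration of one simple approach in this family, similar in spirit to Bitcoin's network-adjusted time, not the specific proposal Vitalik references: a client offsets its local clock by the median of the clock offsets reported by sampled peers, so up to half of the sampled peers can lie without pulling the result far away.)

```python
# Minimal sketch of a network-adjusted clock: offset the local clock by
# the median difference from sampled peers' reported times. The peer
# clock values here are made up for illustration.
import time
from statistics import median

def network_adjusted_time(peer_times, local_time):
    offsets = [t - local_time for t in peer_times]
    return local_time + median(offsets)

now = time.time()
peer_clocks = [now - 8, now + 3, now + 15, now - 2, now + 40]
print(network_adjusted_time(peer_clocks, now))
```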
On "timestamping," Bram Cohen's question is… does it work? Claiming accuracy to within 20 seconds is oddly specific. Bitcoin can already provide this, based on block height or on block timestamps.
With proofs of space and time, you also have the option of keying off the VDF, which in some cases is much more accurate.
3. Arbitrary proof of computation
Vitalik said that arbitrary proof of computation has seen significant theoretical and practical progress. This is basically what SNARKs are (or STARKs, SHARKs, and the other names). SNARKs are now understood by more and more people and are even used in multiple blockchains (including tornado.cash on Ethereum). SNARKs are very useful, both as a privacy technology and as a scalability technology.
However, arbitrary proof of computation still faces efficiency challenges. Beyond that, efficient proofs of random memory access remain a challenge. There is also an unresolved question: whether the O(n log n) blowup in prover time is a fundamental limit, or whether there is some way to make succinct proofs with only linear overhead, similar to bulletproofs (which, unfortunately, take linear time to verify). Existing schemes also carry the risk of bugs; in general, the problems lie in the details rather than the fundamentals.
In response, Bram Cohen also said that there are many exciting things happening in "arbitrary proofs of computation." The field is changing so rapidly that he would not dare to study it seriously for the next few years, until everything settles down. But in the end it will be amazing.
4. Code obfuscation
Compared with the rapid progress of the preceding technologies over the past five years, progress on code obfuscation has been slow.
Vitalik said that a code obfuscation solution would be extremely useful for blockchain protocols, though the application scenarios are subtle, because one must handle the possibility that an obfuscated program on the chain is copied and run in an environment different from the chain itself, among other situations. One application of great personal interest to him is replacing the centralized operator in an anti-collusion gadget with an obfuscated program containing some proof of work, making it very expensive to run the program many times with different inputs in an attempt to determine the private actions of individual participants.
Unfortunately, this remains a very hard problem, and much work is still needed to solve it. On the one hand, there are constructions that reduce the number of assumptions on mathematical objects whose existence we are not sure of (such as cryptographic multilinear maps); on the other hand, there are attempts to build practical implementations of the desired mathematical objects. However, all of these paths are still far from yielding something viable and known to be secure.
And Bram Cohen said that for "code obfuscation," he does not know why you (Vitalik) would want that. ZK techniques cover almost all the use cases in practice. If there were a protocol that could turn any trusted setup into an untrusted one, that would be amazing, but it is still a theoretical pipe dream.
Recent improvements have gone from 2^(10^4) to 2^(10^3), though, so that is progress.
5. Hash-based cryptography
Since 2014, two major advances have been made here. SPHINCS, a "stateless" signature scheme (meaning that signing multiple times does not require remembering state such as a nonce), was released shortly after this "hard problems" list was published, and provides a purely hash-based signature scheme about 41 kB in size. In addition, STARKs have been developed, and signatures of similar size can be built from them. Five years ago, I (Vitalik) had not expected that hashes could be used not only for signatures but also for general-purpose zero-knowledge proofs, and I am very happy this happened. That said, size is still a problem, and ongoing progress keeps reducing proof sizes, although further reduction appears to be slow.
The main unsolved problem in hash-based cryptography is aggregate signatures, similar to what BLS aggregation makes possible. It is known that we can simply STARK over many Lamport signatures, but this is inefficient; a more efficient scheme would be welcome.
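(Note: for readers unfamiliar with hash-based signatures, the sketch below is a minimal Lamport one-time signature, the primitive Vitalik mentions STARKing over; it is an illustration only, not SPHINCS. Each key pair may sign exactly one message, which is the restriction that schemes like SPHINCS remove.)

```python
# Minimal Lamport one-time signature over a 256-bit message hash.
# Illustrative only: one key pair must never sign two different messages.
import hashlib, os

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg):
    digest = H(msg)
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one of the two secret values per message bit.
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg, sig):
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, msg_bits(msg))))

sk, pk = keygen()
sig = sign(sk, b"hello")
print(verify(pk, b"hello", sig), verify(pk, b"tampered", sig))  # True False
```

The signature is 256 values of 32 bytes each, about 8 kB, which gives a feel for why a stateless, many-time scheme like SPHINCS lands at around 41 kB.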
Bram Cohen believes that for "hash-based cryptography," if we assume this mainly refers to hash-based signatures, then we already understand well what they can do and what their limits are, and there is no reason to believe any meaningful improvement is coming.
Consensus theory problems
6. ASIC-resistant proof of work (PoW)
On ASIC-resistant proof-of-work algorithms: Ethereum uses Ethash. Vitalik said Ethash has proven quite successful at resisting ASICs. After three years and billions of dollars in block rewards, ASICs do exist, but their power and cost efficiency is at best 2 to 5 times that of GPUs. ProgPoW has been proposed as an alternative, but there is a growing consensus that ASIC-resistant algorithms inevitably have a limited lifespan, and that ASIC resistance has downsides because it makes 51% attacks cheaper.
In addition, Vitalik believes it is possible to create PoW algorithms that provide a medium level of ASIC resistance, but such resistance is limited, and both ASIC-resistant and ASIC-friendly PoW have drawbacks. In the long run, the better choice for blockchain consensus is proof of stake. (As is well known, Ethereum will eventually move from PoW to PoS.)
On this point, Bram Cohen differs sharply from Vitalik. He argues that ASIC-resistant proof of work is both a pipe dream and a bad idea. ASIC friendliness, which makes the hardware more of a commodity, is the better idea, because when ASIC resistance inevitably fails, it only produces more centralization of manufacturing.
(Note: Ethash is a memory-hard algorithm; it aims to achieve ASIC resistance by making memory access a significant part of the PoW computation.)
FYI, if you want to be ASIC friendly, the best approach is to iterate SHA-3 100 times. If you really want to go thermonuclear, use proofs of space and time, as long as someone does the work on the technology…
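(Note: the sketch below shows what "iterate SHA-3 100 times" could look like as a proof of work; it is our illustration of Bram Cohen's one-line suggestion, with a toy difficulty target, not production code.)

```python
# Sketch of a commodity-hardware-friendly PoW: iterate SHA-3 100 times
# over header+nonce and require the result to fall below a target.
# The difficulty is a toy value chosen so the demo finishes quickly.
import hashlib
from itertools import count

def pow_hash(header, nonce, rounds=100):
    data = header + nonce.to_bytes(8, "big")
    for _ in range(rounds):
        data = hashlib.sha3_256(data).digest()
    return data

def mine(header, target):
    for nonce in count():
        if int.from_bytes(pow_hash(header, nonce), "big") < target:
            return nonce

target = 2**256 >> 12  # roughly 4096 expected attempts
nonce = mine(b"block header", target)
print(nonce, pow_hash(b"block header", nonce).hex())
```

Verifying a solution takes a single pow_hash call while mining takes thousands, the hard-to-compute, easy-to-verify asymmetry every PoW needs.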
It states that "useful proof of work" is even more severe than ASIC resistance, and the biggest attack vector is not even mentioned here, that is, if the problem is suggested by the user, then the attacker can put the problem where they happen to have a good solution local.
7. Useful proof of work (PoW)
Vitalik believes a proof-of-work algorithm needs many properties: 1. it is hard to compute and easy to verify; 2. it does not depend on large amounts of external data; 3. it can be computed efficiently in small chunks.
He states that not many useful computations retain all of these properties, and most computations that do have all of them are "useful" for far too short a time to build a cryptocurrency around them.
However, there is one possible exception: zero-knowledge proofs. Zero-knowledge proofs of aspects of blockchain validity (for a simple example, data availability) are hard to compute and easy to verify. Moreover, they are durably hard to compute: if proofs of "highly structured" computation become too easy, one can simply switch to verifying the entire state transition of the blockchain, which becomes extremely expensive because of the need to model the virtual machine and random memory accesses.
8. Proof of stake (PoS)
On the consensus mechanism Ethereum ultimately chose, Vitalik said that by the end of 2014 it was clear to many in the proof-of-stake community that some form of "weak subjectivity" is unavoidable. To maintain economic security, a node needs to obtain a recent checkpoint from outside the protocol when it syncs for the first time, and again if it has been offline for more than a few months.
Many PoW advocates see this as a big risk and still insist on PoW, because in a PoW chain the "head" of the chain can be discovered with the client software itself as the only trusted data source. PoS advocates, however, are willing to take the risk, since the added trust requirement is small; in any case, the path to proof of stake through long-duration security deposits became clear.
Vitalik said that Ethereum 2.0 (the chain that will implement Casper FFG) is being implemented and has made great progress. In addition, Tendermint, in the form of the Cosmos chain, has been running for several months. In his view, the remaining arguments about proof of stake concern optimizing incentives and further formalizing strategies for responding to 51% attacks. In addition, the Casper CBC specification could still use concrete efficiency improvements.
Bram Cohen believes proof of stake is still a bad idea. It starts by fundamentally weakening the security model, and from there runs into a series of deep technical problems. Such progress as has been made is more about building a decent BFT system than about true proof of stake.
9. Proof of storage
Currently, several planned blockchains use proof-of-storage protocols, including Chia and Filecoin.
Vitalik believes that although proof of storage has made considerable theoretical progress, it still lacks real-world evaluation. His main open question is whether these algorithms will in practice be dominated by smaller users with spare storage capacity, or by large mining farms.
On this, Bram Cohen said that this means the data must exist for several years, during which time all of the hardware it is stored on may be scrapped and recycled, the technology stacks people use today may be completely replaced, and the relative supply and demand of storage may change so much that old data gets deleted to make room for the new.
No amount of error-correcting coding helps with keeping duplicate copies around. In fact, redundancy makes things worse by reducing total capacity and increasing cost.
If you have a group of people in a room, the chance of all of them dying within one year decreases exponentially with the number of people. Five hundred years from now, though, there will not be many of them left.
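(Note: the room analogy translates into a short calculation; the numbers below are hypothetical, ours rather than Bram Cohen's. With n independent replicas, the chance that all are lost within a year shrinks exponentially in n, yet over centuries the expected number of survivors still decays toward zero.)

```python
# Hypothetical numbers illustrating the replica-survival argument.
def all_lost_in_one_year(n, annual_loss_prob):
    # Probability every one of n independent replicas is lost this year.
    return annual_loss_prob ** n

def expected_survivors(n, annual_loss_prob, years):
    # Expected replicas still intact after the given number of years.
    return n * (1 - annual_loss_prob) ** years

print(all_lost_in_one_year(20, 0.1))      # 1e-20: vanishingly small
print(expected_survivors(20, 0.1, 500))   # ~1e-22: essentially none left
```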
Economic problems
10. Stablecoins
On stablecoins, Vitalik pointed to MakerDAO. MakerDAO has now issued more than $100 million of its synthetic stable token DAI. It has become a backbone of the Ethereum ecosystem, and many Ethereum projects have integrated, or are integrating, with it. Other synthetic token projects, such as UMA, are also developing rapidly.
Vitalik believes the risks MakerDAO faces lie in collateral price crashes and oracles.
In the past, Bitcoin has fallen 75% in two days; the same could one day happen to Ether or any other collateral asset. Meanwhile, a malicious attack on the underlying blockchain is an even greater untested risk, one that could be compounded by the very price drops such an attack is expected to cause. Another potentially bigger challenge is that the stability of systems like MakerDAO depends on the underlying oracle scheme. Different approaches to oracles do exist, but whether they can hold up under great economic pressure remains an open question.
So far, the collateral controlled by MakerDAO is worth less than the MKR token; if this relationship ever reverses, MKR holders would have a collective incentive to try to "loot" the MakerDAO system. There are ways to guard against such attacks, but they have not been tested in the real world.
In response, Bram Cohen stated that since "stable-value crypto assets" and stablecoins are the same thing, their problems come down to the legal and financial situation of the backing entity. A stablecoin without backing is not, and never will be, a thing, for reasons that are obvious and not worth explaining.
Not only that, he also believes that Vitalik's other views under the "Economics" heading are badly wrong-headed. The whole point of Nakamoto consensus is to avoid governance by humans. Governance is a step backwards, not forwards.
11. Decentralized public goods incentives
On incentive mechanisms, Vitalik believes there has been no major breakthrough in this field.
He proposes two classes of potential solutions: 1. trying to elicit individual contributions by giving people social rewards (for example, charity via marginal price discrimination); 2. collecting funds from applications that have network effects.
A concrete version of the second approach is to issue tokens: by issuing tokens and collecting transaction fees, rents, or other charges (including tax-like or organizational fees), an application with network effects can raise funds.
In response, Bram Cohen said that almost all the suggestions on how to do governance come across as out of the dark ages. Here's an idea: have a "corporation" with "shareholders" and a "business model" that generates "profits" which it pays out to them.
Ongoing operations can include developing and maintaining open source software. Obscure examples of companies doing that at scale include Redhat, IBM, Microsoft, Google, and Facebook; maybe you have heard of them?
Beyond the points above, Vitalik also revisited problems from the original list that have been all but forgotten over these five years, including: 12. reputation systems (slow progress); 13. proof of excellence (a token distribution mechanism; no progress, almost forgotten).
The following are the last three problems raised by Vitalik; Bram Cohen did not comment on them:
14. Decentralized contribution metrics
Unfortunately, incentivizing the production of public goods is not the only problem that centralization solves. Another is determining, first, which public goods are worth producing, and second, to what extent a given effort actually accomplished that production. This challenge concerns the latter.
Status: Some progress, with a shift in focus
Recent work on determining the value of public-good contributions does not try to separate (1) determining the tasks from (2) determining how well they were completed, because in practice the two are hard to separate. The work done by particular teams tends to be non-fungible and subjective, so the most reasonable approach is to treat the relevance of a task and the quality of its performance as a single bundle and evaluate both with the same techniques.
Fortunately, much progress has been made here, notably the discovery of quadratic funding. Quadratic funding is a mechanism in which individuals can donate to projects, and then, based on the number of donors and the amounts donated, a formula is used to calculate how much they would have donated if they had been perfectly coordinated with one another (that is, if they had taken each other's interests into account and not fallen into a tragedy of the commons). For any given project, the difference between the amount that would have been donated and the amount actually donated is paid to the project as a subsidy from a central pool. Note that this mechanism focuses on satisfying whatever some community values, rather than on meeting some fixed goal regardless of whether anyone cares about it. Because of the complexity-of-value problem, this approach is likely to be much more robust to unknown unknowns.
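(Note: the formula can be stated compactly: a project's ideal funding level is the square of the sum of the square roots of the individual contributions, and the subsidy is the gap between that and the raw total. The sketch below is ours, with made-up contribution figures; real deployments such as Gitcoin additionally scale subsidies to fit a fixed matching pool.)

```python
# Minimal quadratic funding sketch. Ideal funding = (sum of sqrt(c_i))^2;
# the matching subsidy is ideal funding minus the raw sum of donations.
# Contribution amounts are made up for illustration.
from math import sqrt

def qf_subsidy(contributions):
    ideal = sum(sqrt(c) for c in contributions) ** 2
    return ideal - sum(contributions)

many_small = [1.0] * 100  # 100 donors giving $1 each
one_large = [100.0]       # one donor giving $100

print(qf_subsidy(many_small))  # 9900.0: broad support is heavily matched
print(qf_subsidy(one_large))   # 0.0: a lone donor receives no match
```

The two example outputs show why the mechanism rewards breadth of support rather than raw totals, and also why Sybil attacks (one donor pretending to be many) are its central weakness.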
In practice, quadratic funding has seen considerable success in the recent Gitcoin quadratic funding rounds. There has also been progress on improving quadratic funding and similar mechanisms; for example, pairwise-bounded quadratic funding mitigates collusion. There has also been work on standardizing and implementing bribe-resistant voting technology that prevents users from proving to third parties how they voted; this blocks many kinds of collusion and bribery attacks.
15. Anti-Sybil systems
This problem is somewhat related to reputation systems, and the challenge is to create a "unique identity system": a system for issuing tokens that prove an identity is not part of a Sybil attack. We want something better and more egalitarian than "one dollar, one vote"; arguably, "one person, one vote" would be the ideal.
Status: Some progress. There have been many attempts at solving the unique-human problem.
With growing interest in techniques such as quadratic voting and quadratic funding, the need for some kind of human-based anti-Sybil system is also growing. Hopefully, continued development of these and newer techniques can meet that need.
16. Decentralized measurement of real-world variables
Problem: devise and implement a decentralized method for measuring numerical real-world variables. The system should be able to measure any numerical quantity on which humans can currently reach rough consensus (e.g., the price of an asset, the temperature, the global CO2 concentration).
Status: Some progress
This is now commonly called the "oracle problem." The largest known instance of a running decentralized oracle is Augur, which has processed outcomes for multi-million-dollar bets; token-curated registries, such as the Kleros TCR, are another example. However, these systems have still not seen a real-world test of their fork mechanisms, whether due to a highly contentious question or to an attempted 51% attack. There is also research on the oracle problem outside the blockchain space, in the form of "peer prediction"; see here (https://arxiv.org/abs/1911.00272) for the latest developments in this field.
Another looming challenge is that people want to rely on these oracle systems to direct transfers of assets whose value exceeds the economic value of the system's token. In that situation, token holders in theory have an incentive to collude to report wrong answers and steal the funds. The system would then fork, and the tokens of the original system would likely become worthless, but the original token holders would still keep the gains from whatever asset transfers they misdirected. Stablecoins are a particularly severe case of this. One approach is to build systems that assume altruistic, honest data providers exist and create mechanisms to identify them, allowing them to churn only slowly, so that if malicious data providers start getting voted into the oracle, users of the systems that rely on it can first complete an orderly exit. Either way, further development of oracle technology is a very important problem.
List of new issues
If I were writing the list of challenges again in 2019, some of the above would carry over, but the emphasis would change significantly, and major new problems would appear. Here are a few picks:
- Cryptographic obfuscation
- Anti-collusion infrastructure
- Oracles
- Homomorphic encryption and multiparty computation (practicality still needs improvement)
- Decentralized governance mechanisms
- Fully formalized responses to PoS 51% attacks
- More sources of funding for public goods
- Reputation system
Finally, Vitalik said that, in general, problems at the base layer are slowly but surely decreasing, while problems at the application layer are only just getting started.
Link to Vitalik's original article:
https://vitalik.ca/general/2019/11/22/progress.html
Link to Bram Cohen's comments:
https://twitter.com/bramcohen/status/1198787471175106560