Decentralized Network Governance: A Psychological Analysis of What Motivates Nodes to Cooperate
Decentralized networks are attracting more and more attention, but decentralized systems only run if their participants cooperate. If Richard Hendricks' dream of a new Internet is to come true, we will have to rely on friends, neighbors, and even strangers to store our information and connect us with the outside world. That demands an extraordinary degree of cooperation from the community. But can we rely on networks of people, some or all of whom are self-interested, to faithfully relay information? Call me an optimist, but I think humans actually have the necessary "ingredients" to make this work. Here I outline some of our evolved moral structures and how they can be adapted to fit decentralized networks.
Cooperation is a win-win interaction between people, in contrast to transactions in which one person profits at another's expense (cooperation is neither pure altruism nor free-riding). One of the most common forms of cooperation is direct reciprocity: "you give me this, I'll give you that." Reciprocity is an incredibly strong moral code in humans. Benjamin Franklin famously observed that if you want to make a friend, ask someone for a favor. It works because it makes future interactions more likely: the person who helped you is more likely to come back and ask you to return the favor. This back-and-forth keeps the social balance sheet open. Salespeople and charities have also learned to apply our reciprocity instinct effectively. Perhaps you have received personalized address labels or a cute kitten calendar in the mail, a "gift" that arrives with a pre-stamped donation envelope.
This reciprocity creates social capital and helps us all survive. When we return the help of others, or when we cooperate to accomplish something none of us could do alone, we all come out ahead. Primate studies show that fair division of resources is often used to pay collaborators and encourage future interactions. For example, primates that hunt in groups can bring down more meat together than any individual could alone, and fair compensation ensures that they will still have hunting partners in the future. No monkey wants to include a selfish, hoarding monkey in the next hunt. And humans have evolved this reciprocity to an even greater degree.
However, as this example may suggest, unconditional cooperation is not a foolproof strategy for ensuring one's own survival (or even the species'). Unconditional cooperation is exploitable: it takes only a few free riders to destroy its effectiveness and drain the resources of the "always cooperate" players. Napster's successor, Gnutella, suffered exactly this fate. Gnutella did away with the centralized directory that Napster relied on; instead, its user base cooperated to serve files to one another on request. Unfortunately, free riders are enough to break such a partnership, and in Gnutella's case most users requested music but never uploaded any. The burden on cooperative users became too heavy, and Gnutella collapsed. It was built on a mistaken premise about human nature: we don't always cooperate.
In fact, humans do not always cooperate even when it is in our best interest. Game theory illustrates the problem with the prisoner's dilemma. Two partners in crime are arrested on a minor charge and interrogated separately. Each is told that if he betrays his partner on a bigger crime (and the partner stays silent), he walks free. If both stay silent, each gets only a light sentence on the minor charge. But if both betray each other, both face long prison terms. In a single-round prisoner's dilemma, people tend to defect on their partner, which produces the worst joint outcome: both prisoners are sentenced to long terms.
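The logic of the dilemma can be made concrete with a tiny payoff table. This is a sketch with illustrative sentence lengths (the specific numbers are my own, not from the article); the point is only that betrayal is the individually rational move no matter what the partner does.

```python
# One-shot prisoner's dilemma. Payoffs are years in prison (lower is
# better). The exact numbers are illustrative; only their ordering matters.
SENTENCE = {
    ("silent", "silent"): (1, 1),    # both take the minor charge
    ("silent", "betray"): (10, 0),   # the betrayer walks free
    ("betray", "silent"): (0, 10),
    ("betray", "betray"): (8, 8),    # both get long sentences
}

def best_response(partner_choice):
    """Whatever the partner does, betraying yields a shorter sentence."""
    return min(("silent", "betray"),
               key=lambda me: SENTENCE[(me, partner_choice)][0])

# Betrayal strictly dominates, so rational one-shot play lands both
# prisoners in the bad joint outcome (8, 8) instead of (1, 1).
print(best_response("silent"))   # "betray"
print(best_response("betray"))   # "betray"
```

Note that mutual betrayal (8 years each) is far worse for both than mutual silence (1 year each), which is exactly why the one-shot outcome is called a dilemma.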
Perhaps unsurprisingly, behavior in the prisoner's dilemma changes dramatically when players are asked to play multiple rounds with the same partner. In other words, we are far more likely to cooperate with people we expect to interact with again. From an evolutionary perspective, we want to keep our group happy because our survival depends on it to a certain extent. But if cooperation can be exploited, what is the right balance?
The mathematical psychologist Anatol Rapoport developed an algorithm that won an iterated prisoner's dilemma tournament and largely solved the problem of when to cooperate. His solution follows a familiar principle: tit-for-tat. Under two simple rules, cooperation between partners yields the greatest benefit: (1) cooperate the first time you meet a stranger; (2) in every subsequent interaction, copy what the other player did in the previous round. That's it. If two partners both follow this algorithm, they will cooperate forever. If one of them deviates, they are punished until they change their ways. As a result, partners who initially refuse to cooperate are incentivized to cooperate in future interactions.
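The two rules above fit in a few lines of code. This is a minimal sketch of the strategy and a toy match runner (the function names and round count are my own choices, not from the article):

```python
def tit_for_tat(partner_history):
    """Rapoport's strategy: cooperate first, then mirror the partner's
    previous move. `partner_history` is the list of the partner's moves."""
    if not partner_history:
        return "cooperate"     # rule 1: open with cooperation
    return partner_history[-1]  # rule 2: copy the partner's last move

def play(strategy_a, strategy_b, rounds=6):
    """Run an iterated game; each strategy sees only the other's history."""
    hist_a, hist_b = [], []
    for _ in range(rounds):
        move_a = strategy_a(hist_b)   # A reacts to B's past moves
        move_b = strategy_b(hist_a)   # B reacts to A's past moves
        hist_a.append(move_a)
        hist_b.append(move_b)
    return hist_a, hist_b

# Two tit-for-tat players lock into permanent cooperation:
a, b = play(tit_for_tat, tit_for_tat)
print(a)   # ['cooperate', 'cooperate', 'cooperate', ...]

# Against an unconditional defector, tit-for-tat is exploited exactly
# once, then retaliates for the rest of the match:
a, b = play(tit_for_tat, lambda history: "defect")
print(a)   # ['cooperate', 'defect', 'defect', ...]
```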
Bram Cohen built the popular BitTorrent protocol around this tit-for-tat principle. Gnutella transferred whole files between nodes, making each exchange effectively a one-shot prisoner's dilemma. By splitting files into small chunks, BitTorrent turns a file transfer into an iterated prisoner's dilemma. As users download pieces of a target file from a given swarm, they simultaneously upload pieces to the peers who have recently given them the most data. Unlike Gnutella, BitTorrent succeeded in coordinating cooperative file transfers, and at one point was claimed to account for 43% of Internet traffic.
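The reciprocation rule described above can be sketched in a few lines. This is a toy illustration of the idea of rewarding your best recent uploaders, not the real BitTorrent choking algorithm (which also adds optimistic unchoking and rolling rate measurements):

```python
def pick_upload_peers(received_bytes, slots=4):
    """Chunk-level reciprocity: upload to the peers that have recently
    uploaded the most data to us.
    received_bytes: {peer_id: bytes received from that peer recently}.
    Returns up to `slots` peer ids, best reciprocators first."""
    ranked = sorted(received_bytes, key=received_bytes.get, reverse=True)
    return ranked[:slots]

recent = {"peer_a": 4096, "peer_b": 512, "peer_c": 2048,
          "peer_d": 0, "peer_e": 1024}
print(pick_upload_peers(recent))
# ['peer_a', 'peer_c', 'peer_e', 'peer_b']  (peer_d gave nothing and gets nothing)
```

Because the decision is recomputed as transfers proceed, a peer that stops giving chunks quickly stops receiving them, which is the repeated-game punishment that made tit-for-tat work.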
But the problem of peer-to-peer cooperation is not completely solved. These "cooperative" algorithms remain exploitable if the assumption they rest on, repeated interaction, is violated. One way this assumption breaks down in P2P settings is that once the group is large enough, a free rider can effectively sidestep the repeated dilemma altogether. Reciprocity carries little force in large groups, because you can defect against a partner without ever worrying about running out of "suckers" to interact with. If everyone follows tit-for-tat's first rule, a free rider can exploit that initial cooperation and simply never interact with the same peer twice.
Indeed, we have seen exactly this exploit used against BitTorrent: by continually searching for a new swarm from which to download the next needed file chunk (without ever giving a chunk back), you can download files without uploading anything. BitThief is an example. The Internet is an enormous group, and there is no shortage of cooperative "suckers" on it. And by definition, decentralized networks require large numbers of participants.
One option is for large groups to develop a collective morality that punishes those who harm others, not just those who harm oneself. But to do this, humans need to be able to detect free riders. It turns out humans are highly sensitive to detecting cheaters, more sensitive than to similar unfair outcomes caused by accident. For example, babies as young as 15 months are surprised to see goods distributed unequally (i.e., when people who did the same amount of work receive different rewards), but are not surprised when the unfair outcome arises unintentionally (e.g., when physical constraints of the environment produce an "unfair" distribution).
But detecting cheaters is not enough to stop them; they must also be punished. In fact, individuals interacting in large groups gain an advantage by enforcing third-party punishment. In other words, humans evolved to punish wrongdoers even when the wrongdoing was not directed at themselves. Psychological research bears this out: humans are not only willing to punish third-party bad actors, they are willing to pay a personal cost to do so, and to correct third-party free riders despite those costs.
A unique challenge the digital world faces in identifying and punishing cheaters is reputation management. Individuals in large groups can shield themselves from third-party punishment when their identity is unknown. Two common ways of gaming online reputation are Sybil attacks (one person creates many accounts to inflate their apparent popularity) and whitewashing (a user opens a new account to shed the bad reputation attached to an old one). Decentralized identity platforms such as Sovrin or Iris may help by making it difficult to mint fresh identities, although no platform is likely to be completely immune to these reputation-management schemes.
Recently, many have proposed that nodes in a decentralized network be required to stake some kind of token that can be "burned" or "slashed" as punishment for bad actors or free riders. The logic is that people will not free-ride or defect if doing so carries a real personal cost. Alongside digital identity, slashing offers a potential remedy for bad actors, because in theory a malicious node's tokens can be burned before it has a chance to exit the platform. But such proposals need careful thought, especially for social networks.
Humans are prone to retaliation, and vandals may respond by burning the tokens of innocent people or of those who punished them. This could spiral into reciprocal token-burning and antisocial behavior, roughly the opposite of what a decentralized network needs. Requiring the punisher to bear a cost themselves may curb gratuitous burning, although even this should be implemented with caution. As noted earlier, people do not mind paying a small price to punish bad actors, and on a network the pool of potential punishers is enormous, so punishment piled on by many third parties could become unimaginably harsh. In a sense, punishment should be local: the free rider is penalized only within the relationship between offender and punisher, not globally across the protocol.
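A minimal sketch of what "costly, local punishment" might look like, assuming a hypothetical staking design (the `Node` class, the 25% cost ratio, and the per-relationship `grudges` ledger are all my own illustrative choices, not any real protocol):

```python
class Node:
    """A staked network participant in a toy slashing scheme."""

    def __init__(self, name, stake=100):
        self.name = name
        self.stake = stake
        self.grudges = {}   # per-partner penalty ledger, local to this node

    def punish(self, offender, amount, cost_ratio=0.25):
        """Burn `amount` of the offender's stake. The punisher pays a
        fraction of that amount too, which deters gratuitous or
        retaliatory burning. The record of the offense stays local:
        only this node's ledger is updated, not a global reputation."""
        self.stake -= amount * cost_ratio   # punishment is costly to enact
        offender.stake -= amount            # offender's tokens are burned
        self.grudges[offender.name] = self.grudges.get(offender.name, 0) + amount

alice, bob = Node("alice"), Node("bob")
alice.punish(bob, 20)
print(alice.stake, bob.stake)   # 95.0 80  (alice paid 5 to burn 20 of bob's)
print(alice.grudges)            # {'bob': 20}
```

The cost to the punisher keeps third-party punishment from becoming free and therefore unboundedly harsh, while the local ledger keeps the penalty scoped to the offender-punisher relationship.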
Another consideration in detecting bad actors is tolerance for error. Humans are notoriously bad at reading tone and intent in online communication. It is easy to imagine one party misreading the other's intent and burning their tokens over a simple miscommunication. Similarly, an unexpected outage can look like free-riding: if a node participating in the BitTorrent protocol crashes repeatedly, honest nodes might conclude it is running BitThief and expel it from the network.
Fortunately, our evolved morality also offers a hint about how to handle these cooperative "accidents," and how former defectors can return to cooperation: forgiveness. Forgiveness is ubiquitous in human social life, and it is also common among chimpanzees and bonobos. The primatologist Frans de Waal notes that primates often reach out to each other after a fight and offer comfort in the form of an embrace. Typically the reconciliation is initiated by the misbehaving party (or, in our setting, by the node whose suspicious behavior was going offline). There is computational evidence for the effectiveness of forgiveness as well: simulation studies show that resources are maximized in groups that implement some form of forgiveness.
In fact, as far as I know, forgiveness is the only performance-improving adjustment to tit-for-tat. It is called "generous tit-for-tat," and it is not hard to see why it helps in a real, messy world of misunderstandings and hardware failures. Imagine you are playing an iterated prisoner's dilemma. Things are going well; you and your partner are both following tit-for-tat. Then, suddenly, you drop offline. When you come back online and interact with your peer, something strange happens: they defect on you. What you don't realize is that while you were offline, they concluded you had defected on them, so they are now punishing your apparent free-riding. Since you are both playing tit-for-tat, you fall into an endless loop of mutual defection. Generous tit-for-tat provides a simple escape: if your partner defects, retaliate most of the time, but occasionally cooperate anyway. This dose of forgiveness lets partners resume cooperation and reap mutual benefits again.
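The defection spiral and its cure can be shown in a small simulation. This is a sketch under assumed parameters (the 5% "noise" standing in for crashes, the 30% forgiveness rate, and the round count are all illustrative choices of mine):

```python
import random

def make_gtft(forgiveness, rng):
    """Generous tit-for-tat: mirror the partner's last move, but after
    a defection cooperate anyway with probability `forgiveness`.
    forgiveness=0.0 reduces this to plain tit-for-tat."""
    def strategy(partner_history):
        if not partner_history or partner_history[-1] == "C":
            return "C"
        return "C" if rng.random() < forgiveness else "D"
    return strategy

def cooperation_rate(forgiveness, rounds=500, noise=0.05, seed=7):
    """Self-play with transmission noise: each intended move is flipped
    with probability `noise` (an offline node looks like a defector).
    Returns the fraction of moves that were cooperative."""
    rng = random.Random(seed)
    strat = make_gtft(forgiveness, rng)
    hist_a, hist_b, coop = [], [], 0
    for _ in range(rounds):
        a, b = strat(hist_b), strat(hist_a)
        if rng.random() < noise:
            a = "D" if a == "C" else "C"   # accident: a's move garbled
        if rng.random() < noise:
            b = "D" if b == "C" else "C"
        hist_a.append(a)
        hist_b.append(b)
        coop += (a == "C") + (b == "C")
    return coop / (2 * rounds)

# Strict tit-for-tat gets stuck in defection echoes after a noisy flip;
# a generous version repeatedly recovers cooperation.
print(cooperation_rate(forgiveness=0.0))   # noticeably lower
print(cooperation_rate(forgiveness=0.3))   # noticeably higher
```

Averaged over several seeds, the forgiving strategy sustains a substantially higher cooperation rate under the same noise, which is exactly the simulation result the article alludes to.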
A brief word on "trustlessness." I like the idea, but I doubt our current ability to deliver truly trustless algorithms. Although cryptocurrencies are advertised as trustless, they are not (yet). Setting aside 51% attacks, double spending remains a risk for relatively recent transactions, in which a rival branch can overtake the main chain; hence the advice to wait six blocks before handing over a good or service. Even if all the many cryptocurrency attacks were fixed, such practices reveal what transactions look like in a low-trust world. This is partly why multi-signature schemes and transaction escrow services have emerged, and in my view these intermediaries are themselves evidence of a trust problem. As long as decentralized networks are used to enhance connections between people, trust will remain relevant.
Many decentralized network technologies are new, and I am not sure trustlessness can ever fully be achieved. What I can say is that we humans evolved to use trust as a tool for expanding our potential through cooperation, and it has carried us a long way. Even as we gradually improve our "trustless" algorithms, I hope some of the evolved moral and social tendencies I have outlined here can be built into them. We humans are very good at navigating networks. After all, we were born into one.
Source: Hackernoon
Author: Amber Cazzell
Translation: Bitker Institute
Website: https://hackernoon.com/psychology-of-the-dweb-fe61039ob
Disclaimer: This article was translated by the Bitker Institute, which focuses on theoretical exploration of the blockchain industry, technology development, and secondary-market analysis of digital currencies.