Web 3.0's Business Model Transformation and the Investment Logic of Token Economics

Business models have transformed repeatedly since the birth of the computer: from selling hardware, to selling licensed software, to collecting data on top of open source. Giants from IBM to Microsoft to Facebook have each ridden one of these waves, and future business logic will rest more on three pillars: edge computing, machine learning, and distributed network architecture. The new forms of participation and investment logic that result are worth the consideration of every network participant today.

1. Platform architecture and transformation of business logic

1.1 Architecture History

We are at the beginning of a paradigm shift in software architecture: the first wave of decentralized data network movement.

The movement we have witnessed over the past few years has quietly grown beyond Bitcoin and other crypto assets, and even beyond open-source software and blockchains. Viewed broadly, it is built on open peer-to-peer data networks, reflects the power of well-tuned economic incentives, and is beginning to tap the previously overlooked stores of personal data in everyone's life. With the spread of high-speed wireless broadband, the rapid maturation of cloud-native software, and the recent surge in machine learning, this movement is having a far-reaching impact.

Over the past few decades, advances in technology architecture have commoditized operating systems and software packages, making them accessible globally through data centers and cloud infrastructure. In this new wave, the data center is expanding to the edge of the network, the data itself is being "open-sourced", and commercial components are becoming reusable, reliable building blocks. Distributed users and machines interact with data over a peer-to-peer network layer. These peer-to-peer data networks become a fabric that verifies and manages information without the need for third parties, while giving individual users access to their own data in a usable, secure, and scalable way.

Although Google has quietly given up its "don't be evil" motto, we are more interested in a new software architecture whose motto is "can't be evil".

1.2 Personal sovereignty

By giving users sovereignty over their own data, the distributed-data-network wave is breaking open the huge data silos that many companies treat as their lifeline. Shaken by the Equifax breach, by Cambridge Analytica's harvesting of personal data from 87 million Facebook accounts, and by the recent leak of a further 50 million user logins, users are tired of entrusting their personal data to these centralized silos. Google may have quietly retired its "don't be evil" motto, but the new network architecture's motto is "can't be evil": all users control their data locally, promoting the rise of individual sovereignty.

As Yuval Harari said at a recent TED conference, totalitarianism does not announce itself as plainly as it is often described; rather, it arrives as an attractive, simple solution to problems — one that may only seem foolish in hindsight. Harari argues that the concentration of data in certain commercial or government organizations may let them master and personalize technology to a degree unimaginable today, and that we need new organizational structures, data architectures, incentive structures, and technical architectures to offset this danger.

1.3 Transformation to people-centered computing

While a trust layer is a shared need of individuals and communities, building one that spans humans, hardware devices, and software transactions has proved difficult. As software's potential to serve us in increasingly intimate and personalized ways grows, so does the need for trustworthy relationships: a person sharing genetic and physiological data relies on an algorithm to weigh, in an instant, whether they are in danger, and even on the filtered news streams that form the factual basis of our daily decisions.

In "Sapiens", Yuval Harari also explores the view that, historically, our ability to conceptualize abstractions through language allowed strangers to cooperate around shared beliefs, and communities to rise on them. Technology now has the ability to abstract this trust by building the right incentives into the protocol layer, allowing cooperation and trade on a global scale.

The upcoming wave of decentralized data networks will shift us from zero-sum capitalism toward the compounding interests of cooperative communities. Distributed autonomous organizations built on blockchain technology can balance resilience with efficiency, and coordination with incentives, across a range of new domains. By replacing the interests of today's owners with the common interest of the network, and aligning incentives among creators, service providers, and users, they can change the top-down, corruption-prone control structures of current networks.

To understand where future value may arise, and where investors and entrepreneurs can focus their efforts most effectively, it is necessary to understand how technology advances the evolution of business models over the long run.

Business models derived from emerging technologies and their value capture, Source: Fabric Ventures

Let us pick up the story in the post-war American boom of the 1950s and 1960s. As US firms went multinational, technology companies' business models were designed primarily around the production of expensive proprietary computer hardware. As a result, computer users remained a small group: governments, large businesses, and wealthy individuals. As microprocessor production costs fell dramatically, a new computing architecture fundamentally changed the economics, shifting industry power from proprietary hardware systems to chipmakers and software companies. IBM's Tom Watson could not imagine why people would ever need many computers, but Microsoft's Bill Gates understood that every home would one day have a personal computer.

With access to personal computing democratized, a new wave emerged in the 1970s and 1980s as technology companies turned to selling cheap hardware bundled with licensed operating systems. With Microsoft's strong rise and its relentless pursuit of developers, consumers chose the platform with the most compatible applications; Windows spread across hardware providers, and nearly all software packages were unified under one roof. By 2000, Windows ran on more than 90% of all PCs sold, and Microsoft captured most of its value at the operating-system and application layers.

However, while Microsoft dominated the desktop, it never secured the servers in wiring closets and data centers — still the domain of the successful Unix workstation vendors of the 1980s (Sun, Silicon Graphics, and IBM). In the early 1990s, Linus Torvalds set out to undermine this expensive hegemony with a cheaper, more open alternative: Linux, an open-source Unix-like operating system for servers. Combining commodity hardware and Linux with the Apache web server, MySQL, and PHP enabled a new wave of technology. By 2012, Microsoft's share of the computing market had dropped to 20%, and by 2017 Linux-based Android held 85% of the mobile computing market.

This democratization of cheaper software, coupled with ubiquitous networking, prompted technology companies to shift their business model once more: offer free software and networks, and monetize the data they collect. Today's technology giants embraced open-source software and combined it with vast monopolistic silos of user data to build a competitive moat protecting trillions of dollars of market capitalization. But as incumbents face mounting questions over data usage, users are beginning to scrutinize ownership of their data, and governments are pushing broad data-protection regulation (such as Europe's GDPR).

After the democratization of hardware, operating systems, software, and networks, the new paradigm shift we observe will open up access to the data inside networks. As existing data silos break apart, the value of the data itself will surface and access to it will be commoditized. But the question remains: once the data monopolies are eroded or gone, what will a technology company's business model become?

Emerging technology architectures and their early adopters, Source: Fabric Ventures

To answer this question, we revisit the history of open-source software development: the motives behind it and the ways it has been monetized. From the beginning, the free software movement grew among privacy and security enthusiasts, hackers, and government entities who realized they could not commercialize their software. The movement rested largely on the moral belief that software should live in the open and be accessible to everyone.

Developers then realized that beyond giving software away, the open-source model fundamentally improved the development process itself. Communities formed around projects, reputation layers settled into those communities, and contributors, maintainers, and users grew exponentially. With the ability to distribute software worldwide, developers began forming companies that layered monetization on top of broad distribution. In 1993, Bob Young founded ACC, selling Linux and Unix accessories; it later became Red Hat. Around the same time, in 1994, Monty Widenius began work on MySQL, which became the M in the LAMP stack alongside Linux, Apache, and PHP/Python.

Over the past two decades, as large companies recognized the feasibility and advantages of open-source development, the world has come to depend on open-source software. React and React Native are maintained primarily by Facebook, while Google has made countless contributions to Android, Kubernetes, and Go. Microsoft, regarded as open source's chief antagonist less than twenty years ago, supported the largest number of open-source developers by 2017 and recently acquired GitHub for $7.5 billion. These giants embraced open source, mostly stopped charging for software, and built their entire businesses on monetizing user data.

Technology giants have created trillions of dollars in market capitalization using software they do not own and data that does not belong to them.

Unfortunately, in this third wave of open-source development, developers have lost the moral aspiration and romantic motivation that drove the first wave, and often share in neither the financial upside nor the reputational rewards that drove the second.

With Satoshi Nakamoto's Bitcoin white paper in 2008, we entered the fourth era of open-source software development: solving the "double-spend" problem and creating digital scarcity in a distributed system. This fundamental architectural breakthrough lets an open-source network reward and motivate contributors without a central agency or sponsor. Open platforms and trust-minimized peer-to-peer networks, combined with token-driven incentive and governance systems, have set off a Cambrian explosion of developers and ecosystems around open-source projects.

1.4 Paradigm shift driven by three major trends

Over the past two decades, the success of the Web 2.0 era was driven by three foundational technologies: cloud, social networking, and mobile. We believe technological innovation in the coming decades will be driven by the interplay of edge computing, machine learning, and distributed data: edge devices capture millions of data points, advances in machine-learning algorithms digest this wealth of data, and distributed data infrastructure enables secure, scalable communication, coordination, and fair incentives across the network.

Emerging networks built on edge computing, machine learning and distributed data architecture, Source: Fabric Ventures

These three technology waves will unlock vast amounts of data currently locked away for reasons of privacy, trust, or competition. In 2010, the world produced about one zettabyte of data. According to McKinsey, the world produced 16 zettabytes in 2016, but analyzed only 1% of it. By 2025, annual data generation is expected to exceed 160 zettabytes. Previously untapped data sources, privacy-preserving mechanisms, and more granular distribution will enable breakthroughs that still seem implausible today: personalized medicine based on genomic data, coordination of distributed autonomous agents, and new, untapped ways for data generators to monetize what they produce.

However, if today's siloed data structures are not upgraded quickly, the proliferation of available data and ever-more-efficient machine learning could soon lead us into a dystopian future of surveillance capitalism and surveillance politics. Data monopolies could not only predict our decisions but, by playing on our emotions, "benevolently" prescribe decisions we have not yet made — with high precision, and with aims that are neither transparent nor our own. A decentralized data architecture would not only prevent technology giants and other data monopolies from amassing such power, but also let individual participants improve their lives through this new wave of applications, protect their privacy, and earn an economic return on their participation.

1.5 Tokens and the novelty of cryptoeconomics

The fundamental problem that has plagued network architects throughout history can be reduced to a mismatch between where a network creates value and where the equity structure captures it. Equity derives its value from the future cash flows a central company can generate from customers at a profit. That works for companies selling goods and services: Apple selling premium hardware, or Netflix and Spotify selling monthly subscriptions. But applied to networks whose core value lies in low-cost distribution and user-generated content, the equity model creates a dangerous divergence of interests: Twitter struggles to monetize the content created by its user base, Facebook has had to turn to an almost dystopian panopticon model to monetize its users, and open-source networks have never managed to properly monetize their full value. When a community generates valuable content inside a network, the user turns from customer into product. The fundamental mismatch is that a central entity tries to capture all of the value created by a community of users who receive none of the economic upside.

By moving away from a central equity company managing the network, and instead modeling the network as a digital economy using a native token, we can both increase the value captured and assign it to its actual creators. Such a digital economy uses tokens as the expression of digital scarcity within the network, motivating distributed users, machines, and other participants to contribute and manage valuable resources, work, and time. Expressing a network's scarce resources (computing power, labor, content creation, or governance) as digital tokens makes them scalable and infinitely divisible. These tokens become programmable links between people and the assets they own — whether virtual (such as personal data) or physical (such as real estate). Through the new frontier of token economics — incentive design — tokens balance the intrinsic utility of the network among users, developers, resource providers (such as miners), and capital providers (such as investors). And because tokenization allows ownership itself to be reimagined at a macro level, even beyond purely digital assets, existing assets can gain liquidity, transparency, and accessibility, creating a new crypto capital market.

A few decades ago, we witnessed the shift of data and content from analog to digital distribution, which reshaped the entire process of creating, distributing, and monetizing digital content. Digital's impact on newspapers, television, and film is well documented: new giants such as Netflix and Spotify emerged, while companies like Blockbuster and Kodak were marginalized or disappeared. We are now convinced that just as content was digitized, ownership will be tokenized.

Digitization refers to content; tokenization refers to ownership.

1.6 Token types

In the token world, we divide tokens into three core categories by their characteristics: currencies and commodities, utility tokens, and security tokens. It is worth noting that any single token may exhibit several of these features at once, and may even change character over the life cycle of its underlying network. We summarize these features below:

Classification of tokens based on different characteristics, source: Fabric Ventures

  • Store-of-Value (SoV) tokens: rely on censorship resistance and peer-to-peer transferability to provide a store of value uncorrelated with any other market, commodity, or currency. Examples are Bitcoin, Monero, and Zcash, which differ slightly in transaction speed, network security, and privacy. These are the closest to true currencies; viewed through the quantity theory of money, their dynamics can be understood with the equation of exchange (MV = PQ).
  • Stablecoins: aim to strip volatility out of crypto assets and provide digital assets pegged to fiat currencies (such as the US dollar), used mainly as units of account and media of exchange. The three main categories are:

a) Centralized IOU issuance: price stability is maintained by a centrally held fiat reserve.

b) Collateral-backed: over-collateralized with crypto assets such as Ether.

c) Algorithmic: an algorithmic central bank maintains the peg by expanding and contracting supply.

  • Payment tokens: the simplest and most quickly iterated design, often enforced as the sole means of paying for the digital assets or services a network provides. They are thus closer to the currency of a digital economy, without being investable, liquid, or stable. In equilibrium they will be closer to a form of working capital, which users will minimize because of the opportunity cost of capital; most are therefore likely to end up circulating at very high velocity, depressing their price. And because open-source code is replicable and divisible, such models run a high risk of being forked and replaced by an equivalent protocol payable in a proper store-of-value token.
  • Security tokens: token representations of assets, from traditional commodities and stocks, to art, to virtual land in the form of crypto-collectibles. The former rely on strong guarantees of ownership of the underlying asset and can be valued from that asset's value plus a premium for liquidity, divisibility, and accessibility. The latter usually represent scarce digital assets — such as art or virtual real estate — valued by the creator's reputation, location in the digital landscape, and overall demand.
  • Governance tokens: let holders vote on how a network operates, where developers focus their work, and how the software should be upgraded. As the value of the network rises — the number of companies running on it, or the volume of transactions it handles — the ability to influence its development becomes a scarce resource; indeed, the price of voting rights in such networks may grow exponentially with the value they govern. This feature is typically combined with one of the other token designs.
  • Discount tokens: give the holder the right to a discount when purchasing the assets or services a digital network provides. Buying a discount token is akin to buying into a cooperative, with the right to a set discount percentage on all economic activity within the network. As the network's value and activity grow, token holders capture greater discount value — effectively a royalty that can only be claimed in the network's services, never in cash.
  • Work tokens: built on the idea that service providers need "skin in the game" to deliver high-quality work to the network. Whether the work is objective (such as providing computing resources) or subjective (such as qualitative ratings), service providers must stake a quantity of tokens in exchange for the right to perform profitable work. If the work is done correctly, the provider earns the fees paid by users (not necessarily in the native token); if the provider acts maliciously, their stake is slashed and distributed to other providers. As network usage grows, so does the amount of current and future profitable work, attracting more providers who wish to perform it. Demand for the work token therefore rises, and given its fixed supply, its price should increase with the use of the network.
  • Burn-and-mint equilibrium: built on two simple mechanisms: network users pay for services not with a direct fee but by burning tokens (denominated in dollar terms), while a continuous inflation schedule mints new tokens (denominated in the native token). Because each burn references the service provider being paid, providers receive a share of the newly minted tokens as payment. When platform usage grows and users burn more tokens than inflation mints, net supply falls and the price of each token should rise.
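To make the equation of exchange mentioned under store-of-value tokens concrete, here is a minimal sketch with entirely hypothetical figures (the function name and all parameters are our own, not from any specific valuation model). Rearranging MV = PQ gives an implied token price: M is the monetary base (supply × price), V the velocity, and PQ the annual on-chain economic activity in dollar terms.

```python
def implied_token_price(annual_gdp_usd, velocity, token_supply):
    """Implied price per token from the equation of exchange MV = PQ.

    annual_gdp_usd -- total value transacted on the network per year (P*Q)
    velocity       -- times each token changes hands per year (V)
    token_supply   -- circulating token supply
    """
    monetary_base = annual_gdp_usd / velocity  # M = PQ / V, in USD
    return monetary_base / token_supply        # price = M / supply

# Hypothetical network: $1B of annual transactions, velocity 20,
# 100M tokens in circulation.
print(implied_token_price(1_000_000_000, 20, 100_000_000))  # 0.5
```

Note the implication the text draws for payment tokens: doubling velocity halves the implied price, which is why high-velocity tokens struggle to hold value.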
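The burn-and-mint dynamic above can be sketched in a few lines. This is a toy model under our own simplifying assumptions (fixed mint per period, burns given as a usage series), not a description of any specific protocol:

```python
def simulate_bme(initial_supply, mint_per_period, burns_per_period):
    """Toy burn-and-mint equilibrium.

    Each period, users burn tokens to pay for services while the
    protocol mints a fixed amount, distributed to service providers
    in proportion to the burns that named them. Returns the token
    supply over time.
    """
    supply = initial_supply
    history = [supply]
    for burned in burns_per_period:
        supply += mint_per_period - burned  # net issuance this period
        history.append(supply)
    return history

# Hypothetical: 1,000,000 tokens, 10,000 minted per period, with
# usage (burning) growing from 5,000 to 15,000 tokens per period.
usage = [5_000, 8_000, 10_000, 12_000, 15_000]
print(simulate_bme(1_000_000, 10_000, usage))
# Supply rises while burns < mint, then falls once burns exceed it.
```

The crossover point — where per-period burns exceed the fixed mint — is exactly the "equilibrium" the design is named for: beyond it, growing usage makes the token deflationary.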

2. The interests of investors

The role of investors in these token networks must evolve from simple capital allocator to active network participant. From community building and token engineering to running nodes and actively managing liquidity positions, venture funds active in this area will soon need to operate inside the networks themselves. This also fulfills their fiduciary duty: maximizing the return on their LPs' capital while helping to guide the networks they invest in.

The most active investors will continue to engage with networks from multiple angles throughout the investment period:

  • Staking: once a Proof-of-Stake (PoS) or Delegated Proof-of-Stake (DPoS) network is running on mainnet, holders can stake their tokens to perform profitable work for the network (validating transactions, computation, arbitration, transcoding, or providing security) and be rewarded by the network (such as block rewards) or by users (such as transaction fees). In a DPoS network, operators can offer this work as a service in exchange for a share of the payout, and token holders can delegate/bond their tokens to an operator.
  • Voting: many networks use their token as a governance tool — whether through simple token voting, quadratic voting, or liquid democracy, the token gives holders a voice. Long-term investors will participate in the governance process and steer networks toward their best interests.
  • Curation: in networks built on token-curated registries (TCRs), early investors need to participate actively in the curation process, maintaining the high quality of the registry and engaging knowledgeably with the relevant protocol developments.
  • Running nodes and using the network: since investors may also be users of the network, they can actively build early iterations of its use cases — starting by operating nodes, then participating in the network economy (such as purchasing services or assets for their own data-driven purposes). Such investors bootstrap the ecosystem growing on the network.
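The staking and delegation economics in the list above can be sketched as a toy DPoS payout (all names and numbers are hypothetical illustrations, not any particular network's reward schedule): an operator takes a commission on the block reward, and the remainder is split among delegators in proportion to their bonded stake.

```python
def delegation_rewards(block_reward, stakes, commission):
    """Toy DPoS payout split.

    block_reward -- reward earned by the operator's validator
    stakes       -- dict mapping delegator name -> bonded tokens
    commission   -- operator's fee as a fraction (e.g. 0.10 = 10%)

    Returns (operator_cut, payouts_per_delegator).
    """
    operator_cut = block_reward * commission
    distributable = block_reward - operator_cut
    total_stake = sum(stakes.values())
    payouts = {d: distributable * s / total_stake
               for d, s in stakes.items()}
    return operator_cut, payouts

# Hypothetical validator: 10-token block reward, 10% commission,
# two delegators bonding 300 and 700 tokens respectively.
cut, payouts = delegation_rewards(10.0, {"alice": 300, "bob": 700}, 0.10)
print(cut, payouts)
```

This proportional split is what lets passive token holders share in the network's "profitable work" without running infrastructure themselves, which is exactly the role the text envisions for funds that delegate rather than operate.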

Therefore, to achieve superior investor returns, the logic of investment institutions will also shift toward providing more value to these networks over time.

First, early investment will matter more. While previous technology waves let new business models flourish on top of new platforms, the shift to distributed data networks allows monetization in the underlying protocols themselves. Contrary to previous waves, then, more early value creation can be concentrated in the technical infrastructure layer built by developers.

Second, consumer-facing applications must iterate to adequate usability in the shortest possible time, and the significant technical advances between Web 2.0 and Web 3.0 must be integrated into the context of successful applications. Consumers, in turn, will have to overcome psychological barriers in use cases that did not previously feel natural, and adopt new habits. New products and markets will be supported by new go-to-market strategies and distribution formats — such as financial incentives, token distributions, and distributed app stores.

We believe that, especially over the next 2-3 years, the most interesting projects will focus on the infrastructure layer: developer tools serving teams working on smart contracts and data-management frameworks; the "picks and shovels" that will help businesses and institutional investors transition to Web 3.0; and base protocols for computing, storage, and data privacy.

As a result, investors will focus on sourcing innovative projects from developers: reviewing their GitHub commits, developing and adopting leading indicators, and evaluating a project's fit in the distributed application stack. Amid this ongoing transformation, the biggest future investment opportunities may well be discovered by participating in (and contributing to) hackathons. In an era when software is ever more ubiquitous, developers are the next kings, and investment institutions would do well to spend time with them.

While it is undeniable that Bitcoin — by combining a capped supply with an immutable ledger — has introduced a viable digital store of value as an alternative to government-issued currency and gold, we believe this technology wave will bring many more new opportunities than just an additional form of money. In finance, we believe compliance will be built into any asset-transfer function; credit scores and premiums will adapt dynamically to myriad data sources worldwide; and the very concept of capital markets will be recast in a modular financial language: a set of open, permissionless modules whose datasets can be used in any financial application and attached to any imaginable asset.

There are also opportunities to build industries from scratch — supply-chain management, for example: from inventory tracking and provenance across multiple providers to automated credit financing and auditing. In the automotive sector especially, supporting car-to-car data sharing and vehicle tracking across manufacturers opens up a whole new set of possible operations and interactions. For the first time, tokenization establishes global ownership of digital assets — beginning with simple crypto-collectibles and moving inevitably toward self-sovereign identity and authentication. Peer-to-peer markets for data, software licenses, and work can thrive in their true form — without an Airbnb or Uber taking a cut of every transaction.

By abstracting trust, the intermediaries we see today will gradually disappear. Decentralized networks will not only fundamentally change how we think existing industries operate, but also introduce entirely new business models.

As mentioned earlier regarding the role of investors in this space, we see the opportunity for capital providers to change fundamentally. Active network participation can take the form of staking tokens to validate blocks in exchange for block rewards. Alternatively, it can mean providing the resources a network needs — such as storage, data, or registry curation — in exchange for the fees users pay. With appropriate incentive design, individuals and professional service providers alike will quickly step in to supply services for rewards and fees. In a competitive landscape of decentralized networks, providers will follow participants, and participants will gravitate to networks rich in providers — a classic chicken-and-egg problem.

Moreover, in a network's early days, participation viewed in a vacuum may not be economically viable: relying on transaction fees in a network with no transactions, or supplying data streams at a loss before any buyers exist, are scenarios no rational actor would accept — except an early investor. Venture fund returns are driven primarily by maximizing each investment's chance of exceptional success, so for such funds the potential of these networks outweighs the sunk costs of configuring and seeding them.

However, liquid markets can indeed bring institutional capital to market earlier. At the end of a fund's life cycle, token sales — unlike sales of traditional equity stakes — will be easier to execute without negatively impacting the project. There is no need for a future institution to push for an IPO or acquisition: tokens can be traded with accredited investors on open markets that are fully regulated and may even embed an on-chain compliance framework, while retaining the option of selling to large OTC buyers. There is also the opportunity to make way for capital with different risk/reward profiles, or for strategic players within the network, which may ultimately increase the value of the remaining tokens.

Conclusion

The current movement is not just a generational shift in network architecture; it is a realignment of organizing principles. A new wave of people-centered services will weave itself into our daily lives with unprecedented intimacy between people and the internet, and for people to trust that their machine counterparts will not abuse ever-more-accessible data, we need this layer of privacy-preserving cryptographic architecture and appropriate incentives. At the same time, we believe that building a scalable, secure, privacy-preserving Web 3.0 — starting with technical infrastructure, developer tools, and data-management frameworks — still has a long way to go.

Author: Fabric Ventures

Original link: https://medium.com/fabric-ventures/the-fabric-ventures-investment-thesis-6cd08684b467

Translation: Daxie Think Tank

Disclaimer: Reproduction is strictly prohibited without permission.
