
IOSG Ventures: Exploring the Emerging AI x Web3 Industry Map


Original author: IOSG Ventures

1

At first glance, AI and Web3 appear to be separate technologies, each based on fundamentally different principles and serving different functions. A deeper look, however, reveals that the two technologies have the opportunity to balance each other's trade-offs, and that their unique strengths can complement and enhance each other. Balaji Srinivasan articulated this concept of complementary capabilities brilliantly at the SuperAI conference, inspiring a detailed comparison of how these technologies interact.

IOSG Ventures: Exploring the Emerging AI x Web3 Industry Map

Tokens took a bottom-up approach, emerging from the decentralization efforts of anonymous cypherpunks and evolving over more than a decade through the collaborative efforts of numerous independent entities around the world. In contrast, AI was developed through a top-down approach, dominated by a handful of tech giants. These companies set the pace and dynamics of the industry, and barriers to entry are determined more by resource intensity than technological complexity.

The two technologies also have very different natures. In essence, tokens are deterministic systems that produce unchangeable results, such as the predictability of hash functions or zero-knowledge proofs. This is in stark contrast to the probabilistic and generally unpredictable nature of artificial intelligence.

Similarly, cryptography excels at verification, ensuring the authenticity and security of transactions and establishing trustless processes and systems, while AI focuses on generation, creating rich digital content. However, in the process of creating digital abundance, ensuring the provenance of content and preventing identity theft become challenges.

Fortunately, tokens provide the counterpart to digital abundance: digital scarcity. They offer relatively mature tools that can be extended to AI technology to ensure the authenticity of content sources and avoid identity-theft issues.

A significant advantage of tokens is their ability to attract large amounts of hardware and capital into a coordinated network to serve a specific purpose. This ability is particularly beneficial for artificial intelligence, which consumes a lot of computing power. Mobilizing underutilized resources to provide cheaper computing power can significantly improve the efficiency of artificial intelligence.

By comparing these two technologies, we can not only appreciate their individual contributions, but also see how they work together to forge new paths in technology and economics. Each technology can complement the shortcomings of the other to create a more integrated and innovative future. In this blog post, we aim to explore the emerging AI x Web3 industry landscape, focusing on some emerging verticals at the intersection of these technologies.

IOSG Ventures: Exploring the Emerging AI x Web3 Industry Map

Source: IOSG Ventures

2

2.1 Computing Networks

The industry map begins with computing networks, which attempt to address the constrained GPU supply and to reduce computing costs in different ways. The following are worth focusing on:

  • Non-uniform GPU interoperability: This is a very ambitious endeavor with high technical risk and uncertainty, but if successful it could produce results of enormous scale and impact by making all computing resources interchangeable. Essentially, the idea is to build compilers and other prerequisites so that, on the supply side, any hardware resource can be plugged in, while on the demand side the non-uniformity of all hardware is completely abstracted away, so that a computing request can be routed to any resource in the network (a minimal routing sketch follows this list). If this vision succeeds, it would reduce AI developers' current heavy dependence on CUDA software. That said, the technical risk is high, and many experts are deeply skeptical about the feasibility of this approach.

  • High-performance GPU aggregation: Integrates the world's most popular GPUs into a distributed, permissionless network, without worrying about interoperability issues between non-uniform GPU resources.

  • Commodity consumer GPU aggregation: Aggregates lower-performance GPUs available in consumer devices, which are the most underutilized resources on the supply side. It caters to users who are willing to sacrifice performance and speed for a cheaper, longer training process.
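
To make the abstraction concrete, the sketch below shows a toy scheduler that hides hardware non-uniformity behind a common capability descriptor and routes a request to any qualifying device. The names (Device, ComputeRequest, route) and the pricing model are assumptions for illustration; the hard part in practice, compiling workloads for arbitrary hardware, is not shown.

```python
# Toy scheduler for a heterogeneous compute network (illustrative only).
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Device:
    owner: str
    kind: str             # e.g. "datacenter-gpu", "consumer-gpu"
    memory_gb: float      # usable accelerator memory
    tflops: float         # rough FP16 throughput
    price_per_hour: float

@dataclass
class ComputeRequest:
    min_memory_gb: float
    min_tflops: float
    max_price_per_hour: float

def route(request: ComputeRequest, supply: List[Device]) -> Optional[Device]:
    """Pick the cheapest device whose abstract capabilities satisfy the request.

    The caller never sees vendor-specific details; a compiler/runtime layer is
    assumed to translate the workload for whichever device is returned.
    """
    candidates = [
        d for d in supply
        if d.memory_gb >= request.min_memory_gb
        and d.tflops >= request.min_tflops
        and d.price_per_hour <= request.max_price_per_hour
    ]
    return min(candidates, key=lambda d: d.price_per_hour, default=None)

if __name__ == "__main__":
    supply = [
        Device("alice", "datacenter-gpu", 80, 300, 2.50),
        Device("bob", "consumer-gpu", 24, 80, 0.40),
        Device("carol", "consumer-gpu", 12, 40, 0.15),
    ]
    request = ComputeRequest(min_memory_gb=16, min_tflops=50, max_price_per_hour=1.0)
    print(route(request, supply))  # routes to bob's 24 GB consumer GPU
```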

2.2 Training and Inference

Compute networks are used for two main functions: training and inference. Demand for these networks comes from both Web 2.0 and Web 3.0 projects. In the Web 3.0 space, projects like Bittensor use computing resources for model fine-tuning. On the inference side, Web 3.0 projects emphasize the verifiability of the process. This focus has given rise to verifiable inference as a market vertical, where projects are exploring how to integrate AI inference into smart contracts while maintaining the principle of decentralization.
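
To make the interface shape concrete, here is a minimal sketch of a commit-and-verify flow in a toy setting: an off-chain prover binds a model fingerprint, an input, and an output into a commitment that a contract-style verifier checks before acting. Real verifiable-inference designs rely on zero-knowledge proofs, optimistic fraud proofs, or TEE attestations rather than a bare hash commitment; the names (make_claim, contract_verifies) and the dummy model are hypothetical.

```python
# Simplified commit/verify flow for verifiable inference (illustrative only).
import hashlib
import json

def model_fingerprint(weights: bytes) -> str:
    """Hash of the model artifact that inferences are expected to come from."""
    return hashlib.sha256(weights).hexdigest()

def run_inference(weights: bytes, prompt: str) -> str:
    # Stand-in for the actual model call (hypothetical).
    return f"answer-to:{prompt}"

def make_claim(weights: bytes, prompt: str) -> dict:
    """Off-chain prover: run the model and commit to (model, input, output)."""
    claim = {
        "model": model_fingerprint(weights),
        "input": hashlib.sha256(prompt.encode()).hexdigest(),
        "output": run_inference(weights, prompt),
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    claim["commitment"] = hashlib.sha256(payload).hexdigest()
    return claim

def contract_verifies(claim: dict, expected_model: str) -> bool:
    """On-chain-style check: the output must be bound to the expected model."""
    body = {k: claim[k] for k in ("model", "input", "output")}
    recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return claim["model"] == expected_model and recomputed == claim["commitment"]

if __name__ == "__main__":
    weights = b"dummy-model-weights"
    claim = make_claim(weights, "price of ETH at block 123?")
    print(contract_verifies(claim, model_fingerprint(weights)))  # True
```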

2.3 Intelligent Agent Platforms

Next up are intelligent agent platforms. The map outlines the core problems that startups in this category need to solve:

  • Agent interoperability, discovery, and communication: agents can discover and communicate with each other.

  • Agent cluster building and management: agents can form clusters and manage other agents.

  • Ownership and markets for AI agents: mechanisms for owning and trading AI agents.

These characteristics highlight the importance of flexible, modular systems that can be seamlessly integrated into a variety of blockchain and AI applications. AI agents have the potential to revolutionize the way we interact with the Internet, and we believe agents will leverage infrastructure to support their operations. We envision AI agents relying on infrastructure in the following ways (a combined sketch follows this list):

  • Accessing real-time web data using a distributed crawling network

  • Using DeFi channels for inter-agent payments

  • Requiring economic deposits, not only to penalize misbehavior when it occurs but also to improve agent discoverability (i.e., using the deposit as an economic signal during the discovery process)

  • Using consensus to decide which events should result in slashing

  • Adopting open interoperability standards and agent frameworks to support building composable collectives

  • Evaluating past performance against an immutable data history and selecting the right collective of agents in real time
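
The sketch below combines several of the points above, deposits as a discovery signal, consensus-gated slashing, and selection based on recorded performance, into one toy registry. Every name and parameter (Agent, Registry, the 50% slash, the scoring formula) is a hypothetical illustration; a production system would keep this state on-chain rather than in memory.

```python
# Toy agent registry: deposits, slashing by vote, performance-based discovery.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Agent:
    name: str
    deposit: float                                     # stake posted by the agent
    completed: int = 0                                 # tasks finished successfully
    failed: int = 0                                    # tasks judged faulty
    history: List[str] = field(default_factory=list)   # append-only outcome log

    @property
    def score(self) -> float:
        # Deposit acts as an economic signal; track record weights it up or down.
        total = self.completed + self.failed
        success = self.completed / total if total else 0.5
        return self.deposit * (0.5 + success)

class Registry:
    def __init__(self) -> None:
        self.agents: Dict[str, Agent] = {}

    def register(self, agent: Agent) -> None:
        self.agents[agent.name] = agent

    def discover(self, top_k: int = 3) -> List[Agent]:
        """Rank agents for a new task by deposit-weighted track record."""
        return sorted(self.agents.values(), key=lambda a: a.score, reverse=True)[:top_k]

    def report(self, name: str, ok: bool, votes_to_slash: int, quorum: int) -> None:
        """Record an outcome; slash the deposit if a consensus vote says so."""
        agent = self.agents[name]
        agent.history.append("ok" if ok else "fault")
        if ok:
            agent.completed += 1
        else:
            agent.failed += 1
            if votes_to_slash >= quorum:
                agent.deposit *= 0.5   # simplistic 50% slash

if __name__ == "__main__":
    registry = Registry()
    registry.register(Agent("scraper-agent", deposit=100))
    registry.register(Agent("trader-agent", deposit=40))
    registry.report("trader-agent", ok=False, votes_to_slash=5, quorum=3)
    print([a.name for a in registry.discover()])  # scraper-agent ranks first
```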

IOSG Ventures: Exploring the Emerging AI x Web3 Industry Map

Source: IOSG Ventures

2.4 Data Layer

In the convergence of AI x Web3, data is a core component. Data is a strategic asset in the AI competition and, alongside computing resources, a key input. However, this category is often overlooked because most of the industry's attention is focused on the compute layer. In fact, primitives offer many interesting value directions in the data-acquisition process, mainly along the following two high-level directions:

  • Accessing public internet data

  • Accessing protected data

Accessing public Internet data: This direction aims to build distributed crawler networks that can crawl the entire Internet within days, obtaining massive datasets or accessing very specific Internet data in real time. However, crawling large datasets from the Internet places very high demands on the network, and at least a few hundred nodes are needed before any meaningful work can begin. Fortunately, Grass, a distributed crawler-node network, already has more than 2 million nodes actively sharing Internet bandwidth with the network, with the goal of crawling the entire Internet. This shows the great potential of economic incentives in attracting valuable resources.
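
As a rough illustration of how such a network might split work across many bandwidth-sharing nodes, the sketch below hash-partitions a crawl frontier so that every URL is deterministically owned by one node. The node count and assignment scheme are assumptions for illustration, not a description of how Grass actually schedules crawls.

```python
# Deterministic sharding of a crawl frontier across participating nodes.
import hashlib
from collections import defaultdict

def assign_node(url: str, num_nodes: int) -> int:
    """Map a URL to one of the participating nodes via a stable hash."""
    digest = hashlib.sha256(url.encode()).hexdigest()
    return int(digest, 16) % num_nodes

def shard_frontier(urls, num_nodes):
    """Group a crawl frontier into per-node work queues."""
    queues = defaultdict(list)
    for url in urls:
        queues[assign_node(url, num_nodes)].append(url)
    return queues

if __name__ == "__main__":
    frontier = [f"https://example.com/page/{i}" for i in range(10)]
    for node, work in sorted(shard_frontier(frontier, num_nodes=4).items()):
        print(node, work)
```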

While Grass provides a level playing field for public data, a challenge remains in leveraging the underlying data, namely access to proprietary datasets. Specifically, a large amount of data is still kept in a privacy-preserving manner because of its sensitive nature. Many startups are leveraging cryptographic tools that enable AI developers to build and fine-tune large language models on proprietary datasets while keeping sensitive information private.

Technologies such as federated learning, differential privacy, trusted execution environments, fully homomorphic encryption, and multi-party computation provide different levels of privacy protection with different trade-offs. Bagel's research article (https://blog.bagel.net/p/with-great-data-comes-great-responsibility-d67) provides an excellent overview of these technologies. They not only protect data privacy during machine learning, but also enable comprehensive privacy-preserving AI solutions at the computation level.
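
As a toy illustration of one of these techniques, differential privacy, the sketch below adds Laplace noise to a simple counting query so that no single record can be inferred from the released number. The epsilon values and data are made up, and production approaches (for example DP-SGD or federated analytics) are considerably more involved.

```python
# Differentially private counting query using Laplace noise (illustrative only).
import random

def dp_count(records, predicate, epsilon=1.0):
    """Return a count with Laplace noise calibrated to sensitivity 1."""
    true_count = sum(1 for r in records if predicate(r))
    scale = 1.0 / epsilon                 # a counting query has sensitivity 1
    # Laplace(0, scale) sampled as the difference of two exponentials.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_count + noise

if __name__ == "__main__":
    patients = [{"age": a} for a in (23, 34, 45, 56, 67, 78)]
    # Smaller epsilon -> more noise -> stronger privacy, lower accuracy.
    for eps in (0.1, 1.0, 10.0):
        print(eps, round(dp_count(patients, lambda r: r["age"] > 50, eps), 2))
```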

2.5 Data and Model Provenance

Data and model provenance technologies aim to establish processes that assure users they are interacting with the expected models and data, and they provide guarantees of authenticity and origin. Take watermarking as an example. Watermarking is one model-provenance technique: it embeds a signature directly into the machine learning algorithm, more specifically into the model weights, so that at retrieval time it can be verified that an inference came from the expected model.
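
A toy version of such a weight watermark, embedding key-derived bits as tiny perturbations and checking them at retrieval time, is sketched below. The grid size, perturbation magnitude, and 32-bit signature are arbitrary assumptions; published schemes embed watermarks far more robustly, for example through training-time regularizers.

```python
# Toy weight watermark: embed key-derived bits, verify provenance later.
import hashlib
import random

DELTA = 1e-4   # perturbation magnitude, assumed to be below the noise floor

def _positions_and_bits(key: str, num_weights: int, num_bits: int = 32):
    digest = hashlib.sha256(key.encode()).digest()
    rng = random.Random(digest)
    positions = rng.sample(range(num_weights), num_bits)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(num_bits)]
    return positions, bits

def embed(weights, key):
    """Return a copy of the weights carrying the key-derived watermark."""
    marked = list(weights)
    for pos, bit in zip(*_positions_and_bits(key, len(weights))):
        base = round(marked[pos], 3)                 # snap to a coarse grid
        marked[pos] = base + (DELTA if bit else -DELTA)
    return marked

def verify(weights, key):
    """Check that every watermark bit is still present in the weights."""
    positions, bits = _positions_and_bits(key, len(weights))
    for pos, bit in zip(positions, bits):
        residue = weights[pos] - round(weights[pos], 3)
        if (residue > 0) != bool(bit):
            return False
    return True

if __name__ == "__main__":
    random.seed(0)
    weights = [random.uniform(-1, 1) for _ in range(1000)]
    marked = embed(weights, key="demo-provenance-key")
    print(verify(marked, "demo-provenance-key"))   # True
    print(verify(weights, "demo-provenance-key"))  # very likely False
```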

2.6 Applications

In terms of applications, the design possibilities are endless. In the industry landscape above, we have listed some particularly exciting developments as AI technologies are applied in the field of Web 3.0. Since most of these use cases are self-descriptive, we will not comment on them here. However, it is worth noting that the intersection of AI and Web 3.0 has the potential to reshape many verticals in the field, as these new primitives provide developers with more freedom to create innovative use cases and optimize existing use cases.

Summary

The AI x Web3 convergence brings a vision full of innovation and potential. By leveraging the unique strengths of each technology, we can solve a variety of challenges and open up new technological paths. As we explore this emerging industry, the synergy between AI x Web3 can drive progress and reshape our future digital experiences and the way we interact on the web.

The convergence of digital scarcity and digital abundance, the mobilization of underutilized resources to achieve computational efficiency, and the establishment of secure, privacy-preserving data practices will define the next era of technological evolution.

However, we must recognize that the industry is still in its infancy, and the current landscape may become obsolete in a short period of time. The rapid pace of innovation means that today's cutting-edge solutions may soon be replaced by new breakthroughs. Nevertheless, the foundational concepts explored here, such as computing networks, agent platforms, and data protocols, highlight the huge possibilities of the convergence of AI and Web 3.0.
