A nearly 10,000-word research report! From first principles: how SCP and AO reshape the on-chain world
1. From Bitcoin to Ethereum, what is the best path to break through the limits of throughput and application scenarios?
2. Starting from first principles, how do we cut through the memes in the market and find the most essential, basic needs of blockchain?
3. What magic lies in the disruptive design of SCP and AO (Actor Oriented), which separate storage from computing, that could fully unleash Web3?
4. Will the results of a deterministic program operating on immutable data be unique and reliable?
5. Within such a narrative, why can SCP and AO (Actor Oriented) become all-rounders with unlimited performance, trustworthy data, and composability?
Introduction
[Data source: BTC price]
It has been more than 15 years since the birth of blockchain in 2009. As a paradigm revolution in digital technology, it records digital value and network value, making cryptocurrency an innovation within a new capital paradigm.
As the first-born of the industry, Bitcoin is expected to become a strategic reserve asset. At the 2024 Bitcoin Conference:
Trump promised that if he returned to the White House, he would ensure the government retains 100% of its Bitcoin holdings and would list Bitcoin as a strategic reserve asset for the United States.
After Trump won the election, Bitcoin rose 150% and reached a peak of $107,287.
Trump's victory is clearly beneficial to the crypto industry, as he has repeatedly expressed strong support for cryptocurrencies.
However, in the short term, the high sensitivity of cryptocurrencies to election results may bring short-lived volatility peaks to the market. Will this strong upward momentum be sustainable? The author believes that only by eliminating uncertainty and improving the scalability of blockchain can a new phase of growth arrive.
The haze behind the prosperity of Web3 after the US election
[Data source: DefiLlama]
Out of the spotlight, Ethereum, the second-largest asset in the digital currency market, has seen its TVL remain sluggish since reaching its historical peak in 2021.
In the third quarter of 2024, Ethereum's decentralized finance (DeFi) revenue fell to $261 million, the lowest level since the fourth quarter of 2020.
At first glance, there may appear to be occasional spikes, but the overall trend suggests a slowdown in overall DeFi activity on the Ethereum network.
In addition, the market has seen the emergence of alternative public chains dedicated to trading scenarios, such as the recently popular Hyperliquid, an order-book trading chain. Its metrics have grown rapidly: it entered the top 50 by market value within two weeks, and its estimated annualized revenue trails only Ethereum, Solana, and Tron among all public chains. This indirectly reflects the weakness of traditional AMM-based DeFi and of Ethereum itself.
[Data source: Compound trading volume]
[Data source: Uniswap transaction volume]
DeFi was once the core highlight of the Ethereum ecosystem, but its revenue has fallen sharply due to reduced transaction fees and user activity.
In this regard, the author tries to identify the cause of the difficulties currently faced by Ethereum, and by blockchain as a whole, and to ask how the deadlock can be broken.
Coincidentally, with the success of its fifth test flight, SpaceX has become a rising star in commercial aerospace. Looking back at SpaceX's development, it got to where it is today thanks to a key methodology: first principles. (Note: the concept of first principles was first proposed by the ancient Greek philosopher Aristotle some 2,300 years ago. He described it as follows: in the exploration of every system there is a first principle, a most basic proposition or assumption that cannot be omitted, deleted, or violated.)
Then, let us also use the first principles method to peel off the fog layer by layer and explore the most essential atoms of the blockchain industry. From a fundamental perspective, we can re-examine the difficulties and opportunities currently facing this industry.
Web3’s “cloud service”: a step backward or the future?
When the concept of AO (Actor Oriented) was introduced, it attracted widespread attention. In a landscape where many EVM-series public chains have become homogeneous, AO, as a disruptive architectural design, has shown unique appeal.
This is not just a theoretical idea, but a team is putting it into practice.
As mentioned above, the greatest value of blockchain is that it records digital value. From this perspective, it is an open and transparent global public ledger. Therefore, based on this essence, the first principle of blockchain can be considered as a kind of storage.
AO is implemented on top of the Storage-based Consensus Paradigm (SCP). As long as storage is immutable, no matter where the computation is performed, the result is guaranteed to reach consensus. Thus the AO global computer is born, realizing the interconnection and collaboration of large-scale parallel computers.
Looking back at 2024, one of the most eye-catching events in the Web3 field was the outbreak of the inscription ecosystem, which can be seen as an early practice of the storage-compute separation model. For example, the etching technique used by the Runes protocol allows a small amount of data to be embedded in Bitcoin transactions. Although this data does not affect the main function of the transaction, it constitutes clearly verifiable, unspendable output carried as additional information.
In the early days, some technology observers questioned the security of Bitcoin inscriptions, fearing they could become a potential entry point for cyberattacks.
However, for two years inscriptions have stored data entirely on-chain, and no blockchain fork has occurred as a result. This stability once again proves that as long as the stored data is not tampered with, data consistency and security can be guaranteed no matter where the computation is performed.
Perhaps you will find that this is almost the same as traditional cloud services. For example:
In terms of computing resource management, in the AO architecture an Actor is an independent computing entity and each computing unit can run its own environment. Isn't this just like the microservices and Docker of traditional cloud services? Similarly, traditional cloud services can rely on S3 or NFS for storage, while AO relies on Arweave.
However, it is not accurate to simply dismiss AO as a rehash of an old idea. Although AO borrows some design concepts from traditional cloud services, its core lies in combining decentralized storage with distributed computing. As a decentralized storage network, Arweave is fundamentally different from traditional centralized storage, and this decentralized character gives Web3 data greater security and censorship resistance.
More importantly, the combination of AO and Arweave is not a simple technology stack, but creates a new paradigm. This paradigm combines the performance advantages of distributed computing with the credibility of decentralized storage, providing a solid foundation for the innovation and development of Web3 applications. Specifically, this combination is mainly reflected in the following two aspects:
1. While achieving a fully decentralized design in the storage system, it relies on a distributed architecture to ensure performance.
2. This combination not only solves some core challenges in the Web3 field (such as storage security and openness), but also provides a technical foundation for possible unlimited innovations and combinations in the future.
The following article will explore in depth the concept and architectural design of AO, and analyze how it copes with the difficulties faced by existing public chains such as Ethereum, ultimately bringing new development opportunities to Web3.
Looking at the current dilemma of Web3 from the perspective of atoms
Since Ethereum emerged with smart contracts, it has been the undisputed king.
Some people may ask: isn't there Bitcoin? It is worth noting, however, that Bitcoin was created as a substitute for traditional currency, intended to be a decentralized digital cash system. Ethereum is not only a cryptocurrency; it also has the ability to create and run smart contracts and decentralized applications (DApps).
In general, Bitcoin is a digital alternative to traditional currencies with a higher price, but that does not mean it has higher value. Ethereum is more like an open-source platform; in terms of richness, it carries expected value and better represents the open world of Web3 as currently conceived.
Therefore, since 2017, many projects have tried to challenge Ethereum, but only a few have survived to the end. However, Ethereum’s performance has been criticized, so the growth of Layer 2 has followed. Behind the seemingly prosperous Layer 2 is a helpless struggle in a difficult situation. As the competition intensifies, a series of problems have gradually been exposed, becoming a serious shackle for the development of Web3:
Performance is limited, and user experience is poor
[Data source: DeFiLlama]
[Data source: L2 BEAT]
Recently, more and more people believe that Ethereum’s expansion plan Layer 2 is a failure.
Initially, L2 was an important continuation of Ethereum culture within Ethereum's scaling roadmap. Many people supported the L2 route, hoping that lower gas fees and higher throughput would drive growth in users and transactions. However, the expected influx of users did not arrive even as gas fees fell.
So is L2 to blame for the failure of the scaling plan? Clearly, L2 is just a scapegoat. It does bear part of the responsibility, but the main responsibility lies with Ethereum itself. More concretely, this outcome is the inevitable result of problems in the underlying design of most current Web3 chains.
Let us explain this from the atomic perspective. L2 handles computation, while storage is undertaken by Ethereum, which, in order to obtain sufficient security, must also be responsible for storing the data and reaching consensus on it.
At the same time, Ethereum itself is designed to avoid infinite loops during execution, which could halt the entire platform, so any given smart contract execution is limited to a finite number of computational steps.
This means that while L2 is designed with the expectation of unlimited performance, the ceiling of the main chain in practice shackles it.
The weakest-link effect dictates that L2 has a ceiling.
For detailed mechanisms, readers can read further: From Traditional DeFi to AgentFi: Exploring the Future of Decentralized Finance .
Limited use cases make it difficult to attract users effectively
What Ethereum is most proud of is its prosperous application-layer ecosystem, home to all kinds of DApps.
But is there really a scene of flourishing flowers behind the prosperity?
The author believes this is clearly not the case. Behind Ethereum's prosperous application ecosystem lies a monotonous landscape in which financialization is heavy and non-financial applications are far from mature.
Next, let’s take a look at the more prosperous application sectors on Ethereum.
First of all, although concepts such as NFT, DeFi, GameFi and SocialFi are meaningful explorations of financial innovation, such products are not currently suitable for the general public. The reason Web2 developed so rapidly is that its functions are close enough to people's daily lives.
Compared with financial products and services, ordinary users are more concerned about messaging, social, video, e-commerce and other functions.
Secondly, from a competitive perspective, credit loans in traditional finance are a very common and widespread product, but in the DeFi field, there are still relatively few products of this type. The main reason is the lack of an effective on-chain credit system.
The construction of a credit system needs to allow users to truly own their own online profiles and social graphs, and be able to span different applications.
Only when these decentralized information can be stored and transmitted at zero cost, is it possible to build a powerful personal information graph of Web3 and a set of Web3 applications based on the credit system.
At this point, we have once again clarified a key issue: L2's failure to attract enough users is not its own fault. The existence of L2 was never the core driving force. The real way to break through the shackles of Web3's dilemma is to innovate application scenarios that attract users.
But the current situation is like a highway during the holidays: constrained by transaction performance, no matter how many innovative ideas there are, they are hard to roll out.
The essence of blockchain itself is storage. When storage and computing are coupled, it becomes less atomic. Under this less-essential design, there must be a critical point in performance.
Some viewpoints define the essence of blockchain as a transaction platform, a currency system, or an emphasis on transparency and anonymity. However, this view ignores blockchain's fundamental character as a data structure and its broader application potential. Blockchain is not just for financial transactions; its technical architecture allows it to be applied across many industries, such as supply chain management, medical records, and even copyright management. Therefore, the essence of blockchain lies in its capability as a storage system: not only can it store data securely, it also guarantees the integrity and transparency of data through a distributed consensus mechanism. Once a data block is added to the chain, it is almost impossible to change or delete it.
Atomized infrastructure: AO makes unlimited performance possible
[Data source: L2 TPS]
The basic architecture of blockchain faces an obvious bottleneck: the limitation of block space. Like a fixed-size ledger, every transaction and data needs to be recorded in a block. Ethereum and other blockchains are subject to block size limits, causing transactions to compete with each other for space. This raises a key question: Can we break through this limit? Does block space have to be limited? Is there a way to make the system truly infinitely scalable?
Although Ethereum's L2 route has achieved some success in scaling performance, it can only be called half a success. L2 has increased throughput by several orders of magnitude, which may be enough to support a single project during a transaction peak, but for most L2s that inherit storage and consensus security from the main chain, this improvement is far from enough.
It is worth noting that the TPS of L2 cannot be infinitely improved, mainly due to the following factors: data availability, settlement speed, verification cost, network bandwidth, and contract complexity. Although Rollup optimizes the storage and computing requirements of L1 through compression and verification, data still needs to be submitted and verified on L1, and is therefore limited by the bandwidth and block time of L1. At the same time, the computational overhead of generating zero-knowledge proofs, node performance bottlenecks, and the execution requirements of complex contracts also limit the upper limit of L2 expansion.
[Data source: suiscan TPS]
Currently, the real challenge for Web3 lies in the lack of throughput and applications, which will make it difficult to attract new users, and Web3 may face the risk of losing its influence.
In short, improving throughput is the key to whether Web3 has a bright future. Realizing an infinitely scalable, high-throughput network is the vision of Web3. For example, Sui uses deterministic parallel processing to pre-arrange transactions and avoid conflicts, thereby improving the predictability and scalability of the system. This enables Sui to process more than 10,000 transactions per second (TPS). At the same time, Sui's architecture allows network throughput to increase by adding more validator nodes, theoretically achieving unlimited expansion. The Narwhal and Tusk protocols are used to reduce latency, enabling the system to process transactions efficiently in parallel and thereby overcoming the expansion bottleneck of traditional Layer 2 solutions.
The AO we are discussing is also based on this idea. Although they have different focuses, they are both building a scalable storage system.
Web3 requires a new infrastructure built on first principles with storage at its core. Just as Elon Musk rethought rocket launches and electric cars, fundamentally redesigning these complex technologies through first principles and thereby disrupting their industries, AO's design is similar: it decouples computing from storage, abandons the framework of traditional blockchains, builds a future-oriented Web3 storage foundation, and pushes Web3 toward the vision of decentralized cloud services.
The Storage-based Consensus Paradigm (SCP)
Before introducing AO, we need to talk about the relatively new SCP design paradigm.
SCP may be unfamiliar to most people, but everyone is probably familiar with Bitcoin inscriptions. Strictly speaking, the design of inscriptions is, to some extent, a design that treats storage as the atomic unit, albeit with some deviations.
Interestingly, Vitalik once expressed the intention for Ethereum to serve as Web3's "tape", and the SCP paradigm is exactly this line of thought.
In the Ethereum model, calculations are performed by full nodes, then stored globally and made available for query. This leads to a problem: although Ethereum is a world-class computer, it is a single-threaded program, executing all steps one by one, which is obviously inefficient. It is also excellent soil for MEV. After all, transaction signatures enter the Ethereum mempool and are publicly disseminated, then ordered and packed into blocks by miners. Although this process may take only 12 seconds, within that short window the transaction content is exposed to countless hunters, who can quickly intercept, simulate, and even reverse-engineer likely trading strategies. For more information about MEV, please read: MEV landscape one year after the Ethereum merger
In contrast, the idea of SCP is to separate computing and storage. Perhaps this sounds a bit abstract to you, but it’s okay. Let’s take the Web2 scenario as an example.
When people chat and shop on Web2, traffic often spikes suddenly at certain times, and a single computer's hardware can hardly support such a load. For this reason, engineers proposed distributed computing, which assigns the computation to multiple machines and finally synchronizes and stores their respective computed states. In this way, the system can scale elastically to cope with traffic at different times.
SCP can be seen as a similar design: it distributes computation across individual computing nodes. The difference is that SCP's storage is not a database such as MySQL or PostgreSQL, but the main network of a blockchain.
In short, SCP uses the blockchain to store state results and other data, thereby ensuring the credibility of the stored data and realizing a high-performance network layered on top of the underlying blockchain.
More specifically, in SCP the blockchain is used only for data storage, while off-chain clients/servers perform all computation and hold all generated states. This architectural design significantly improves performance and scalability, but can we truly guarantee the integrity and security of data in an architecture where computing and storage are separated?
In simple terms, blockchain is mainly used to store data, while the actual computing work is done by servers off-chain. This new system design has an important feature: it no longer uses the complex node consensus mechanism of traditional blockchain, but instead carries out all consensus processes off-chain.
What are the benefits of doing this? Because there is no need for a complex consensus process, each server only needs to focus on processing its own computing tasks. This allows the system to handle an almost unlimited number of transactions and has lower operating costs.
Although this design is somewhat similar to the currently popular Rollup expansion solution, its goal is bigger: it is not only used to solve the blockchain expansion problem, but also to provide a new path for the transition from Web2 to Web3.
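To make this idea concrete, below is a minimal, hypothetical sketch in Python of the SCP pattern described above: the chain acts purely as an immutable message log, and any off-chain server can deterministically replay that log to derive the same state. The class and function names here are illustrative assumptions, not a real SCP or AO API.

```python
import hashlib
import json

class ImmutableLog:
    """Stands in for on-chain storage: messages can be appended, never edited."""
    def __init__(self):
        self._entries = []

    def append(self, message: dict) -> str:
        data = json.dumps(message, sort_keys=True).encode()
        entry_id = hashlib.sha256(data).hexdigest()
        self._entries.append((entry_id, message))
        return entry_id

    def read_all(self):
        return list(self._entries)

def replay_state(log: ImmutableLog) -> dict:
    """Off-chain computation: any client can rebuild the same state from the
    same log, so the result inherits the log's consensus."""
    balances = {}
    for _, msg in log.read_all():
        if msg["op"] == "mint":
            balances[msg["to"]] = balances.get(msg["to"], 0) + msg["amount"]
        elif msg["op"] == "transfer":
            if balances.get(msg["from"], 0) >= msg["amount"]:
                balances[msg["from"]] -= msg["amount"]
                balances[msg["to"]] = balances.get(msg["to"], 0) + msg["amount"]
    return balances

if __name__ == "__main__":
    log = ImmutableLog()
    log.append({"op": "mint", "to": "alice", "amount": 100})
    log.append({"op": "transfer", "from": "alice", "to": "bob", "amount": 40})
    # Two independent servers replaying the same log reach the same state.
    assert replay_state(log) == replay_state(log) == {"alice": 60, "bob": 40}
    print(replay_state(log))
```

The point of the sketch is simply that when the log is immutable and the program is deterministic, consensus over the data implies consensus over the derived state, so the computation itself can live anywhere.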
Having said so much, what are the advantages of SCP? SCP decouples computing and storage. This design not only improves the flexibility and composability of the system, but also lowers the development threshold and effectively solves the performance limitations of traditional blockchains while ensuring the credibility of data. Such innovations make SCP an efficient and scalable infrastructure that empowers the decentralized ecosystem of the future.
1. Composability : SCP places calculations off-chain, which does not pollute the essence of the blockchain and allows the blockchain to maintain its atomic properties. At the same time, the calculations are off-chain, and the blockchain only bears the functional properties of storage, which means that any smart contract can be executed, and the migration of applications based on SCP becomes extremely simple, which is very important.
2. Low development barriers : Off-chain computing means that developers can use any language for development, whether it is C++, Python or Rust, without having to use EVM specifically to write in Solidity. The only cost for programmers may be the cost of the API for interacting with the chain.
3. No performance limit : Off-chain computing aligns computing power directly with traditional applications; the upper limit of performance depends on the hardware of the computing servers. Elastic scaling of traditional computing resources is a very mature technology, so, setting aside the cost of machines, computing power is effectively unlimited.
4. Trusted data : Since the basic function of storage is undertaken by the blockchain, all data is immutable and traceable. Any node can pull data and recalculate if it doubts the status result. Therefore, the blockchain gives the data a trustworthy feature.
Bitcoin proposed the PoW solution to the Byzantine Generals Problem. This was Satoshi Nakamoto's way of breaking with conventional thinking in the environment of the time, and it led to the success of Bitcoin.
Similarly, when faced with smart contract computation, we start from first principles. This may seem to go against common sense, but when we boldly delegate the computing function and return the blockchain to its essence, we suddenly find that while storage consensus is satisfied, the data remains open source and verifiably auditable, and performance matches that of Web2. This is SCP.
The combination of SCP and AO: freed from the shackles
After saying so much, AO is finally here.
First of all, AO's design adopts a pattern called the Actor Model, famously used in the Erlang programming language.
At the same time, AO's architecture and technology follow the SCP paradigm, separating the computing layer from the storage layer: the storage layer is permanently decentralized, while the computing layer keeps the traditional computing model.
AO's computing resources are similar to those of traditional computing models, but it adds a permanent storage layer to make the computing process traceable and decentralized. A small illustrative sketch of the Actor pattern follows.
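Below is a toy Python sketch of the Actor pattern AO borrows: each actor owns its state, exposes only a mailbox, and processes messages one at a time, so no locks on shared memory are needed. This is illustrative only, under the stated assumptions, and is not AO's actual process implementation.

```python
from queue import Queue

class Actor:
    """A self-contained computing entity: private state plus a mailbox."""
    def __init__(self, name: str):
        self.name = name
        self.state = {"counter": 0}
        self.mailbox = Queue()

    def send(self, message: dict):
        self.mailbox.put(message)  # the only way to interact with an actor

    def run(self):
        while not self.mailbox.empty():
            msg = self.mailbox.get()
            if msg["type"] == "increment":
                self.state["counter"] += msg["by"]  # state touched only by its owner

if __name__ == "__main__":
    a = Actor("process-1")
    b = Actor("process-2")
    a.send({"type": "increment", "by": 3})
    b.send({"type": "increment", "by": 5})
    a.run()
    b.run()  # independent actors, no shared state, no lock contention
    print(a.state, b.state)  # {'counter': 3} {'counter': 5}
```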
At this point, you may wonder, which main chain is the storage layer used by AO?
Obviously, the main chain used as the storage layer cannot be Bitcoin or Ethereum. The reasons have been discussed above, and readers can easily work them out. The data storage and final verifiability of AO's computations are handled by Arweave.
So among so many decentralized storage tracks, why choose Arweave?
The choice of Arweave as the storage layer rests mainly on the following consideration: Arweave is a decentralized network focused on permanent data storage. Its positioning is like a global hard drive that never loses data, which distinguishes it from Bitcoin's global ledger and Ethereum's global computer.
For more technical details about Arweave, please refer to: Understanding Arweave: Critical Infrastructure of Web3
Next, let's focus on the principles and technology of AO and see how AO achieves unlimited computation.
[Data source: How ao Messenger works | Manual]
The core of AO is to build a computing layer that is infinitely scalable and has no environmental dependence. The various nodes of AO collaborate based on protocols and communication mechanisms, so that each node can provide the best service and avoid competition consumption.
First, let's take a look at the basic architecture of AO. AO is composed of two basic units, processes and messages, together with scheduler units (SU), compute units (CU), and messenger units (MU); a conceptual sketch of how they interact follows the list below:
- Process: the computing unit of a node in the network, used for data computation and message processing. For example, each contract can be a process.
- Message: processes interact with one another through messages. Each message is ANS-104 standard data, and all of AO must comply with this standard.
- Scheduler Unit (SU): responsible for numbering a process's messages so they can be ordered, and for uploading the messages to Arweave.
- Compute Unit (CU): the state node in an AO process, responsible for executing computing tasks and returning the results and signatures to the SU, ensuring that results are correct and verifiable.
- Messenger Unit (MU): the routing component within a node, responsible for delivering users' messages to the SU and performing integrity verification on signed data.
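Here is a highly simplified Python sketch of how a message might flow through these units (MU to SU to Arweave to CU), based only on the roles described above. The class names, function signatures, and data shapes are hypothetical stand-ins, not the real AO protocol.

```python
import hashlib
import json

class ArweaveStub:
    """Stands in for permanent storage: every numbered message is kept forever."""
    def __init__(self):
        self.records = []

    def upload(self, record: dict):
        self.records.append(record)

class SchedulerUnit:
    def __init__(self, storage: ArweaveStub):
        self.storage = storage
        self.next_seq = 0

    def schedule(self, message: dict) -> dict:
        numbered = {"seq": self.next_seq, **message}  # assign a global order
        self.next_seq += 1
        self.storage.upload(numbered)                 # persist before computing
        return numbered

class ComputeUnit:
    def execute(self, numbered_msg: dict) -> dict:
        # Deterministic toy "computation"; a real CU would run the process code.
        result = {"seq": numbered_msg["seq"], "output": numbered_msg["value"] * 2}
        digest = hashlib.sha256(json.dumps(result, sort_keys=True).encode()).hexdigest()
        return {**result, "signature": digest}        # attestation returned to the SU

class MessengerUnit:
    def __init__(self, su: SchedulerUnit, cu: ComputeUnit):
        self.su, self.cu = su, cu

    def relay(self, user_message: dict) -> dict:
        numbered = self.su.schedule(user_message)     # route to the scheduler
        return self.cu.execute(numbered)              # hand off for computation

if __name__ == "__main__":
    arweave = ArweaveStub()
    mu = MessengerUnit(SchedulerUnit(arweave), ComputeUnit())
    print(mu.relay({"process": "counter", "value": 21}))
    print(arweave.records)  # the permanently stored log anyone can replay later
```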
It is worth noting that AO has no shared state, only holographic state. AO's consensus is produced through game theory: since the state generated by each calculation is uploaded to Arweave, data verifiability is guaranteed. When users question a piece of data, they can ask one or more nodes to recompute it from the data on Arweave; if the settlement results are inconsistent, the dishonest nodes are penalized.
Innovations in the AO Architecture: Storage and Holographic State
The innovation of the AO architecture lies in its data storage and verification mechanism, which replaces the redundant computation and limited block space in traditional blockchains by leveraging decentralized storage (Arweave) and holographic states.
1. Holographic state : In the AO architecture, the holographic state generated by each calculation will be uploaded to the decentralized storage network (Arweave). This holographic state is not just a simple record of transaction data, it contains the complete state and related data of each calculation. This means that every calculation and result will be permanently recorded and can be verified at any time. As a data snapshot, the holographic state provides a distributed and decentralized data storage solution for the entire network.
2. Storage Verification : In this mode, data verification no longer relies on each node to repeatedly calculate all transactions, but rather confirms the validity of transactions by storing and comparing data uploaded to Arweave. When the calculation results generated by a node do not match the data stored on Arweave, the user or other nodes can initiate a verification request. At this point, the network will recalculate the data and check the storage records in Arweave. If the calculation results are inconsistent, the node will be punished to ensure the integrity of the network.
3. Breaking through the block space limitation : The block space of traditional blockchain is subject to storage limitations, and each block can only contain a limited number of transactions. In the AO architecture, data is no longer stored directly in the block, but uploaded to a decentralized storage network (such as Arweave). This means that the storage and verification of the blockchain network no longer depends on the size of the block space, but is shared and expanded through decentralized storage. The capacity of the blockchain system is therefore no longer directly limited by the block size.
The block space limit of blockchain is not unbreakable. The AO architecture changes the way data is stored and verified in traditional blockchains by relying on decentralized storage and holographic state, thus making it possible to achieve unlimited expansion.
Does consensus necessarily rely on redundant computation?
Not necessarily. Consensus mechanisms do not necessarily rely on redundant computations, and can be implemented in a variety of ways. Solutions that rely on storage rather than redundant computations are also feasible in some scenarios, especially when the integrity and consistency of data can be guaranteed by storage verification.
In AO's architecture, storage becomes the replacement for redundant computation. By uploading calculation results to a decentralized storage network (in this case Arweave), the system ensures that the data is immutable, and through holographic uploads of state, any node can check the calculation results at any time to ensure data consistency and correctness. This method relies on the reliability of data storage rather than on every node repeating the same calculations.
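A conceptual Python sketch of this "consensus by storage" idea: instead of every node re-executing every transaction up front, a challenger replays the stored message log only when a result is questioned, and mismatches are penalized. The replay logic and slashing message are purely illustrative assumptions.

```python
import hashlib
import json

def state_digest(state: dict) -> str:
    return hashlib.sha256(json.dumps(state, sort_keys=True).encode()).hexdigest()

def replay(messages: list) -> dict:
    """Deterministically rebuild state from the permanently stored messages."""
    total = 0
    for msg in messages:
        total += msg["value"]
    return {"total": total}

def challenge(stored_messages: list, claimed_digest: str) -> str:
    """Any node can re-run the computation and compare digests."""
    recomputed = state_digest(replay(stored_messages))
    if recomputed == claimed_digest:
        return "result verified"
    return "mismatch: penalize the dishonest node"

if __name__ == "__main__":
    log = [{"value": 10}, {"value": 5}]
    honest = state_digest(replay(log))
    dishonest = state_digest({"total": 999})
    print(challenge(log, honest))     # result verified
    print(challenge(log, dishonest))  # mismatch: penalize the dishonest node
```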
Let's look at the differences between AO and ETH through a table:
It is not difficult to find that the core characteristics of AO can be summarized into two:
1. Massively parallel computing: Supports countless processes running in parallel, significantly improving computing power.
2. Minimize trust dependence: There is no need to trust any single node, and all calculation results can be reproduced and traced infinitely.
How does AO break the impasse: the dilemma of public chains led by Ethereum?
Regarding the two major difficulties faced by Ethereum, performance constraints and insufficient applications, I believe this is exactly where AO’s strengths lie, for the following reasons:
1. AO is designed on the SCP paradigm, with computing and storage separated, so in terms of performance it is no longer bound to Ethereum's single-process, serial computation. AO can flexibly expand computing resources on demand, and the holographic storage of message logs on Arweave allows AO to guarantee consensus by reproducing calculation results. From a security perspective, it is therefore not inferior to Ethereum or Bitcoin.
2. The parallel computing architecture based on message passing frees AO processes from competing for locks. In Web2 development, it is well known that a high-performance service tries to avoid lock contention, because locks are costly for an efficient service. AO processes likewise avoid lock contention through message passing, which lets their scalability reach any scale.
3. AO's modular architecture. AO's modularity is reflected in the separation of CU, SU, and MU, which allows AO to use any virtual machine, sequencer, and so on. This makes migrating and developing DApps from different chains extremely convenient and cheap. Combined with Arweave's efficient storage capabilities, DApps built on AO can support richer gameplay; for example, character graphs can at least be implemented easily on AO.
4. The support of modular architecture enables Web3 to adapt to the policy requirements of different countries and regions. Although the core concept of Web3 is decentralization and deregulation, it is inevitable that different policies of different countries have a profound impact on the development and promotion of Web3. Flexible modular combinations can be adapted according to the policies of different regions, thus ensuring the robustness and sustainable development of Web3 applications to a certain extent.
Conclusion
The separation of computing and storage is a great idea and a systematic design based on first principles.
As a narrative direction akin to decentralized cloud services, it not only provides a viable landing scenario but also opens up broader room for imagination in combination with AI.
In fact, only by truly understanding the basic needs of Web3 can we get rid of the difficulties and shackles brought about by path dependence.
The combination of SCP and AO provides a new idea: it inherits all the characteristics of SCP, and no longer deploys smart contracts on the chain. Instead, it stores tamper-proof and traceable data on the chain, achieving data credibility that can be verified by everyone.
Of course, there is no absolutely perfect path yet, and AO is still in its infancy. How to prevent Web3 from becoming over-financialized, create enough application scenarios, and bring more possibilities for the future remains the test on AO's road to success. Whether AO can deliver a satisfactory answer still awaits the verdict of the market and of time.
The combination of SCP and AO is a development paradigm full of potential. Although its concept has not yet been widely recognized in the market, AO is expected to play an important role in the Web3 field in the future and even promote the further development of Web3.
This article was first published on PermaDAO
Original link: https://mp.weixin.qq.com/s/r5bhvWVhoEdohbhTt_7b5A