About the author: Advisor @Moledao and @Web3Geeks; previously Tech Lead @Bybit.
Note: This article reflects the author's personal opinions at this stage. Some thoughts may contain factual errors or biases; it is shared for discussion only, and I look forward to corrections from readers.
BTC proposed electronic cash, opening up the blockchain industry from 0 to 1.
ETH proposed smart contracts, leading the blockchain industry from 1 to 100.
ICP proposed Chain Key technology, which can drive the blockchain industry from 100 to 100,000,000.
I. Introduction
On January 3, 2009, the first BTC block was mined, and the blockchain has been developing vigorously for 14 years since then.
Looking back over those 14 years: the ingenuity and greatness of BTC, the emergence of Ethereum, the passionate crowdfunding of EOS, the fateful battle between PoS and PoW, the interconnection of a thousand chains on Polkadot. One amazing technology after another, one wonderful story after another, has drawn countless people into the industry!
Where does the blockchain industry stand in 2023? Here are my thoughts, in the form of a reading of the current public-chain landscape.
BTC, with its legitimacy as the original electronic cash, stands firm as an industry pillar.
ETH, with the programmability of smart contracts and the composability of its L2 ecosystem, leads the industry.
Cosmos, Polkadot, and the like try to dominate through cross-chain interoperability.
"Ethereum killers" of every kind emerge endlessly, each leading in its own niche.
But how will the entire blockchain industry develop over the next 10 years? Here are my thoughts:
Sovereignty is the only problem blockchain needs to solve: asset sovereignty, data sovereignty, speech sovereignty, and so on. Otherwise there is no need for blockchain.
Immutability is a sufficient condition, not a necessary one. As long as my sovereignty is not damaged, you can tamper at will: if everyone's assets in the world were doubled in the same proportion, what difference would it make?
Complete decentralization is impossible to achieve. No matter how a system is designed, there will always be "gifted" people or vested interests with a greater say, and there will always be people who actively choose not to participate. "Decentralized multi-point centralization" is the final pattern.
Transparency is a must. Isn't the point of this social experiment for all mankind to give everyone a voice and the right to protect their own sovereignty? There will always be lazy people, people who prefer to trust professionals, and people who give up voting for the sake of efficiency, but those are choices they make voluntarily: they have the right and choose not to exercise it. As long as everything is transparent, with no black-box operation, I can accept even losing, knowing clearly why; if I lose, it is because I am not as good as others. Survival of the fittest is also consistent with a market economy.
The core is decentralized control over code execution. Otherwise the whole exercise is pointless: after a week of voting, the project party can still deploy a malicious version of the code, and even a non-malicious bait-and-switch makes a mockery of everyone. Half the world now runs on code; if decentralized governance does not include control over code execution, how can anyone, including governments, dare to let the blockchain industry grow?
Infinite scalability at linear cost. As blockchain integrates ever more closely with real life, more people participate and demand keeps growing. Infrastructure that cannot scale without bound, or that scales only at prohibitive cost, is unacceptable.
II. Why ICP
Let me start with a story. In 2009, Alibaba proposed its "de-IOE" strategy, which later proved a major milestone in the success of Alibaba's "Double Eleven".
Former IOE
The core content of the "de-IOE" strategy is to remove IBM minicomputers, Oracle databases and EMC storage devices, and implant the essence of "cloud computing" into Alibaba's IT genes.
I refers to IBM p series minicomputers, whose operating system is AIX (IBM's proprietary Unix system);
O refers to Oracle database (RDBMS);
E refers to EMC's mid-to-high-end SAN storage.

There were three main reasons for de-IOE; the first is the essential one, while the latter two are more indirect:
Unable to meet demand: traditional IOE systems could hardly cope with the high-concurrency demands of Internet companies and could not support large-scale distributed computing architectures.
Cost too high: maintaining IOE was prohibitively expensive; an IBM minicomputer cost 500,000 yuan, and Oracle's annual support fees ran to hundreds of thousands of yuan.
Vendor lock-in: Alibaba was overly dependent on the IOE stack, effectively "held hostage" by vendors such as IBM and Oracle, and unable to configure systems flexibly to its own needs.
So why was the "de-IOE" strategy proposed in 2009 and not earlier? Before then:
Alibaba's business scale and data volume had not yet reached a level that strained traditional IOE systems, so there was no urgent need to get rid of them;
domestic database products were not yet mature enough in technology and quality to replace IOE;
Internet thinking and cloud-computing concepts had not yet become popular in China, and distributed architecture was not yet a mainstream trend;
management and technical staff needed time to accumulate experience and practice before recognizing the problems and the measures required.
By 2009:
Alibaba's business was expanding rapidly, the IOE systems could no longer support its scale, and cost issues were surfacing more readily;
some open-source database products, such as MySQL, had matured into viable alternatives;
Internet thinking and cloud computing had begun to spread and be widely applied in China, making the "de-IOE" concept easier to promote;
Wang Jian, a former Microsoft technical expert with a global technical perspective, had joined Alibaba in 2008; deeply trusted by Jack Ma, he proposed the "de-IOE" strategy.
However, "de-IOE" was not simply swapping old hardware and software for new; it replaced old ways with new ones, completely rebuilding the IT infrastructure on cloud computing. In other words, it was driven by a change in the industry, not a mere technology upgrade.
III. Three major stages of enterprise development
The development of an enterprise can be divided into three stages:
Shaping genes and organizational culture: the start-up phase, from 0 to 1
Growing fast in small, quick steps: the scale-up phase, from 1 to 100
Expanding horizontally: the scale-out phase, from 100 to 100,000,000
Let’s analyze the entire blockchain industry as a company.
Start-up / Blockchain 1.0 / BTC
Bitcoin’s innovation was that it solved a problem that had stumped computer scientists for decades: how to create a digital payment system that could operate without trusting any central authority.
However, BTC does have some limitations in its design and development, which provide market opportunities for subsequent blockchain projects such as Ethereum (ETH). Here are some of the main limitations:
Transaction throughput and speed: Bitcoin’s block generation time is about 10 minutes, and the size limit of each block leads to an upper limit on its transaction processing capacity. This means that when the network is busy, transaction confirmation may take a long time and may require higher transaction fees.
Limited smart contract functionality: Bitcoin was designed primarily as a digital currency, and the transaction types and scripting language capabilities it supports are relatively limited. This limits Bitcoin’s use in complex financial transactions and decentralized applications (DApps).
Difficult to upgrade and improve: Due to Bitcoin’s decentralized and conservative design principles, major upgrades and improvements usually require broad community consensus, which is difficult to achieve in practice, making Bitcoin’s progress relatively slow.
Energy consumption: Bitcoin's consensus mechanism is based on proof of work (PoW), which means that a large amount of computing resources are used for competition between miners, resulting in a large amount of energy consumption. This has been criticized in terms of environmental protection and sustainability. In this regard, you can also pay attention to EcoPoW, which can be regarded as partially alleviating this limitation.
Scale-up / Blockchain 2.0 / ETH
The current expansion of Ethereum via Layer 2 can be seen as "vertical scaling": it relies on the security and data availability of the underlying Layer 1. Although it looks like a two-layer structure, it is ultimately bounded by Layer 1's processing capacity. Adding more layers (Layer 3, Layer 4) only increases the complexity of the whole system and buys a little time; worse, by the law of diminishing marginal returns, the extra overhead of each additional layer sharply reduces the scaling benefit. This vertical layering is like upgrading the hardware of a single machine, except that here the "single machine" is the entire ETH ecosystem.
As usage increases, users’ demands for low fees and high performance will also increase. As an application on Layer 1, Layer 2 can only reduce its fees to a certain extent, and is ultimately still subject to the basic fees and throughput of Layer 1. This is similar to the demand curve theory in economics - as prices fall, total demand increases. Vertical expansion is unlikely to fundamentally solve the scalability problem.
Ethereum is a towering tree. All people rely on its roots. Once the roots cannot absorb nutrients fast enough, people’s needs will not be met.
Therefore, only horizontal expansion can realistically approach the infinite.
Some people believe that multi-chain and cross-chain can also be regarded as a form of horizontal expansion.
Take Polkadot: it is a heterogeneous kingdom in which every country looks different, but every time you build something, you must build a whole kingdom for it.
Cosmos is an isomorphic kingdom in which the anatomy of every country looks the same, but every time you build something, you must still found a kingdom.
From an infrastructure perspective, both models are a bit strange: do we really need to build an entire kingdom for every additional application? Here is an example of how strange that is.
I bought a Mac 3 months ago and developed a Gmail app on it.
Now I want to develop a YouTube app, but I have to buy a new Mac to do it. That is weird.
Moreover, both of the above methods face the problem of high complexity of cross-chain communication when adding new chains, so they are not my first choice.
Scale-out / Blockchain 3.0 / ICP
If you want to scale-out, you need a complete set of underlying infrastructure that supports rapid horizontal expansion without reinventing the wheel.
A typical example of scale-out is cloud computing. The underlying templates (VPC + subnet + network ACL + security group) are identical for everyone; all machines come in standard numbers and types; upper-layer core components such as RDS and MQ support unlimited expansion; and when more resources are needed, they can be started quickly at the click of a button.
A senior colleague once shared with me that if you want to know what infrastructure and components an Internet company needs, you only need to look at all the services AWS provides: that is the most complete and powerful combination.
Similarly, let's take a high-level look at ICP and see why it meets the scale-out requirements.
Here are a few concepts:
Dfinity Foundation: is a non-profit organization dedicated to promoting the development and application of decentralized computer technology. It is the developer and maintainer of the Internet Computer protocol, aiming to achieve the comprehensive development of decentralized applications through innovative technologies and an open ecosystem.
Internet Computer (IC): is a high-speed blockchain network developed by Dfinity Foundation, designed specifically for decentralized applications. It uses a new consensus algorithm that can achieve high throughput and low latency transaction processing, while supporting the development and deployment of smart contracts and decentralized applications.
Internet Computer Protocol (ICP): is the native token in the Internet Computer protocol. It is a digital currency used to pay for network usage and reward nodes.
IV. What's ICP
A lot of the following content will be a bit hardcore, but I have described it in plain language, so I hope everyone can follow it. If you want to discuss more details with me, you can find my contact information at the top of the article.
Architecture Overview
From bottom to top, the layers are:
P2P layer: collects and forwards messages from users, from other replicas in the subnet, and from other subnets. It ensures messages are delivered to all nodes in the subnet, providing security, reliability, and resilience.
Consensus layer: its main task is to order inputs so that all nodes within the same subnet process tasks in the same order. To achieve this, it uses a novel consensus protocol designed to guarantee safety and liveness and to resist DoS and spam attacks. Once consensus is reached on the order of messages within a subnet, the blocks are passed to the message routing layer.
Message routing layer: prepares the input queues of each Canister according to the tasks delivered by the consensus layer. After execution, it receives the outputs generated by Canisters and forwards them to local or remote Canisters as needed. It is also responsible for recording and certifying responses to user requests.
Execution layer: provides the runtime environment for Canisters, reads inputs in order according to the scheduling mechanism, invokes the corresponding Canister to complete each task, and returns the updated state and generated outputs to the message routing layer. It uses the unpredictability provided by random numbers to ensure the fairness and auditability of computation: in some cases Canister behavior needs to be unpredictable, for example cryptographic operations need random numbers for security, and execution results need enough randomness that attackers cannot find vulnerabilities or predict a Canister's behavior by analyzing its outputs.

4-layers of ICP
Key Components

In terms of composition,
Subnet: supports unlimited expansion, and each subnet is a small blockchain. Subnets communicate with each other through Chain Key technology. Since consensus has been reached within the subnet, it only needs to be verified by Chain Key.
Replica: Each Subnet can have many nodes, and each node is a Replica. The IC consensus mechanism ensures that each Replica in the same Subnet processes the same input in the same order, so that the final state of each Replica is the same. This mechanism is called Replicated State Machine.
Canister: a Canister is a smart contract, a computing unit running on the ICP network that can store data and code and communicate with other Canisters or external users. ICP provides a runtime environment that executes the Wasm program inside a Canister and communicates with other Canisters and external users via message passing. You can simply think of it as a Docker container into which you inject a Wasm code image and run it.
Node: An independent server. Canister still needs a physical machine to run. These physical machines are the machines in the real computer room.
Data Center: The nodes in the data center are virtualized into a replica through the node software IC-OS, and some replicas are randomly selected from multiple data centers to form a subnet. This ensures that even if a data center is hacked or encounters a natural disaster, the entire ICP network can still operate normally, which is a bit like an upgraded version of Alibaba's "two locations and three centers" disaster recovery and high availability solution. Data centers can be distributed all over the world, and even a data center can be built on Mars in the future.
Boundary Nodes: Provide entry and exit between the external network and the IC subnet, and verify the response.
Principal: An identifier of an external user, derived from a public key, used for permission control.
Network Nervous System (NNS): an algorithmic DAO, governed with staked ICP, that manages the IC.
Registry: a database maintained by the NNS containing mappings between entities (such as Replicas, Canisters, and Subnets), somewhat similar to how DNS works today.
Cycles: the unit used to pay for the resources consumed at Canister runtime, representing a quota of computing resources. If I had to translate it, I would call it "computation cycles", because Cycles mainly denominate payment for computing resources.
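To make the Replicated State Machine idea concrete, here is a toy sketch in Python (my own illustration, not IC's actual implementation): once consensus fixes a single order of inputs, every replica that applies them deterministically ends up in exactly the same state.

```python
import hashlib

class Replica:
    """A toy deterministic state machine; state is a dict of balances."""
    def __init__(self):
        self.state = {}

    def apply(self, msg):
        # Applying the same messages in the same order always
        # produces the same state on every replica.
        account, amount = msg
        self.state[account] = self.state.get(account, 0) + amount

    def state_hash(self):
        # Cheap digest for comparing replica states.
        encoded = repr(sorted(self.state.items())).encode()
        return hashlib.sha256(encoded).hexdigest()

# The consensus layer fixes one global order of inputs for the subnet...
ordered_inputs = [("alice", 10), ("bob", 5), ("alice", -3)]

# ...and each replica applies them independently.
replicas = [Replica() for _ in range(4)]
for r in replicas:
    for msg in ordered_inputs:
        r.apply(msg)

# Every replica ends in exactly the same state.
assert len({r.state_hash() for r in replicas}) == 1
```

This is why determinism matters so much in the execution layer: any non-determinism not agreed through consensus would make replica states diverge.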

Key Innovative Technologies

From the bottom layer, Chain-key technology is used,
Threshold BLS signatures: ICP implements a threshold signature scheme. For each Subnet, there is a public and verifiable public key, and its corresponding private key is split into multiple shares. Each share is held by a Replica in this Subnet. Only when more than the threshold number of Replicas in the same Subnet sign the message is it considered valid. In this way, the messages transmitted between Subnets and Replicas are encrypted but can be quickly verified, which ensures both privacy and security. Among them, the BLS algorithm is a well-known threshold signature algorithm. It is the only signature scheme that can produce a very simple and efficient threshold signature protocol, and the signature is unique, which means that for a given public key and message, there is only one valid signature.
Non-interactive Distributed Key Generation (NIDKG): To safely deploy threshold signature schemes, Dfinity designed, analyzed, and implemented a new DKG protocol that runs on asynchronous networks and is highly robust (it can succeed even if up to one-third of the nodes in a subnet crash or become corrupted) while still providing acceptable performance. In addition to generating new keys, this protocol can also be used to re-share existing keys. This feature is critical to enabling autonomous evolution of IC topologies as subnets change membership over time.
PVSS (Publicly Verifiable Secret Sharing) scheme: in the Internet Computer whitepaper, the PVSS scheme is used to implement the decentralized key generation (DKG) protocol, ensuring that a node's private key is never leaked during generation.
Forward-secure public-key encryption scheme: A forward-secure public-key encryption scheme ensures that even if the private key is leaked, previous messages cannot be decrypted, thereby improving the security of the system.
Key resharing protocol: A threshold signature-based key sharing scheme used to implement key management in the Internet Computer protocol. The main advantage of this protocol is that it can share existing keys with new nodes without creating new keys, thereby reducing the complexity of key management. In addition, the protocol also uses threshold signatures to protect the security of key sharing, thereby improving the security and fault tolerance of the system.
PoUW: PoUW adds a U to PoW, standing for Useful. It mainly improves performance and avoids unnecessary work on node machines: instead of artificially creating hard hash puzzles, PoUW devotes as much computing power as possible to serving users, with most resources (CPU, memory) spent on actually executing code in Canisters.
Chain-evolution technology: is a technology used to maintain the blockchain state machine. It includes a series of technical means to ensure the security and reliability of the blockchain. In the Internet Computer protocol, Chain-evolution technology mainly includes the following two core technologies:
Summary blocks: The first block of each epoch is a summary block, which contains some special data for managing different threshold signature schemes. Among them, a low threshold scheme is used to generate random numbers, and a high threshold scheme is used to authenticate the replication status of the subnet.
Catch-up packages (CUPs): CUPs is a technology for quickly synchronizing node status. It allows newly joined nodes to quickly obtain the current status without re-running the consensus protocol.

My logical deduction of the entire IC underlying technology is:
In traditional public key cryptography, each node has its own public-private key pair, which means that if a node's private key is leaked or attacked, the security of the entire system will be threatened. The threshold signature scheme divides a key into multiple parts and assigns them to different nodes. Only when a sufficient number of nodes cooperate can a signature be generated, so that even if some nodes are attacked or leaked, it will not have much impact on the security of the entire system. In addition, the threshold signature scheme can also improve the decentralization of the system, because it does not require a centralized organization to manage the key, but instead distributes the key to multiple nodes, which can avoid single point failure and centralization risks. Therefore, IC uses the threshold signature scheme to improve the security and decentralization of the system, hoping to use the threshold signature method to complete a universal blockchain that is highly secure, scalable, and can be quickly verified.
BLS is a well-known threshold signature algorithm. It is the only signature scheme that can produce a very simple and efficient threshold signature protocol. Another advantage of BLS signature is that it does not need to save the signature state. As long as the message content remains unchanged, the signature is fixed, which means that for a given public key and message, there is only one valid signature. This ensures extremely high scalability, so ICP chose the BLS scheme.
Because threshold signatures are used, a distributor is required to distribute key fragments to different participants. However, the person who distributes the key fragments is a single point, which can easily lead to single point failure problems. Therefore, Dfinity has designed a distributed key distribution technology, namely NIDKG. During the initialization period of subnet creation, all participating Replicas jointly and non-interactively generate a public key A. For the corresponding private key B, each participant mathematically calculates and holds one of the derived secret shares.
In order to do NIDKG, it is necessary to ensure that every distributed participant does not commit fraud. Therefore, each participant can not only obtain his or her own secret share, but also publicly allow others to verify whether his or her secret share is correct. This is a very important point in realizing distributed key generation.
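To build intuition for threshold shares, here is a toy Shamir secret-sharing sketch in Python (my own illustration; IC's NIDKG is non-interactive and dealerless, and additionally publishes commitments so every share can be publicly verified, none of which this toy does):

```python
import random

P = 2**127 - 1  # a Mersenne prime; all share arithmetic happens mod P

def split(secret: int, n: int, t: int):
    """Split `secret` into n shares such that any t can reconstruct it."""
    # Random polynomial of degree t-1 with f(0) = secret.
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers f(0) = secret."""
    acc = 0
    for xi, yi in shares:
        num = den = 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        acc = (acc + yi * num * pow(den, -1, P)) % P
    return acc

secret = 123456789
shares = split(secret, n=7, t=4)          # 7 replicas, threshold 4

assert reconstruct(shares[:4]) == secret  # any 4 shares suffice
assert reconstruct(shares[-4:]) == secret
assert reconstruct(shares[:3]) != secret  # 3 shares learn nothing useful
```

The point of the threshold: an attacker who compromises fewer than t replicas holds only points on a random polynomial and gains nothing about the secret.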
What if the subnet key at a certain historical moment is leaked? How to ensure the immutability of historical data? Dfinity uses a forward-secure signature scheme, which ensures that even if the subnet key at a certain historical moment is leaked, the attacker cannot change the data of the historical block, which also prevents the threat of later corruption attacks to the historical data of the blockchain. If this restriction is stronger, it can actually ensure that the information will not be successfully eavesdropped during transmission, because the timestamps do not match, and even if the key is cracked in a short period of time, the past communication content cannot be cracked.
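The forward-security idea can be illustrated with a toy hash ratchet in Python (my own illustration; IC's actual construction is a forward-secure signature scheme, not this): each epoch's key is derived by hashing the previous one and the old key is then deleted, so stealing the current key tells an attacker nothing about earlier epochs.

```python
import hashlib, hmac

def ratchet(key: bytes) -> bytes:
    # One-way key evolution: next_key = H("epoch-ratchet" || key).
    # Recovering the previous key would require inverting SHA-256.
    return hashlib.sha256(b"epoch-ratchet" + key).digest()

def sign(key: bytes, msg: bytes) -> bytes:
    # A toy per-epoch MAC standing in for a per-epoch signature.
    return hmac.new(key, msg, hashlib.sha256).digest()

k0 = b"\x01" * 32                      # epoch-0 key (illustrative)
sig_epoch0 = sign(k0, b"block at epoch 0")

k1 = ratchet(k0)                       # advance to epoch 1...
del k0                                 # ...and delete the old key

# An attacker who later steals k1 cannot forge epoch-0 signatures,
# because k0 cannot be derived from k1.
assert sign(k1, b"block at epoch 0") != sig_epoch0
```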
With NIDKG, if a certain secret share is held by a node for a long time, once each node is gradually eroded by hackers, the entire network may have problems. Therefore, it is necessary to constantly update the key, but the key update cannot require all participating Replicas to gather together for interactive communication, but must also be carried out non-interactively. However, because public key A has been registered in NNS, other subnets will also use this public key A for verification, so it is best not to change the subnet public key. But if the subnet public key does not change, how to update the secret share between nodes? Therefore, Dfinity has designed a Key resharing protocol. Without creating a new public key, all Replicas holding the current version of the secret share non-interactively generate a new round of derivative secret shares for the holders of the new version of the secret share, so that
This ensures that the new version of the secret share is authenticated by all current legal secret share holders.
It also ensures that old versions of secret shares are no longer valid.
It also ensures that even if the new version of the secret share is leaked in the future, the old version of the secret share will not be leaked, because the polynomials between the two are completely unrelated and cannot be reversed. This is also the forward security just introduced above.
In addition, efficient random redistribution is ensured. When trusted nodes or access control change, access policies and controllers can be modified at any time without restarting the system. This greatly simplifies the key management mechanism in many scenarios. This is useful, for example, in the case of subnet membership changes, as resharing will ensure that any new members have the appropriate secret share, and any replicas that are no longer members will no longer have a secret share. Furthermore, if a small number of secret shares are leaked to an attacker in any one epoch or even every epoch, these secret shares will not be of any benefit to the attacker.
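The resharing idea can also be sketched in toy form (my own illustration; the real protocol is non-interactive, with encrypted sub-shares and public commitments): a quorum of old holders re-splits its shares with fresh random polynomials, each new holder sums the Lagrange-weighted pieces it receives, and the underlying secret, and hence the subnet public key, is unchanged while old and new shares are unrelated.

```python
import random

P = 2**127 - 1  # prime field for all share arithmetic

def poly_eval(coeffs, x):
    return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P

def lagrange_at_zero(xs):
    """Lagrange coefficients that interpolate a polynomial at x = 0."""
    lams = []
    for xi in xs:
        num = den = 1
        for xj in xs:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        lams.append(num * pow(den, -1, P) % P)
    return lams

def reconstruct(shares):
    lams = lagrange_at_zero([x for x, _ in shares])
    return sum(l * y for l, (_, y) in zip(lams, shares)) % P

# Epoch 0: secret s shared among 5 holders with threshold 3.
s = 42424242
coeffs = [s] + [random.randrange(P) for _ in range(2)]
old_shares = [(x, poly_eval(coeffs, x)) for x in range(1, 6)]

# Resharing: a quorum of 3 old holders re-splits its shares for 7 new holders.
quorum = old_shares[:3]
lams = lagrange_at_zero([x for x, _ in quorum])
new_vals = [0] * 7
for lam, (_, y) in zip(lams, quorum):
    sub = [y] + [random.randrange(P) for _ in range(2)]  # fresh degree-2 poly
    for j in range(7):
        # New holder j sums the lambda-weighted sub-shares it receives.
        new_vals[j] = (new_vals[j] + lam * poly_eval(sub, j + 1)) % P
new_shares = list(enumerate(new_vals, start=1))

assert reconstruct(new_shares[:3]) == s       # same secret, same public key
assert new_shares[0][1] != old_shares[0][1]   # but fresh, unrelated shares
```

This is exactly the property described above: membership can change and shares can be refreshed every epoch without ever changing the public key registered in the NNS.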
Traditional blockchain protocols need to store every block from the genesis block onward; as the chain grows, this creates scalability problems, which is why building a light client is so troublesome for many public chains. IC solves this with Chain-evolution technology: at the end of each epoch, all processed inputs and consensus information can be safely cleared from each Replica's memory, greatly reducing per-Replica storage requirements and enabling IC to scale to a large number of users and applications. Chain-evolution also includes CUPs, which let newly joined nodes obtain the current state quickly without re-running the consensus protocol, greatly lowering the barrier and synchronization time for new nodes joining the IC network.
In summary, the underlying technologies of IC are tightly interlocked, grounded in cryptography (theory) while fully addressing industry-wide challenges such as fast node synchronization (practice). A true synthesis of the state of the art!
Key Features
In terms of characteristics
Reverse Gas Model: most traditional blockchains require users to first hold the native token (such as ETH or BTC) and then consume it to pay transaction fees. This raises the entry barrier for new users and does not match people's habits: why should I have to hold TikTok shares before I can use TikTok? ICP uses a reverse gas model: users can use the ICP network directly, and the project party bears the fees. This lowers the barrier to use, matches the habits of Internet services, and helps generate larger network effects, thereby supporting more users joining.

Stable Gas: on other public chains, people buy the native token for the security of the chain or for transfer needs, miners mine hard, and others hoard the token, all of which contributes hash power to the chain (as with Bitcoin) or staking-based economic security (as with Ethereum). In other words, our demand for BTC/ETH ultimately comes from the Bitcoin/Ethereum chains' requirements for hash power or stake, which is essentially the chain's own security requirement. Therefore, as long as a chain pays for gas directly in its native token, gas will remain expensive in the long run: the token may be cheap now, but once the chain's ecosystem takes off it will become expensive. ICP is different. Gas consumed on the ICP blockchain is called Cycles, obtained by burning ICP. Cycles are stable under algorithmic regulation and pegged to 1 SDR (the SDR can be seen as a stable unit computed from a basket of fiat currencies). So no matter how much the ICP price rises in the future, the money you spend to do anything on ICP is the same as today (ignoring inflation).
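A little arithmetic makes the peg concrete (the ICP prices below are hypothetical; the rate of 1 SDR/XDR = 1 trillion Cycles is the published peg): however the ICP price moves, the same workload burns the same number of Cycles and therefore costs the same in SDR terms.

```python
CYCLES_PER_SDR = 1_000_000_000_000  # the peg: 1 SDR buys 1 trillion Cycles

def icp_to_cycles(icp_amount: float, icp_price_sdr: float) -> int:
    # Burning ICP mints Cycles at the current ICP/SDR market rate.
    return int(icp_amount * icp_price_sdr * CYCLES_PER_SDR)

job_cost_cycles = 5_000_000_000  # hypothetical cost of some canister work

# Scenario A: ICP trades at 3 SDR. Scenario B: ICP has 10x'd to 30 SDR.
for price in (3.0, 30.0):
    icp_needed = job_cost_cycles / (price * CYCLES_PER_SDR)
    sdr_cost = icp_needed * price
    print(f"ICP at {price} SDR: burn {icp_needed:.6f} ICP = {sdr_cost:.3f} SDR")
```

The amount of ICP burned shrinks as the token appreciates, but the SDR-denominated cost of the job stays fixed.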

Wasm: Using WebAssembly (Wasm) as the standard for code execution, developers can use a variety of popular programming languages (such as Rust, Java, C++, Motoko, etc.) to write code, thereby supporting more developers to join.
Support for running AI models: Python can also be compiled to Wasm. Python has one of the largest user bases in the world and is the first language of AI, e.g. for matrix and big-integer computation. Someone has already run the Llama2 model on IC, so I would not be surprised if the concept of AI + Web3 plays out on ICP in the future.
Web2 speed experience: Currently, many applications on ICP have achieved amazing results of millisecond-level query and second-level update. If you don’t believe it, you can directly use OpenChat, a purely decentralized chat application on the chain.
Running the front end on chain: you have probably only heard of writing part of the back-end logic into a simple smart contract and putting it on chain, so that core logic such as data assets cannot be tampered with. But the front end also needs to run entirely on chain to be safe, because front-end attacks are a typical and frequent problem. Imagine: everyone may think the Uniswap code is very safe because the smart contract is simple and has been verified by so many people over so many years. But if one day the Uniswap front end is hijacked, the contract you interact with is actually a malicious contract deployed by hackers, and you may lose everything in an instant.

If you store and deploy all the front-end code in an IC Canister, the consensus security of IC at least guarantees that the front-end code cannot be tampered with by hackers. The protection is more complete, and the front end can be run and rendered directly on IC without affecting the normal operation of the application. On IC, developers can build applications without traditional cloud services, databases, or payment gateways: there is no need to buy a front-end server or worry about databases, load balancing, content distribution, firewalls, and so on. Users can directly access front-end pages deployed on ICP through a browser or mobile app, for example a personal blog I deployed earlier.
DAO-controlled code upgrades: in many DeFi protocols today, the project party has full control and can make major decisions at will, such as suspending operations or selling off funds, without any community vote or consultation; I believe everyone has witnessed or heard of such cases. In contrast, DApp code in the ICP ecosystem runs in containers controlled by a DAO: even if the project party holds a large share of the votes, decisions still go through a public voting process, which satisfies the transparency condition for blockchain described at the beginning of this article. This process guarantee better reflects the will of the community and gives ICP a better degree of governance than other current public-chain projects.
Automatic protocol upgrades: When the protocol needs to be upgraded, a new threshold signature scheme can be added to the summary block, enabling automatic protocol upgrades while avoiding the inconvenience and risks of hard forks. Specifically, ICP's Chain Key technology maintains the blockchain state machine through a special signature scheme: at the beginning of each epoch, the network uses a low-threshold signature scheme to generate a random beacon, and then uses a high-threshold signature scheme to certify the replicated state of the subnet. By rotating these schemes through the summary block, the network stays secure and reliable while the protocol upgrades itself without a fork.
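To make the "threshold" idea behind Chain Key concrete, here is a toy Shamir secret-sharing demo: any t of n shares reconstruct the secret, while fewer reveal nothing. Chain Key actually uses BLS threshold signatures (signing, not secret reconstruction), so this is only an analogy for the t-of-n property; the prime, parameters, and numbers below are all illustrative.

```python
# Toy Shamir t-of-n secret sharing over a prime field. This illustrates
# the threshold property Chain Key relies on; IC's real scheme is BLS
# threshold signatures, which never reconstruct the key in one place.
import random

P = 2**127 - 1  # a Mersenne prime, large enough for a demo

def split(secret: int, n: int, t: int) -> list:
    """Split `secret` into n shares; any t of them recover it."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares: list) -> int:
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

secret = 123456789
shares = split(secret, n=13, t=9)         # e.g. a 13-node subnet
assert reconstruct(shares[:9]) == secret  # any 9 shares suffice
assert reconstruct(shares[4:]) == secret  # which 9 does not matter
```

Because any quorum of honest nodes can act for the subnet, individual nodes can fail, join, or be replaced without the subnet's public key (and thus its identity) ever changing, which is what makes fork-free upgrades possible.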

Proposal Voting
Fast forwarding: a technique in the Internet Computer protocol for quickly synchronizing node state. It allows newly joined nodes to obtain the current state quickly without re-running the consensus protocol from the beginning. Specifically, the process of fast forwarding is as follows:
The newly joined node obtains the Catch-up package (CUP) of the current epoch, which contains the Merkle tree root, summary block, and random number of the current epoch.
The newly joined node uses the state sync subprotocol to obtain the complete state of the current epoch from other nodes, and uses the Merkle tree root in CUP to verify the correctness of the state.
Newly joined nodes use the random numbers in CUP and protocol messages from other nodes to run the consensus protocol, thereby quickly synchronizing to the current state.
The advantage of fast forwarding is that newly joined nodes can obtain the current state quickly, without having to replay history from scratch as on some other public chains. This speeds up synchronization and expansion of the network, and also reduces the amount of communication between nodes, improving the network's efficiency and reliability.
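The verification step above can be sketched with a minimal Merkle tree: the catch-up package pins a root hash, and a syncing node accepts downloaded state chunks only if they hash back to that root. This is illustrative only; IC's real certified state tree and hashing scheme differ, and the chunk contents here are made up.

```python
# Minimal sketch of the CUP idea: a new node downloads state chunks
# from peers and checks them against the Merkle root carried in the
# catch-up package, so peers cannot feed it a tampered state.
# (Illustrative only; IC's actual state tree and hashing differ.)
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(chunks: list) -> bytes:
    """Pairwise-hash leaf hashes up to a single root."""
    level = [h(c) for c in chunks]
    while len(level) > 1:
        if len(level) % 2:              # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
    return level[0]

# The CUP (threshold-signed by the subnet) pins the expected root:
state_chunks = [b"chunk-0", b"chunk-1", b"chunk-2"]
cup_root = merkle_root(state_chunks)

# A syncing node recomputes the root over what it downloaded:
downloaded = [b"chunk-0", b"chunk-1", b"chunk-2"]
assert merkle_root(downloaded) == cup_root   # state accepted

tampered = [b"chunk-0", b"EVIL", b"chunk-2"]
assert merkle_root(tampered) != cup_root     # state rejected
```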

fast forwarding
Decentralized Internet Identity: The identity system on IC really makes me feel that the DID problem can be completely solved, in both scalability and privacy. IC's identity system currently has an implementation called Internet Identity, and a more powerful NFID has been developed on top of it. Its principles are as follows:
At registration, Internet Identity generates a pair of public and private keys for the user. The private key is stored in the TPM security chip of the user's device and never leaves it, while the public key is shared with services on the network.
When a user logs into a dapp, the dapp creates a temporary session key for the user. The user authorizes this session key with an electronic signature, which gives the dapp the authority to verify the user's identity.
Once the session key is signed, the dapp can use it to access network services on behalf of the user, without the user having to sign every request electronically. This is similar to delegated authorization login in Web2.
The session key is only valid for a short period. After it expires, the user must re-authorize through biometric recognition to obtain a new session key.
The user's private key is always stored in the local TPM security chip and never leaves the device, which guarantees the security of the private key and the anonymity of the user.
Because temporary session keys are used, different dapps cannot correlate a user's identity with each other, achieving truly anonymous, private access.
Users can easily manage their Internet Identity across multiple devices, but each device requires its own biometric or hardware-key authorization.
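The delegation flow above can be sketched as follows. One loud caveat: the real system uses asymmetric signatures (WebAuthn device keys), where the dapp verifies with a public key; here HMAC stands in for signing purely so the sketch runs on the standard library, and all field names and the TTL are made up for illustration.

```python
# Toy sketch of the Internet Identity delegation flow: the device key
# never leaves the device; it signs a short-lived delegation over a
# fresh session key, and subsequent requests are signed with the
# session key only. HMAC is a stand-in for real asymmetric signatures
# (in reality the verifier holds only the public key, not DEVICE_KEY).
import hashlib
import hmac
import json
import os
import time

DEVICE_KEY = os.urandom(32)   # in reality: locked inside the TPM chip

def sign(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

def make_delegation(session_pub: bytes, ttl_s: int = 600) -> dict:
    """Device signs: 'this session key may act for me until expiry'."""
    body = {"session_pub": session_pub.hex(),
            "expires": time.time() + ttl_s}
    blob = json.dumps(body, sort_keys=True).encode()
    return {"body": body, "sig": sign(DEVICE_KEY, blob)}

def verify_delegation(d: dict) -> bool:
    blob = json.dumps(d["body"], sort_keys=True).encode()
    ok = hmac.compare_digest(sign(DEVICE_KEY, blob), d["sig"])
    return ok and d["body"]["expires"] > time.time()

session_key = os.urandom(32)             # fresh per dapp login
delegation = make_delegation(session_key)
assert verify_delegation(delegation)     # dapp accepts the session

# Each request is then signed with the session key, not the device key:
req_sig = sign(session_key, b"GET /feed")
assert hmac.compare_digest(sign(session_key, b"GET /feed"), req_sig)
```

Note how the design choice falls out of the sketch: the expiry bounds the damage if a session key leaks, and because each dapp sees a different session key, dapps cannot correlate users with each other.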

Some of the advantages of Internet Identity are as follows:
No need to remember passwords. Log in directly with biometric features such as fingerprint recognition, without having to set and remember complex passwords.
The private key does not leave the device, which is more secure. The private key is stored in the TPM security chip and cannot be stolen, solving the problem of username and password theft in Web2.
Anonymous login, cannot be tracked. Unlike Web2, which uses email addresses as usernames and can be tracked across platforms, Internet Identity eliminates this tracking.
Multi-device management is more convenient. You can log in to the same account on any device that supports biometrics, rather than being limited to a single device.
It does not rely on central service providers and achieves true decentralization. This is different from the model in Web2 where usernames correspond to email service providers.
Delegated authentication means there is no need to sign again on every login, which gives a better user experience.
Supports logging in with dedicated security devices such as Ledger or Yubikey, which improves security.
The user's actual public key is hidden, and transaction records cannot be queried through the public key, thus protecting user privacy.
Seamlessly compatible with Web3 blockchain, it can log in and sign blockchain DApps or transactions securely and efficiently.
The architecture is more advanced, representing the organic integration of the advantages of Web2 and Web3, and is the standard for future network accounts and logins.
In addition to providing a new user experience, the following technical measures are also taken to ensure its security:
The private key is stored using a TPM security chip, which is designed so that even developers cannot access or extract the private key, preventing the private key from being stolen.
Secondary authentication mechanisms (biometrics such as fingerprint or facial recognition) must be verified in conjunction with the device, so that only the user holding the device can use that identity.
The session key is designed to expire within a short period of time, limiting the time window for theft and forcing the destruction of related ciphertexts at the end of the session to reduce risks.
Public key encryption technology encrypts data during transmission, making it impossible for external eavesdroppers to obtain user private information.
No reliance on third-party identity providers: the private key is generated and controlled by the user, and no third party needs to be trusted.
Combined with the immutability brought by the IC blockchain consensus mechanism, the reliability of the entire system operation is ensured.
Relevant cryptographic algorithms and security processes are being continuously updated and upgraded, such as adding more secure mechanisms such as multi-signatures.
Open source code and decentralized design optimize transparency and facilitate community collaboration to improve security.

Internet Identity
5. Core Team
From the perspective of the team, there are more than 200 employees, all elite talent. Employees have published more than 1,600 papers, cited more than 100,000 times, and hold more than 250 patents.
Founder Dominic Williams is a crypto theorist and serial entrepreneur.
Academically, his recent theoretical work includes Threshold Relay and PSC chains, Validation Towers and Trees, and USCID.
From a technical background perspective, he has deep experience in research and development; in his early years he worked on big data and distributed computing, which laid the technical foundation for building the complex ICP network.
From an entrepreneurial perspective, he previously ran an MMO game on his own distributed system that hosted millions of users. In 2015 Dominic started Dfinity, and he is also President and CTO of String Labs.
From a vision perspective, he proposed the concept of a decentralized Internet more than 10 years ago. Driving such a grand project over the long term is no easy task, and his design ideas remain very forward-looking.
Dfinity has a very strong technical team. The Dfinity Foundation has brought together a large number of top cryptography and distributed system experts, such as Jan Camenisch, Timothy Roscoe, Andreas Rossberg, Maria D., Victor Shoup, etc. Even Ben Lynn, the "L" in the BLS cryptographic algorithm, also works at Dfinity. This provides strong support for ICP's technological innovation. The success of blockchain projects is inseparable from technology, and the gathering of top talents can bring about technological breakthroughs, which is also a key advantage of ICP.

Dfinity Foundation Team
6. Fund-raising & Tokenomics
If I were to discuss this topic, this article would be too long, so I decided to write a separate article later to analyze it in detail. This article focuses more on the development direction of the blockchain industry and why ICP has great opportunities.

7. Applications
All types of applications, social platforms, creator platforms, chat tools, games, and even metaverse games can be developed on ICP.
Many people say IC is unsuitable for DeFi because global state consistency is hard to achieve, but I think the question itself is framed wrongly. It is not global state consistency that is hard, but global state consistency under low latency. If you can accept a one-minute delay, 10,000 machines around the world can also achieve global consistency; Ethereum and BTC, with so many nodes, have effectively been forced to achieve global state consistency under high latency, which is why they currently cannot scale horizontally without limit. IC first solves horizontal scaling by dividing into subnets. As for global state consistency under low latency, it can be approached through a strongly consistent distributed consensus algorithm, a well-designed network topology, high-performance distributed data synchronization, effective timestamp verification, and a mature fault-tolerance mechanism. To be honest, building a trading platform at the IC application layer that matches the high-performance platforms built by today's Wall Street engineers is harder than merely reaching consensus across multiple data centers. But difficult does not mean impossible: many technical problems must be solved first, and eventually a moderate equilibrium will be found that guarantees both security and an acceptable user experience. For example, the ICLightHouse below.
ICLightHouse, an order-book DEX fully on chain: what does "fully on chain" mean, and how many technical difficulties must be solved to get there? On other public chains this would be unthinkable, but on IC it is at least doable, which gives us hope.

OpenChat is a decentralized chat application with a great experience. I have not seen a second product like it in the entire blockchain industry. Many other teams tried this before and all failed over various technical problems; in the end users simply felt the experience was bad, for example it was too slow, taking 10 seconds to send a message and 10 seconds to receive one. Yet a small team of three people on ICP has made such a successful product. Experience for yourself how smooth it is. You are welcome to join the organization, where you can enjoy the collision of ideas and, to a certain extent, freedom of speech.

Mora is a platform for super creators, where everyone can create a planet and build their own brand. The content you produce will always belong to you, and you can even support paid reading. It can be called a decentralized Knowledge Planet. I find myself checking for new articles on it every day.

Easy - 0xkookoo
OpenChat and Mora apps are products that I use almost every day. They give me a sense of comfort that I can’t live without. Two words to describe them are freedom and fulfillment.
There are already some teams developing game applications on IC, and I think the narrative of full-chain games may eventually be taken over by IC. As I said in the GameFi section of an article I wrote before, playability and fun are things the project party needs to consider, and playability is easier to achieve on IC. I look forward to Dragginz's masterpiece.

8. Summary
ICP is like the earth, and Chainkey technology is like the core of the earth. Its relationship with ICP is similar to the relationship between TCP/IP protocol and the entire Internet industry. Each Subnet is like the Asian, African and Latin American continents. Of course, the Subnet can also be the Pacific/Atlantic Oceans. There are different buildings and areas (Replica and Node) in the continents and oceans. Plants (Canister) can be planted on each area and building, and different animals live happily there.
ICP supports horizontal expansion. Each subnet is autonomous and can communicate with each other. No matter what application you are using, social media, finance, or even the metaverse, you can achieve final consistency through this distributed network. It is easy to achieve a global ledger under synchronous conditions, but it is very challenging to achieve "global state consistency" under asynchronous conditions. At present, only ICP has the opportunity to do this.
It should be noted that what is meant here is not "global state consistency" but "final (eventual) global state consistency". "Global state consistency" requires all participating nodes to [agree on the order of all operations], [produce consistent final results], [be objectively consistent regardless of node failures], [have consistent clocks], and [process all operations synchronously with instant consistency], all of which IC guarantees within a single subnet. However, guaranteeing "global state consistency" across all subnets as a whole, for the same data and state, is impossible to achieve at low latency; this is exactly the bottleneck that prevents public chains such as ETH from scaling horizontally. Therefore IC chose to reach consensus within a single subnet, while other subnets quickly verify through communication that the results are not falsified, thereby achieving "final global state consistency". This effectively combines the decentralization of large public chains with the high throughput and low latency of consortium chains, and realizes unlimited horizontal expansion of subnets through proofs from mathematics and cryptographic algorithms.
In summary, recall the ultimate development direction of blockchain that I laid out at the beginning of this article: [sovereignty] + [decentralized multi-point centralization] + [transparency] + [control of code execution] + [infinite scalability at linear cost].
Sovereignty is the only issue that blockchain needs to solve, including asset sovereignty, data sovereignty, speech sovereignty, etc. Otherwise, there is no need for blockchain.
IC has done it completely
Immutability is a sufficient condition, but not a necessary condition. As long as you can ensure that my sovereignty is not damaged, you can tamper with it at will. If the assets of everyone in the world are tampered with and doubled in the same proportion, what difference does it make?
IC also did it
Complete decentralization is impossible to achieve. No matter how it is designed, there will always be people with "talents" or vested interests who have greater say, and there will always be people who will actively choose not to participate. [Decentralized multi-point centralization] is the final pattern;
IC is the best among all the public chains currently. It can maintain a certain degree of decentralization while making full use of the advantages of centralized entities to better realize the governance and operation of the network.
Transparency is a must. Isn't this social experiment for all mankind meant to give everyone a say and the right to protect their own sovereignty? There will always be lazy people, people who prefer to trust more professional people, and people who give up voting to maximize efficiency, but those are choices they make on their own initiative: they have the right and choose not to exercise it. As long as everything is transparent and there is no black-box operation, I am willing to accept the outcome even when I lose, because then I lose with my eyes open; if I lose, it is because I am not as good as others. Survival of the fittest is also in line with the market economy.
IC has done it completely
Control over code execution is the core; without it, everything else is wasted effort. If, after a week of voting and public announcement, the project party can still go ahead and deploy a malicious version of the code, the whole process is a joke, and even deploying a non-malicious but unapproved version makes it a joke.
Currently only IC has done this
Infinite scalability with linear costs. As blockchain becomes more and more closely integrated with real life, more and more people are involved and the demand is growing. It is unacceptable if the infrastructure cannot support infinite scalability or the expansion is too expensive.
Currently only IC has done this
Based on the above facts and my thinking and analysis, I believe that ICP = Blockchain 3.0.
This article only discusses, from the perspective of the future development of the blockchain industry, why ICP is likely to be the innovation driver of Blockchain 3.0. It is undeniable that there are some problems in the design of ICP's tokenomics, and the ecosystem has not yet exploded; ICP still needs to work hard to reach the ultimate Blockchain 3.0 in my mind. But don't worry: this is a hard problem, and even the Dfinity Foundation has prepared a 20-year roadmap. It has already achieved a great deal in just two years since mainnet launch, and has used cryptography to connect to the BTC and ETH ecosystems. I believe it will be even better in three years.

9. Future
IC has completed the bottom-up Infra construction, and the top-down application has also begun to take shape. My recent direct impression is that IC has more and more cards to play, and is fully prepared for the next bull market.
IC is a paradigm update, not just a simple technology upgrade: a shift from stand-alone computing to distributed computing, and from stand-alone systems to distributed systems. The concept of decentralized cloud computing also lets many small companies enjoy a one-stop development experience from day one.
According to Mr. Yu Jun's product value formula: product value = (new experience - old experience) - migration cost. In the future, as long as people find that the experiential benefit of joining the IC ecosystem outweighs the migration cost, more people, project parties and users alike, will join IC, and the scale effect of "cloud computing" will show itself more easily. That solves the chicken-and-egg problem, and IC's positive flywheel will be established.
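Yu Jun's formula is simple arithmetic, but making it executable shows the adoption threshold clearly. The numeric scores below are made-up illustrations, not measurements.

```python
# Yu Jun's product value formula: value = (new exp - old exp) - migration cost.
# The scores are hypothetical, on an arbitrary 0-10 scale, purely to
# show when the formula predicts a switch to a new platform.
def product_value(new_exp: float, old_exp: float, migration_cost: float) -> float:
    return (new_exp - old_exp) - migration_cost

# A developer weighing a one-stop IC stack against an existing stack:
assert product_value(9, 6, 2) == 1    # positive: worth switching
assert product_value(7, 6, 3) == -2   # negative: users stay put
```

The flywheel claim in the text corresponds to migration cost falling (better tooling) while new experience rises (more ecosystem apps), pushing the value positive for more and more users.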
Of course, everyone's definition of experience is subjective, so there will always be people who choose to join first, while others choose to join later. Those who join first take greater risks, but they will usually get greater benefits on average.
