Binance Square

Daft Punk马到成功版

Verified Creator
All posts reflect personal views only and do not constitute investment advice.
High-Frequency Trader
1.1 Years
334 Following
47.5K+ Followers
39.6K+ Liked
5.9K+ Shared
Posts
PINNED
Brothers, I hear the industry veterans are at it again, using their traffic to spin-rewrite other people's top posts. They brag about collaborating with this big name and that one, but in the end it turns out their real collaborator is Doubao, rewriting other people's drafts. The scores they flaunt were all spun out of someone else's work. Are you Blue Moon (the laundry detergent), that you wash this well? No names here; go guess on the leaderboard yourselves 😂
PINNED
The creative environment on the Square is roughly this: we ordinary creators just supply raw material for the traffic players. I hear you're still trying to write something original? Don't be ridiculous; wash up and go to sleep. Plagiarism is the only way forward on the Square; the originals have already starved to death. A traffic player pulls over ten thousand views in ten minutes, and without that boosting I would never have noticed how outrageous it is. They don't even change the title, haha.
In the past few days I have interacted deeply with Fogo's mainnet, and my most immediate impression is that it has materialized what Solana has long wanted but never fully achieved: the "on-chain Nasdaq." Unlike Cosmos-based applications such as dYdX, @Fogo Official embeds a central limit order book (CLOB) directly at the protocol layer of a general-purpose Layer 1. This means that when I place an order on ValiantTrade, I am calling a system-level matching engine rather than the local state of one particular smart contract. The atomic composability of this design, paired with 40-millisecond blocks, genuinely feels novel to liquidity providers accustomed to the impermanent loss of Uniswap-style AMMs. In high-frequency scenarios, the parallel execution of the Firedancer client shows its full strength, eliminating the anxiety of paying exorbitant priority fees just to get packed into a block, as we've seen on Solana recently.
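To make "system-level matching engine" concrete, here is a minimal TypeScript sketch. LimitOrder, MatchingEngine, and quoteBothSides are names I made up for illustration; they are not Fogo's or ValiantTrade's actual SDK.

```typescript
// Illustrative only: the difference between an AMM call and an order sent
// to a protocol-level CLOB. All names here are hypothetical.

interface LimitOrder {
  market: string;        // e.g. "FOGO/USDC"
  side: "bid" | "ask";
  price: number;         // quote units per base unit
  size: number;          // base units
}

// On an AMM, a trade mutates one contract's pool state, so composability
// stops at that contract's boundary. With the CLOB in the protocol itself,
// the book is chain-level state any program can touch atomically.
interface MatchingEngine {
  place(order: LimitOrder): Promise<string>;   // returns an order id
  cancel(orderId: string): Promise<void>;
}

async function quoteBothSides(engine: MatchingEngine, mid: number) {
  // A market maker can place and cancel inside one 40 ms block window,
  // which is what makes CEX-style quoting plausible on-chain.
  const bid = await engine.place({ market: "FOGO/USDC", side: "bid", price: mid * 0.999, size: 100 });
  const ask = await engine.place({ market: "FOGO/USDC", side: "ask", price: mid * 1.001, size: 100 });
  return [bid, ask];
}
```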
However, behind the technical carnival, the liquidity cold-start problem is laid bare. The protocol layer supports a CLOB, but current depth relies entirely on market makers' inventory. In my tests on several non-mainstream pairs, the bid-ask spread reached as high as 2%, a fatal flaw for a chain that claims to offer a "CEX experience." By contrast, Monad is still drumming up interest with slide decks and funding rounds, while Fogo, despite launching first, faces the awkwardness of "roads without cars." Sui and Aptos also have strong performance, but the high barrier of the Move language hinders ecosystem migration, whereas Fogo's direct SVM compatibility lets Solana developers copy-paste code at zero cost, which makes this "vampire" strategy extremely pragmatic.
Another interesting technical point is Fogo Sessions. Signature-free batches of transactions via account abstraction are practically a necessity for on-chain games and SocialFi. At this stage, though, wallet adaptation is a mess: several times after a session expired, the front end reported no error at all and transactions just hung pending; that kind of engineering roughness is hard to ignore. As for the centralization everyone criticizes, the high hardware threshold does turn the validator set into a "big-player club," but within the impossible triangle Fogo has clearly chosen extreme performance and scalability. For investors chasing high-beta returns, the degree of decentralization is usually a secondary consideration. The current market cap of only about 85 million dollars likewise reflects the market's wait-and-see stance toward this "strong technology, weak ecosystem" public chain.
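For anyone who hasn't touched session keys, here is a minimal sketch of the pattern I understand Fogo Sessions to follow. authorizeSession and signWithSession are hypothetical names, not the real wallet API; the expiry check is exactly the step those buggy front ends skip.

```typescript
// Sketch of the session-key pattern behind "Sessions"-style UX.
// Hypothetical API surface; the real SDK will differ.

interface Session {
  sessionPubkey: string;
  scope: string[];     // e.g. ["valiant.placeOrder", "valiant.cancelOrder"]
  expiresAt: number;   // unix ms
}

// One wallet signature authorizes a scoped, short-lived key...
declare function authorizeSession(scope: string[], ttlMs: number): Promise<Session>;
// ...then many actions are signed locally by that key, with no popups.
declare function signWithSession(s: Session, action: string): Promise<string>;

async function tradeLoop(session: Session) {
  for (let i = 0; i < 100; i++) {
    // The failure mode I hit: the front end not checking expiry before
    // sending, so the tx hangs instead of surfacing "session expired".
    if (Date.now() >= session.expiresAt) {
      throw new Error("session expired: re-authorize before sending");
    }
    await signWithSession(session, "valiant.placeOrder");
  }
}
```

#fogo $FOGO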

The Violent Aesthetics of Speed: I Saw the Endgame of Transactions on the Fogo Chain

Just last week I habitually raised the priority fee to 0.05 SOL on Solana to snipe a memecoin launch, and the transaction still failed. Staring at the red error pop-up, I almost wanted to laugh. We keep bragging about blockchain's high-performance narrative while our bodies honestly endure congestion, slippage, and the onslaught of MEV bots; that sense of dissonance is especially sharp mid-bull-market. This was when I genuinely turned my attention back to Fogo, not for any supposed airdrop, but purely to see how this chain, which claims to bring the CEX experience on-chain, actually performs. After a week of hands-on use my feelings are mixed, combining "finally, you're here" with worries about the centralization behind such extreme performance. But without a doubt, @Fogo Official is rewriting the rules of Layer 1 competition.
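For readers outside the Solana trenches, "raising the priority fee" means prepending ComputeBudget instructions like these. These are real @solana/web3.js calls; the numbers are illustrative only, not a recommendation.

```typescript
// Priority fees on Solana: a per-compute-unit price attached to the tx.
import { ComputeBudgetProgram, Transaction } from "@solana/web3.js";

const tx = new Transaction().add(
  // Price is per compute unit, in micro-lamports. During hot mints people
  // crank this up and can still fail to land, which is the pain described.
  ComputeBudgetProgram.setComputeUnitPrice({ microLamports: 250_000 }),
  ComputeBudgetProgram.setComputeUnitLimit({ units: 200_000 }),
  // ...the actual swap/buy instruction would follow here...
);
```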
Don't be fooled by high TPS anymore; the real battleground of the AI chain lies in the semantic layer.
These past few days, watching one L1 after another tout its so-called technological breakthroughs, I have to admit I'm a bit aesthetically fatigued. The current public-chain race resembles the megapixel wars of early smartphones: everyone frantically competes on TPS and fees, but once you actually run code you find that speed is not the only pain point for AI agents. Recently I tested @Vanarchain's testnet and ran its V23 architecture alongside a pile of older chains that claim an "AI concept," and I found some very interesting differences in the details.
Many so-called AI public chains are essentially just an API shell wrapped around the EVM, like strapping an iPad to a horse-drawn carriage: it looks smart, but it runs the same as before. When I studied Vanar's Neutron layer, however, I was genuinely a bit amazed. It tackles a problem that has tormented me when building DApps: the "amnesia" of on-chain data. On a traditional chain like Solana or Ethereum, the data a smart contract stores is just cold bytes. For AI to understand it, you have to drag it off-chain for cleaning and inference, then write results back on-chain. That round trip not only makes gas costs explode but drives you crazy with latency.
What makes Vanar's approach bold is that it pushes this "understanding ability" down into the base layer. When I deployed a simple trading-intent-recognition script, I found that via its semantic memory layer the contract could actually read and retain context directly. That means future AI agents need not be stateless triggers billed per call, but long-horizon thinkers that remember what happened a second ago. The projects still relying on cross-chain bridges to reach external AI models can't even see its tail lights.
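A rough sketch of the contrast as I understand it. None of these interfaces are Vanar's real API; they only illustrate raw-byte state versus a semantic memory layer.

```typescript
// Status quo on a generic chain: bytes in, bytes out. All "understanding"
// happens off-chain and gets written back, costing gas and latency.
declare function readRawState(account: string): Promise<Uint8Array>;
declare function writeState(account: string, data: Uint8Array): Promise<void>;

// What a protocol-native semantic layer would let an agent/contract do:
// query by meaning and keep context across calls. Hypothetical interface.
interface SemanticMemory {
  remember(key: string, context: string): Promise<void>;
  recall(query: string): Promise<string[]>;  // semantic lookup, not byte offsets
}

declare function classify(msg: string, ctx: string[]): string;

async function handleIntent(mem: SemanticMemory, message: string) {
  // The agent can condition on prior context instead of recomputing it.
  const history = await mem.recall("user trading intent");
  const intent = classify(message, history);
  await mem.remember("user trading intent", intent);
}
```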
Of course, a complaint: the current documentation is still pretty "hardcore" for developers. Many interface descriptions for the Kayon inference layer are buried far too deep, and hunting for them made my scalp tingle. Wallet adaptation across the ecosystem is not silky yet either, with occasional low-level errors such as connection timeouts, probably because node-synchronization issues aren't fully resolved. Then again, infrastructure-level reconstruction is inherently dirty, tiring work.
#vanar $VANRY

Tearing Away the Shroud of AI DePIN: Why I'm Bearish on 99% of AI Public Chains, Yet Saw True Native Genes in Vanar

In the past week I have hardly slept well, not because of market swings, but because the so-called AI-concept projects on the market infuriated me. Twitter is wall-to-wall with the same "AI + Crypto" narrative, as if wiring a large language model's API to a blockchain and issuing a token were enough to claim a decentralized intelligent network. This crude patchwork logic insults developers' intelligence. In a cycle this full of froth, everyone is busy making their story sound good, and few are willing to crouch down and check whether the foundation is solid. Just as I was about to write off this cycle's AI track entirely, @Vanarchain's technical white paper and its recent testnet data made me stop. It was a strange feeling, like rummaging through a scrap heap and suddenly touching a finely machined industrial-grade bearing: not flashy, but that solid heft tells you it's the real thing.
I really can't hold it in anymore: bullish news I have never once seen, bearish news has never been in short supply; periodic reports, critical-hit reports.
Wow, brother, the contrast between Binance and Adasi is just too stark, haha.
币圈阿智
February 2018 in Xinjiang Yili
Eight months later the project team screwed the users over, haha; they probably found this post by searching their own name.
Purely a clickbait title; my arthritis flared up and I thought I had run into another paid shill.
Bingbing, is this the official announcement? What a coincidence that the director's birthday falls on Valentine's Day. Is that photo pose really one you'd strike with a mere business partner? 😭
Digging into performance since @Fogo Official's mainnet launch, I found it is not a simple Solana copy but an aggressive release of the Firedancer client's potential. Block time is compressed to 40 milliseconds, pushing the metric close to physical limits under the trade-offs familiar from the CAP theorem. Against Solana mainnet's 400 milliseconds, Fogo sidesteps the hard floor that the speed of light puts under global consensus by using a multi-zone consensus mechanism. Grouping validators by geography trades away some local purity of decentralization in exchange for an astonishing 1.3-second transaction finality. Placing high-frequency orders on Valiant DEX, this architectural advantage translates into visibly better matching efficiency; the on-chain CLOB (central limit order book) is no longer a theoretical model, and the speed of cancels and placements can leave even CEX-hardened users dazed.
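A back-of-envelope check on why 40 ms blocks force geographic grouping; the only physical input is that light in fiber travels at roughly two-thirds of c, about 200 km per millisecond, and the city pair is just an example.

```typescript
// Why a globally distributed validator set cannot keep up with 40 ms blocks.
const FIBER_KM_PER_MS = 200;   // ~2/3 the speed of light, in fiber
const nyToTokyoKm = 10_800;    // rough great-circle distance

const oneWayMs = nyToTokyoKm / FIBER_KM_PER_MS;  // ~54 ms
const roundTripMs = oneWayMs * 2;                // ~108 ms

// One consensus round trip across the Pacific already blows past the
// 40 ms block time, so each zone's validators must sit close together.
console.log({ oneWayMs, roundTripMs, blockTimeMs: 40 });
```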
Still, the brutal aesthetics of the spec sheet cannot fully mask early ecosystem fragility. Full theoretical SVM compatibility lets Solana projects migrate seamlessly, but under siege from high-performance chains like Monad and Sui, a "faster Solana" narrative struggles to form a lasting moat. What stands out instead is native support for a gasless experience: through account abstraction with a Paymaster, users can interact without holding the native token, which differentiates it in the current Layer 1 field. The explorer's metadata parsing is still rough, though, and tracing complex DeFi protocols lacks transparency, likely a result of the team's backend-over-frontend habits.
From a risk-control perspective, the 210 million dollar FDV is indeed bargain-bin pricing in the current L1 race, but that is largely fair pricing of its ecosystem cold-start risk. The circulating supply of 3.77 billion may look ample, yet the data bloat generated by high-frequency trading will place exponentially growing demands on node hardware, an invisible cost that cannot be ignored. If Fogo wants to break out, it must prove its resilience under extreme market conditions rather than merely serving as a backup when Solana is congested. In 2026, with infrastructure in surplus, building a Layer 1 dedicated to trading is a risky move, but if it can hold this high-frequency-trading territory, it may redefine how public chains are valued.
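Running the numbers quoted across these posts as a sanity check; they are approximate, point-in-time figures, not live data.

```typescript
// Back-of-envelope from the figures mentioned in these posts.
const marketCapUsd = 85_000_000;     // ~$85M market cap (earlier post)
const circulating = 3_770_000_000;   // 3.77B tokens circulating
const fdvUsd = 210_000_000;          // ~$210M FDV

const impliedPrice = marketCapUsd / circulating;         // ~$0.0225
const impliedTotalSupply = fdvUsd / impliedPrice;        // ~9.3B tokens
const pctCirculating = circulating / impliedTotalSupply; // ~40%

// Roughly 60% of supply still locked: the "fair pricing of cold-start
// risk" above also carries a pending unlock overhang.
console.log({ impliedPrice, impliedTotalSupply, pctCirculating });
```

#fogo $FOGO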

In 40 milliseconds of extreme speed, I saw what Solana most wants to become: a deep experience of the Fogo mainnet, and the brutal truth

The first time I bridged USDC onto the mainnet, my fingers were still hovering over the keyboard, ready to habitually refresh the explorer and wait out that anxiety-inducing spinner. "Confirmed" popped up almost instantly. It felt like being used to an old manual pickup and suddenly being strapped into an F1 car; the shove of acceleration was thrilling and a little overwhelming. That kind of experience is rare in today's oversaturated L1 market, where most so-called "high-performance chains" never get past the pretty TPS numbers in their white papers. My first impression of Fogo is that its speed is a visible, tangible physical fact. Carrying both that surprise and my skepticism, I spent a full week working through this ecosystem that claims to be "born for trading," trying to figure out whether it is merely Solana's shadow or an outlier that can break through a crowded race.
Why do today's "AI public chains" seem like they are doing illegal renovations?
Recently I reviewed the code logic of several of the most loudly promoted AI-concept public chains, and honestly it made me cringe. Everyone seems trapped in one big misunderstanding: that as long as TPS is high enough and fees are low enough, AI can run on-chain. That is like bolting a 5G module onto an abacus; it looks advanced but is actually absurd. This is also why, when I dug into @Vanarchain's tech stack, it struck me as a bit "alternative": it is not solving for "fast," it is attacking AI's most fatal problem, "forgetfulness."
For AI agents in today's EVM world, the biggest pain point is the lack of context. Competing chains look more like "old-house renovations," trying to force complex neural-network inference data into a ledger fit only for bookkeeping. The result: every reasoning step an agent takes either burns gas fetching data from the chain or simply dumps the data off-chain, so how can anyone talk about decentralization? Vanar's approach is interesting; its myNeutron design clearly sees this. The contest is not who transfers money faster, but who gives the blockchain a "hippocampus."
Looking around at the competition, most so-called "AI infrastructure" is still stuck at selling API access, never reaching down to underlying data validation. If the base chain cannot natively support semantic understanding, the AI above it will forever be "blind." Vanar's choice to build a semantic memory layer directly into the infrastructure sounds obscure, but it is an environment developers can actually use. One is a notebook that can only store 1s and 0s; the other is a database that understands logic; the two are not on the same dimensional level.
Of course, the market is still small, and the ecosystem is not as lively as chains stuffed with random projects. But from the standpoint of technical aesthetics, this kind of native architecture is far more pleasing than patched-together hybrids. In the coming M2M (machine-to-machine) economy, the winner is not whoever moves funds the most times per second, but whoever lets AI "think" cheaply and reliably. #vanar $VANRY

Don't be fooled by TPS: I saw the real 'skeleton' of Web3 AI in the Vanar codebase

Three in the morning, staring at the fluctuating $VANRY price curve on the screen, the coffee beside me long cold. The street outside was quiet, just a few streetlights struggling to stay on, much like a market caught between late bear and early bull: calm on the surface, undercurrents surging beneath. Over the past week I have read through nearly every white paper from every project on the market calling itself an "AI public chain," and honestly, the more I read, the more I wanted to laugh. This circle is too restless: 99% of projects are playing word games, wiring GPT's API onto a chain and daring to call it AI infrastructure, like slapping a Ferrari sticker on a tractor and telling you it can run in F1. With that almost forensic, nitpicking mindset I reopened @Vanarchain's technical documentation, wanting to see whether the chain recently making waves in the Base ecosystem was also "hanging a sheep's head while selling dog meat." But when I actually sat with its architectural logic, especially after comparing it with today's Ethereum and Solana, I broke out in a cold sweat. We may have been running in the wrong direction all along: what AI needs is not a faster carriage but entirely new track.
Staring at an account balance beaten down beyond recognition, I too have thought about simply blacklisting XPL; a drawdown of nearly 90% is psychological abuse for any holder. But as someone who habitually looks for logic in code, once the emotion receded I instead saw value the market has mispriced. We have shouted "Mass Adoption" in Web3 for too long while the reality stays absurdly fragmented: to send a friend outside the circle some USDT, you first have to teach them to buy ETH or SOL for gas. That inhumane barrier alone keeps 99% of new users out. So when I reassessed @Plasma, the Paymaster mechanism genuinely struck me: you can transfer stablecoins without holding the native token at all. That frictionless "Zero Gas" experience really does echo Alipay or WeChat Pay. Against competing chains still racing on TPS while ignoring the interaction experience, this is the logic a payments track should be built on.
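A minimal sketch of what that zero-gas flow means in code, assuming a standard ethers v6 client. The protocol-level fee sponsorship is the part I am taking on faith from the docs, and every address and key here is a placeholder.

```typescript
import { Contract, JsonRpcProvider, Wallet } from "ethers";

const ERC20_ABI = ["function transfer(address to, uint256 amount) returns (bool)"];

async function sendUsdtWithoutGasToken(
  provider: JsonRpcProvider,
  userKey: string,      // placeholder; never hardcode real keys
  usdtAddress: string,  // placeholder token address
  to: string,
  amount: bigint,
) {
  const wallet = new Wallet(userKey, provider);
  const usdt = new Contract(usdtAddress, ERC20_ABI, wallet);

  // On a vanilla EVM chain this reverts at fee payment if the wallet holds
  // zero native token. The claim above is that a protocol-level paymaster
  // sponsors the fee, so the exact same call simply goes through.
  const tx = await usdt.transfer(to, amount);
  return tx.wait();
}
```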
Money is the smartest thing; it never chases noise, only safety. Digging into the on-chain data, the SyrupUSDT lending pool on Maple Finance holds a steady 1.1 billion dollars of TVL, and institutional money at that scale is an honest vote. Institutions dare park that much here largely because the design periodically anchors its state to the Bitcoin network; an architecture that borrows BTC's security feels far more substantial than PoS chains relying purely on their own consensus. Another easily overlooked point: Rain cards reach 150 million merchants and Oobit plugs into the Visa network. These are concrete payment deployments, not slideware floating in the air.
Of course, I must also be objective about the current problems. The validator network remains highly centralized, with obvious traces of team control; that is a sword of Damocles hanging overhead. The shortage of ecosystem applications is also undeniable: beyond transfers and lending I struggle to find an interesting DApp to touch, and that barrenness limits how efficiently capital can circulate. But in a market where even white papers are written lazily, a project that genuinely attacks the payments pain point and carries this much TVL may deserve a bit more of my patience at such a deeply pessimistic price range. #plasma $XPL

When the Narrative of Parallel EVM Recedes: My Real Experience Running a Full Node on Plasma Chain for a Week and Cold Thoughts on Reth Architecture

Staring at the block height ticking upward on screen, watching the Reth client's throughput logs waterfall down the terminal on the left: this has probably been the most calming moment of my week. Outside, the market is in turmoil; everyone is chasing Monad launch expectations and Berachain liquidity mining, as if skipping those triple-digit-APY games means being abandoned by the industry. I instead spent the week tinkering with this so-called "old concept rebooted" @Plasma chain, partly for the potential node reward, but mostly to figure out whether, when we talk about high-performance L1s in 2026, there is any path besides stacking hardware and modding consensus.
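For the record, the "staring at block height" part was literally a tiny poller against the node's own JSON-RPC. eth_blockNumber is a stock Ethereum method that a Reth-based client serves once HTTP RPC is enabled; the URL and interval below are just my local setup.

```typescript
// Poll a local node's head height and print the delta between polls.
async function watchHead(rpcUrl: string, intervalMs = 1000) {
  let last = 0n;
  setInterval(async () => {
    const res = await fetch(rpcUrl, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ jsonrpc: "2.0", id: 1, method: "eth_blockNumber", params: [] }),
    });
    const { result } = await res.json();
    const height = BigInt(result);          // "0x..." hex string to bigint
    if (height > last) {
      console.log(`head: ${height} (+${height - last})`);
      last = height;
    }
  }, intervalMs);
}

watchHead("http://127.0.0.1:8545");
```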
Don't be fooled by compute leasing anymore; @Vanarchain's native "brain" architecture is genuinely interesting. This round of AI competition keeps heating up, but the more I look the more something feels off. The screen is full of DePIN projects talking about compute leasing, which is just being a decentralized AWS landlord, right? That kind of simple physical stacking is a long way from truly native Web3 intelligence.
Recently I went through Vanar's white paper and GitHub, initially with a critical eye, and found the team's thinking quite distinctive. They are not competing on TPS or compute distribution but on a pain point most chains haven't even framed: the cost of "memory" and "reasoning" for on-chain AI. Those of us on the technical side know that Ethereum, as a state machine, is essentially forgetful. Uploading a model is useless if you want agents to run on-chain: where does the mass of contextual data generated during reasoning live? Arweave is too slow; storing it on-chain costs absurd gas.
Vanar's Neutron architecture made me smile: isn't this just bolting a hippocampus onto the blockchain? Using TensorRT-optimized inference, complex semantic data is compressed into on-chain-readable Seeds, so agents are no longer "fools" recomputing from scratch at every interaction but carry cheap, persistent memory. That leaves projects still bridging out to GPT-4 far behind; the former teaches the chain to think, the latter is at best a long-distance phone call to an AI.
To be honest, the ecosystem experience is still early; touring the DApps feels a bit "desolate," and the UI still has bugs: yesterday a swap just spun for ages. But the closed loop in the underlying logic feels solid. Compared with projects that look flashy in a deck but are essentially selling nodes, Vanar is clearly laying the hardest kind of compute infrastructure. If DeFi really evolves into AI-driven dynamic risk models, or on-chain game NPCs gain self-awareness, the foundation will have to be chains that natively handle high-concurrency inference, not legacy designs that sacrifice performance to accommodate the EVM. For those of us doing research, short-term K-line wiggles are not the point; the point is whether the codebase is stacking blocks or building engines.
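My loose reading of the "Seed" idea in code, emphatically not Vanar's actual interface. The pattern: heavy context becomes a compact embedding plus a cheap on-chain commitment that the agent can later verify and rehydrate from.

```typescript
// Illustrative only: compress context into a commitment ("Seed") instead
// of recomputing or storing raw history on-chain. Hypothetical names.
import { createHash } from "node:crypto";

interface Seed {
  commitment: string;   // what actually goes on-chain
  embedding: number[];  // compact vector kept by the memory layer
}

function makeSeed(context: string, embed: (s: string) => number[]): Seed {
  const embedding = embed(context);  // e.g. a TensorRT-optimized encoder
  const commitment = createHash("sha256")
    .update(JSON.stringify(embedding))
    .digest("hex");
  return { commitment, embedding };
}

// Next interaction: check the stored commitment, reuse the embedding as
// memory; no full recomputation, no dragging raw history back on-chain.
function verifySeed(seed: Seed): boolean {
  const recomputed = createHash("sha256")
    .update(JSON.stringify(seed.embedding))
    .digest("hex");
  return recomputed === seed.commitment;
}
```

#vanar $VANRY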

When AI Runs on the 'Dirt Road' of Blockchain: A Harsh Experimental Note on Vanar, Computational Costs, and So-called 'Nativeness'

Staring at the clock in the corner of the screen: three-thirty in the morning. This schedule has long been normal for those of us grinding in this circle, especially when you try to run genuinely functional on-chain agent logic; the frustration tends to hit harder than the sleepiness. Over the past month I have gone through nearly every project document on the market claiming to be an "AI public chain." It feels like hunting for load-bearing walls in a row of staged show homes: glamorous surfaces, full of narrative and concept, but all bubbles once you dig in. Most projects' so-called AI narrative amounts to bolting an API endpoint onto a traditional EVM chain or spinning up a useless compute-leasing market. That spliced-together architecture is, to my mind, an insult to decentralized intelligence. I won't name names, but it is laughable to watch projects with ten-figure market caps still telling AI stories with two-year-old sidechain logic.