Make sure to catch The Graph at upcoming @EthereumDenver events. These sessions will cover a broad array of interests, ranging from building institutional connections to exploring AI agents and privacy tech.
Whereas conventional indexers handle blocks sequentially, Substreams processes them in parallel. This method unlocks rapid synchronization speeds and real-time data streaming. It also provides infrastructure that scales with chain throughput and enables high-speed reprocessing to support smoother developer iteration.
For those making the trip to @EthereumDenver, The Graph has organized an exclusive VIP Institutional Soirée taking place on Feb 19. Together with our co-hosts @CantonNetwork, @blockaid_, and @HalbornSecurity, we are bringing together the institutional leaders who are defining the future of digital assets. This event provides a space for high-quality networking and is situated only 7 mins from the conference venue. 🍷
Creating specific data extraction logic for every single protocol inevitably leads to a fragmented landscape of incompatible formats, varied schemas, and redundant work for teams. Open standards provide the solution to this inefficiency. By utilizing data structures that are consistent across different chains and protocols, developers can move past the tedious task of infrastructure plumbing and focus their energy on creating products of value. The Graph established Subgraphs as an open standard, a resource that tens of thousands of developers have since adopted. Although establishing standards may not seem glamorous, it is the essential driver of real technological progress.
The Graph has secured a pivotal role in the emerging agent economy, evolving from a standard indexing protocol into a comprehensive agent-native data marketplace. This evolution includes serving as the foundational data layer for ERC-8004, which manages identity and reputation checks across 8 chains. To address transaction efficiency, GraphTally enables x402 micropayments that eliminate gas bottlenecks. The platform also supports natural language agent queries via MCP integration, and the x402 Subgraph Gateway is officially in development.
We are witnessing the agent economy evolve into truly practical infrastructure. It is now possible for AI agents to employ natural language when querying The Graph. This is achieved through an MCP agent, which receives plain English directives from peer agents and translates them into GraphQL queries compatible with The Graph Network. Additionally, development is ongoing for full x402 Subgraph Gateway compatibility, a feature that will eventually permit agents to manage query payments autonomously.
Although infrastructure rarely grabs the spotlight, it serves as the foundation that enables the entire ecosystem. Invisible data systems operating in the background are responsible for every accurately displayed DeFi transaction, wallet balance check, and app query. The Graph has processed trillions of queries because truly effective infrastructure goes unnoticed. We usually do not give it a second thought when it works, but if it fails, everything else breaks.
ERC-8004 acts as a registry for three vital aspects of AI agents: their Identity, which defines who they are; their Reputation, providing a history of their behavior; and Validation, which serves as proof that tasks were executed correctly. Through the publication of ERC-8004 Subgraphs across 8 chains, The Graph creates a consolidated directory for trust across different blockchains. This means an agent on Base can use a Subgraph query to instantly verify the standing of an agent on Arbitrum. Without such infrastructure, agents would be forced to scan through raw blockchain data to authenticate credentials, a process that would bring the network to a complete standstill.
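To make that lookup concrete, here is a minimal sketch in Rust that sends a plain GraphQL query to a Subgraph through The Graph's gateway. The API key, Subgraph ID, agent ID, and the entity/field names (`agent`, `reputationScore`, `validations`) are illustrative placeholders, not the published ERC-8004 schema.

```rust
// Hedged sketch: an agent (or any service) checks another agent's ERC-8004
// identity and reputation via a Subgraph query over GraphQL/HTTP.
// Gateway path pieces and entity/field names below are assumptions.
use serde_json::json;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Placeholders: substitute a real API key and the deployed Subgraph ID.
    let gateway =
        "https://gateway.thegraph.com/api/<API_KEY>/subgraphs/id/<ERC8004_SUBGRAPH_ID>";

    // One query resolves the agent's identity, reputation score, and recent
    // validation proofs, regardless of which chain the caller lives on.
    let body = json!({
        "query": r#"{
            agent(id: "<AGENT_ID>") {
                id
                domain
                reputationScore
                validations(first: 5) { taskId verified }
            }
        }"#
    });

    let response: serde_json::Value = reqwest::blocking::Client::new()
        .post(gateway)
        .json(&body)
        .send()?
        .json()?;

    println!("{response:#}");
    Ok(())
}
```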
As major chains race to deliver quicker block times, institutions adopting blockchain technology are demanding verifiable data. At the same time, AI agents depend on structured information to take action. The common link uniting these demands is data infrastructure, and that is exactly what The Graph provides.
For AI agents to operate successfully onchain, they require two essential elements: a distinct identity and established payment rails. The Graph facilitates this by offering the data layer necessary for both components to function at scale. Specifically, @ethereum ERC-8004 acts as the ID card used for verification and reputation purposes, while @coinbase x402 serves as the wallet designed for instant micropayments.
The velocity of block times is on the rise. As @Solana, @Base, and other L2s move toward sub-second blocks, your blockchain data infrastructure must keep pace, especially since some traders rely on partial blocks. Substreams solves this by processing blocks the moment a validating node makes them available. It streams transformed data instantly, with no backfill delays and no waiting.
Acting as the data layer for the multichain economy, The Graph now indexes 80+ networks. Through this single protocol, builders can gain access to blockchain data across a wide spectrum of platforms. Supported ecosystems range from @ethereum and @solana to @base, @avalanche, @arbitrum, @BNB, @optimism, @polygon, @unichain, and more.
Curious about the mechanics of Substreams? Activity on the blockchain is constant and never pauses. Every second, thousands of transactions occur across networks such as Ethereum, Solana, Arbitrum, BSC, and others. Substreams connects directly to Firehose to guarantee that every single piece of data is captured without exception. Consider this the starting line for everything 🚦
To operate effectively, the biggest multichain DeFi protocols require data infrastructure that matches their scale. Substreams powers the data pipelines within systems that manage billions in volume across dozens of chains. It is built for high-pressure environments where dependability is non-negotiable.
A data layer acts as the underlying infrastructure that converts raw blockchain states into organized, searchable information. In the absence of this technology, every application is forced to replicate the entire indexing process from the beginning. However, by utilizing this layer, builders are free to dedicate their efforts to the core solutions they are creating.
Substreams modules are written in Rust and compiled to WASM. Why Rust? It delivers performance and safety, and it aligns with a growing ecosystem of blockchain developers already familiar with the language. This ensures your data transformations run at native speed.
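As a rough illustration, a map module might look like the sketch below. The `Transfer`/`Transfers` output types are hypothetical protobuf messages assumed to be generated from the package's .proto definitions; only the handler shape and the `substreams` / `substreams_ethereum` crates reflect the real toolkit.

```rust
// Minimal sketch of a Substreams map module. `Transfer` / `Transfers` are
// hypothetical generated protobuf types assumed for this example.
use substreams::errors::Error;
use substreams_ethereum::pb::eth::v2 as eth;

use crate::pb::example::{Transfer, Transfers};

#[substreams::handlers::map]
fn map_transfers(block: eth::Block) -> Result<Transfers, Error> {
    // Walk every successful transaction in the block and emit a flat list of
    // native-value transfers. The module compiles to WASM, so the same code
    // can be fanned out in parallel over historical block ranges.
    let transfers = block
        .transaction_traces
        .iter()
        .filter(|tx| tx.status == 1) // 1 = transaction succeeded
        .map(|tx| Transfer {
            hash: hex::encode(&tx.hash),
            from: hex::encode(&tx.from),
            to: hex::encode(&tx.to),
            value: tx
                .value
                .as_ref()
                .map(|v| hex::encode(&v.bytes))
                .unwrap_or_default(),
        })
        .collect();

    Ok(Transfers { transfers })
}
```

From there, the module is wired up in the package manifest and streamed with the `substreams` CLI.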
Horizon doesn't alter your current usage of Subgraphs, but it does enable The Graph to host additional data services alongside them—giving you more options as your data needs evolve.
The Graph dedicated 2+ years to building Horizon.
We did this not because it was easy, but because a multi-service infrastructure is essential for the ecosystem as blockchain data requirements get more complex.
This tool provides static analysis for Subgraphs. It helps catch entity overwrites, null pointer issues, and other runtime bugs that might otherwise slip past the compiler.
Discover hundreds of ready-to-use Substreams packages available right now in the Substreams Registry.
Simply find the one that fits your use case, configure your sink, and start streaming—with absolutely no custom development required. Additionally, the modular architecture of Substreams enables you to use a Substreams package as an input for another, allowing you to effectively build on the work of other developers in the community.
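For example, a downstream module can declare an upstream package's output as its input and refine it, as in this hedged sketch. The `Transfers` type is hypothetical, and the actual package-to-package wiring lives in the substreams.yaml manifest rather than in the Rust code itself.

```rust
// Hedged sketch of module composition: this map module consumes the output of
// an upstream community package's module (hypothetical `Transfers` type)
// instead of a raw block. The module-to-module wiring is declared in the
// package's substreams.yaml manifest; only the handler is shown here.
use substreams::errors::Error;

use crate::pb::example::{Transfer, Transfers};

#[substreams::handlers::map]
fn map_whale_transfers(upstream: Transfers) -> Result<Transfers, Error> {
    // Keep only transfers above an illustrative threshold, reusing the
    // upstream package's extraction work instead of re-indexing the chain.
    const THRESHOLD_WEI: u128 = 1_000_000_000_000_000_000; // 1 ETH, as an example

    let transfers: Vec<Transfer> = upstream
        .transfers
        .into_iter()
        .filter(|t| u128::from_str_radix(&t.value, 16).unwrap_or(0) >= THRESHOLD_WEI)
        .collect();

    Ok(Transfers { transfers })
}
```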