
Getting Started with Streams

Updated on Jun 12, 2025

Overview

Streams is a powerful data ingestion and streaming solution designed for web3 applications. It enables you to efficiently collect, process, and store both raw and filtered blockchain data in real-time and historically. Whether you're building indexers, analytics platforms, or real-time dashboards, Streams provides the infrastructure to create robust data pipelines with features like batching, backfill, and continuous streaming.

Why Streams?

Streams revolutionizes blockchain data access by implementing a push-based architecture that delivers data directly to your storage or indexing systems. This event-driven model provides several key advantages:


  • Efficient Data Delivery: Get exactly-once delivery of blockchain data in finality order
  • Real-time Processing: Access blockchain data as it happens, perfect for building live dashboards and real-time analytics
  • Historical Data Access: Easily backfill historical data with configurable batch sizes
  • Server-side Filtering: Process and transform data on QuickNode's infrastructure using JavaScript filters before it reaches your system
  • Seamless Integration: Connect directly to your preferred data storage or webhook backend

While traditional JSON-RPC polling methods can be complex to implement and maintain, Streams provides a more elegant solution that handles the complexities of blockchain data access for you.

Streams Features


  • JavaScript-Based Filtering - Define custom filters in JavaScript to process and transform blockchain data on QuickNode's infrastructure before it reaches your destination. This server-side filtering reduces data transfer and processing overhead.
  • Flexible Data Ingestion - Streams supports both raw blockchain data and filtered datasets. Use JavaScript filters to extract exactly the data you need for your indexing or analytics use case, processed on our infrastructure.
  • Historical & Real-time Processing - Build comprehensive data pipelines that handle both historical backfills and real-time streaming, perfect for indexers and analytics platforms.
  • Guaranteed Data Delivery - Streams ensures exactly-once delivery of blocks, receipts, and traces in finality order, maintaining data integrity for your indexing and analytics systems.
  • Efficient Batch Processing - Configure data batches of multiple blocks for optimal historical data ingestion.
  • Real-time Analytics Support - Stream consistent, real-time data to power live dashboards and analytics platforms.
  • Operational Transparency - Monitor your data pipelines through detailed logs and performance metrics, with usage tracking for cost optimization.

Streams Lifecycle

Streams are data pipelines that fetch, process, and deliver blockchain data to your storage or indexing systems. Each stream can be configured to handle specific data types (blocks, receipts, traces, etc.) and can be customized with filters to match your indexing or analytics requirements. The lifecycle of a stream consists of the following stages:


  • Active: The stream continuously ingests and processes data according to your configuration, delivering it to your storage or backend system.
  • Paused: You can temporarily halt data ingestion and processing while maintaining your stream configuration (a programmatic example follows this list).
  • Terminated: Data ingestion has been stopped due to delivery failures.
  • Completed: The stream has successfully processed all data within its defined range (for historical backfills).
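Streams can be moved between these states from the dashboard or programmatically. The snippet below is a minimal sketch of pausing and re-activating a stream; the base URL, endpoint paths, and x-api-key header are assumptions about the Streams REST API and should be confirmed against its reference documentation:

// Hedged sketch: pause or re-activate a stream via the Streams REST API.
// The URL, paths, and header name below are assumptions; verify them in the
// Streams REST API reference before use. Requires Node.js 18+ for global fetch.
const BASE_URL = "https://api.quicknode.com/streams/rest/v1/streams";
const API_KEY = process.env.QUICKNODE_API_KEY;

async function setStreamState(streamId, action /* "pause" | "activate" */) {
  const res = await fetch(`${BASE_URL}/${streamId}/${action}`, {
    method: "POST",
    headers: { "x-api-key": API_KEY }
  });
  if (!res.ok) throw new Error(`Failed to ${action} stream ${streamId}: ${res.status}`);
  return res.json();
}

// Example usage with a placeholder stream ID
setStreamState("your-stream-id", "pause").then(console.log).catch(console.error);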

JavaScript Filtering

Streams allows you to define custom JavaScript filters that process blockchain data on QuickNode's infrastructure before it reaches your destination. This server-side filtering capability offers several advantages:


  • Reduced Data Transfer: Pay for what you need. Only the filtered data that matches your criteria is sent to your destination, reducing data size and storage requirements.
  • Custom Transformations: Transform and enrich the data using JavaScript before it reaches your system, including:
    • Filtering specific transaction types or events
    • Extracting and transforming specific fields
    • Computing derived values
    • Aggregating data points
  • Performance Optimization: Processing data on QuickNode's infrastructure reduces the computational load on your systems.
  • Flexible Filtering: Create complex filtering logic using JavaScript's full feature set, including:
    • Regular expressions for pattern matching
    • Mathematical operations for value calculations
    • Array and object manipulation for data transformation
    • Conditional logic for complex filtering rules

Example Filters

Solana Stake Program Filter

This filter finds every successful transaction in each block that includes at least one instruction sent to the Solana Stake Program — such as staking, unstaking, or moving stake:

function main(stream) {
  // Solana Stake Program address
  const STAKE_PROGRAM = "Stake11111111111111111111111111111111111111";
  const matches = [];

  for (const block of stream.data) {
    for (const tx of block.transactions) {
      // Skip failed transactions and those without metadata
      if (!tx.meta || tx.meta.err) continue;

      // Keep only instructions addressed to the Stake Program
      const stakeInstructions = (tx.transaction.message.instructions || []).filter(
        ix => ix.programId === STAKE_PROGRAM
      );

      if (stakeInstructions.length) {
        matches.push({
          signature: tx.transaction.signatures[0],
          blockTime: block.blockTime,
          instructionCount: stakeInstructions.length,
          programId: STAKE_PROGRAM
        });
      }
    }
  }

  // Return null when nothing matches so no payload is delivered for this batch
  return matches.length ? matches : null;
}

Ethereum ERC-20 Transfer Filter

This filter processes ERC-20 transfer events using the block with receipts dataset, decoding the logs using the ERC-20 ABI:

function main(stream) {
  // Minimal ERC-20 ABI containing only the Transfer event
  const ERC20_ABI = [{
    name: "Transfer",
    type: "event",
    anonymous: false,
    inputs: [
      { name: "from", type: "address", indexed: true },
      { name: "to", type: "address", indexed: true },
      { name: "value", type: "uint256", indexed: false }
    ]
  }];

  // Collect every receipt from every block in the batch
  const receipts = stream.data.flatMap(block => block.receipts || []);

  // Decode receipt logs against the ABI
  const decoded = decodeEVMReceipts(receipts, [ERC20_ABI]);

  // Keep only receipts that contain at least one decoded Transfer event
  const erc20Transfers = decoded.filter(r =>
    r.decodedLogs?.some(log => log.name === "Transfer")
  );

  // Return null when there are no transfers so no payload is delivered for this batch
  return erc20Transfers.length ? { receipts: erc20Transfers } : null;
}

This filter demonstrates how to:

  • Use the block with receipts dataset
  • Define and use an ABI for event decoding
  • Process decoded logs to extract ERC-20 transfer events
  • Return only receipts containing relevant events
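Building on this example, the decoded logs can also be flattened into a compact transfer list instead of returning whole receipts. The sketch below assumes each decoded Transfer log exposes its arguments as from, to, and value properties and its emitting contract as address; verify the exact shape of decodedLogs returned by decodeEVMReceipts for your dataset:

// Variant of the ERC-20 filter that returns only the decoded transfer fields.
// Field names on the decoded log (address, from, to, value) are assumptions;
// check the actual decodedLogs structure before relying on them.
function main(stream) {
  const ERC20_ABI = [{
    name: "Transfer",
    type: "event",
    anonymous: false,
    inputs: [
      { name: "from", type: "address", indexed: true },
      { name: "to", type: "address", indexed: true },
      { name: "value", type: "uint256", indexed: false }
    ]
  }];

  const receipts = stream.data.flatMap(block => block.receipts || []);
  const decoded = decodeEVMReceipts(receipts, [ERC20_ABI]);

  const transfers = decoded.flatMap(r =>
    (r.decodedLogs || [])
      .filter(log => log.name === "Transfer")
      .map(log => ({
        token: log.address, // assumed: emitting contract address on the decoded log
        from: log.from,
        to: log.to,
        value: log.value,
        transactionHash: r.transactionHash
      }))
  );

  return transfers.length ? transfers : null;
}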

Feature Availability

Streams is available to all users with a QuickNode plan. For teams with unique requirements, we offer tailored datasets, dedicated support, and custom integrations. Contact our team for more information. For a full breakdown of Streams features by plan, visit the pricing page.

Solana Streams on Free Plans

Free accounts can only create Solana Streams that follow the tip of the blockchain in real time (i.e., a subscription to new blocks as they are produced). Historical backfills for Solana Streams are only available on paid plans.

This limitation applies only to Solana Streams. All other chains are available on Free plans with full historical backfills.

Access

Access Streams through the QuickNode Developer Portal or programmatically via the Streams REST API.
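For example, existing streams can be listed programmatically. The sketch below assumes the REST base URL https://api.quicknode.com/streams/rest/v1 and an x-api-key header carrying your QuickNode API key; confirm both against the Streams REST API reference:

// Hedged sketch: list your streams via the Streams REST API (Node.js 18+ fetch).
// Base URL and header name are assumptions; check the REST API reference.
async function listStreams() {
  const res = await fetch("https://api.quicknode.com/streams/rest/v1/streams", {
    headers: { "x-api-key": process.env.QUICKNODE_API_KEY }
  });
  if (!res.ok) throw new Error(`Streams API request failed: ${res.status}`);
  return res.json();
}

listStreams().then(streams => console.log(streams)).catch(console.error);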

Supported Chains

Streams can be quickly provisioned for any of the supported chains and networks listed below. For teams with unique requirements, contact our team for more information. Otherwise, our self-serve experience has you covered on the following chains:

Chain | Mainnet | Testnets
Abstract | ✓ | Testnet
Arbitrum | ✓ | Sepolia
Arbitrum Nova | ✓ |
Avalanche C-Chain | ✓ | Fuji
B3 | ✓ | Sepolia
Base | ✓ | Sepolia
Bera | ✓ | Bepolia
Bitcoin | ✓ |
Blast | ✓ | Sepolia
BNB Smart Chain | ✓ | Testnet
Celo | ✓ |
Cyber | ✓ | Sepolia
Ethereum | ✓ | Holesky, Hoodi, Sepolia
Fantom | ✓ |
Fraxtal | ✓ |
Gnosis | ✓ |
Immutable zkEVM | ✓ | Testnet
Ink | ✓ | Sepolia
Kaia | ✓ | Testnet
Linea | ✓ |
Mantle | ✓ | Sepolia
Mode | ✓ |
Monad | Not Available | Testnet
Morph | Coming soon | Holesky
Omni | Coming soon | Omega
Optimism | ✓ | Sepolia
Polygon | ✓ | Amoy
Polygon zkEVM | ✓ |
Race | ✓ | Testnet
Redstone | ✓ |
Scroll | ✓ | Testnet
Sei | ✓ | Testnet
Solana | ✓ | Devnet, Testnet
Story | Coming soon | Testnet
Tron | ✓ |
Unichain | ✓ | Sepolia
Xai | ✓ | Sepolia
zkSync | ✓ | Sepolia
Zora | ✓ |

Data Fetch Limits

Streams data fetch limits are designed to support various data ingestion and indexing workloads across different blockchain networks. These limits ensure optimal performance for your data pipelines while maintaining system stability. Below, you'll find the specific data fetch limits for each chain and network, segmented by QuickNode plans.

Understanding the Limits


  • Messages per Second (mps): This metric represents the maximum throughput of data your streams can process per second, based on your subscription plan. This is crucial for both real-time indexing and historical data backfilling.
  • Chain-Specific Overrides: Some chains may have specific limits that differ from the standard rates due to their unique characteristics or infrastructure requirements. We continuously work to optimize these rates for all supported chains.

The total data ingestion rate is capped per chain, based on your subscription plan. This means the combined throughput, measured in Messages per Second (mps), of all your streams targeting a single blockchain network cannot exceed your plan's rate limit. For example, if your plan allows 50 mps on a given chain, three streams targeting that chain must share those 50 mps between them.

Optimizing Your Data Pipeline


  • Destination Optimization: Ensure your storage or indexing system is properly configured to handle the incoming data rate. This includes:
    • Optimizing database indexes and write patterns
    • Configuring appropriate buffer sizes
    • Setting up efficient data partitioning strategies
    • Important: Streams processes blocks sequentially and will not proceed to the next block or batch until receiving confirmation that the current block or batch was successfully delivered to your destination. This ensures data consistency but means your destination must be able to process and acknowledge receipt of each block or batch within a reasonable timeframe (a minimal destination sketch follows this list).
  • Geographical Considerations: The rate at which you can fetch data may also depend on how close you are to the region where Streams is deployed. Data retrieval times can be optimized by selecting a deployment region that is geographically closer to you or your destination.
  • Performance Monitoring: Regularly check your Streams dashboard to monitor your usage and to adjust your streams as needed.
  • Scaling Your Pipeline: If you find your data needs increasing beyond what your current plan includes, consider upgrading to a higher plan. This will not only increase your fetch limits but also potentially offer additional features and capabilities to support your growing requirements.
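Because delivery is acknowledgement-driven, a webhook destination should respond quickly and defer heavy processing. The sketch below is a minimal Node.js/Express example with a hypothetical /streams-webhook route and port (illustrative choices, not part of the product): it queues the incoming payload and returns 200 immediately so the stream can advance to the next block or batch:

// Hedged sketch of a webhook destination that acknowledges fast and processes later.
// Express, the route path, and the port are illustrative, not Streams requirements.
const express = require("express");
const app = express();
app.use(express.json({ limit: "50mb" })); // batches of blocks can be large

const queue = []; // simple in-memory queue; use a durable queue in production

app.post("/streams-webhook", (req, res) => {
  queue.push(req.body); // defer the real work
  res.sendStatus(200);  // acknowledge immediately so Streams can send the next batch
});

// Drain the queue asynchronously (e.g., write to your database or indexer)
setInterval(() => {
  while (queue.length) {
    const payload = queue.shift();
    // placeholder for your indexing/storage logic, e.g. write payload to a database
  }
}, 1000);

app.listen(3000, () => console.log("Webhook destination listening on :3000"));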

We ❤️ Feedback!

If you have any feedback or questions about this documentation, let us know. We'd love to hear from you!
