
Node.js Streams

Last Updated : 09 Aug, 2024

Node.js streams are a key part of handling I/O operations efficiently. They provide a way to read or write data continuously, enabling efficient data processing, manipulation, and transfer.


The stream module in Node.js provides an abstraction for working with streaming data. Streams are a powerful feature that allows handling data in chunks rather than loading it all at once, which is efficient for both memory usage and performance.

The official Node.js documentation defines a stream as "an abstract interface for working with streaming data in Node.js." The stream module provides an API for implementing this interface. For example, a request to an HTTP server and process.stdout are both stream instances. In short, streams are objects in Node.js that let you read data from a source or write data to a destination in a continuous manner.

Accessing Streams:

const stream = require('stream');

Note: What makes streams powerful when dealing with large amounts of data is that, instead of reading a file into memory all at once, they read chunks of data and process the content without keeping it all in memory.
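For instance, here is a minimal sketch of reading a file chunk by chunk (the file name largefile.txt is a hypothetical placeholder for illustration):

JavaScript
// A minimal sketch: reading a file chunk by chunk
// instead of loading it into memory all at once.
// 'largefile.txt' is a hypothetical placeholder file.
const fs = require('fs');

const readStream = fs.createReadStream('largefile.txt');

// Each 'data' event delivers one chunk
// (a Buffer, 64 KB by default for file streams)
readStream.on('data', (chunk) => {
    console.log(`Received ${chunk.length} bytes`);
});

// 'end' fires once the whole file has been read
readStream.on('end', () => {
    console.log('Finished reading the file');
});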

Advantages of Streams over other data handling methods:

  • Time Efficient: We don't have to wait until the entire file has been transmitted; we can start processing data as soon as we have it.
  • Memory Efficient: We don't have to load a huge amount of data into memory before we start processing.

Types of Streams in Node.js:

There are four types of streams in Node.js:

  1. Writable: We can write data to these streams, e.g., fs.createWriteStream().
  2. Readable: We can read data from these streams, e.g., fs.createReadStream().
  3. Duplex: Streams that are both Writable and Readable, e.g., net.Socket.
  4. Transform: Streams that can modify or transform the data as it is written and read, e.g., zlib.createDeflate() (a minimal Transform sketch follows this list).
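To illustrate the Transform type, here is a minimal sketch of a custom Transform stream that uppercases whatever passes through it (the name upperCaseTransform is just an illustrative choice):

JavaScript
// A minimal sketch of a Transform stream that
// modifies data as it flows through
const { Transform } = require('stream');

const upperCaseTransform = new Transform({

    // transform() receives each chunk, and the
    // callback pushes the modified result downstream
    transform(chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase());
    }
});

// Whatever is typed on standard input is
// echoed back in upper case
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);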

Explore this Node.js Streams Module Complete Reference for detailed insights, practical examples, and expert guidance on handling data streams efficiently, from reading and writing to transforming and piping, to enhance your Node.js applications.

Implementing a Readable Stream:

We will read the data from inStream and echo it to the standard output using process.stdout.

JavaScript
// Sample JavaScript Code for creating
// a Readable Stream
// Accessing streams
const { Readable } = require('stream');

// Reading the data 
const inStream = new Readable({
    read() { }
});

// Pushing the data to the stream
inStream.push('GeeksForGeeks : ');
inStream.push(
    'A Computer Science portal for Geeks');

// Indicates that no more data is
// left in the stream
inStream.push(null);

// Echoing data to the standard output
inStream.pipe(process.stdout);

Output:

GeeksForGeeks : A Computer Science portal for Geeks 
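Besides pipe(), a Readable stream can also be consumed by listening for its 'data' and 'end' events. A minimal sketch using the same approach as above:

JavaScript
// Consuming a Readable stream with events
// instead of pipe()
const { Readable } = require('stream');

const inStream = new Readable({
    read() { }
});

inStream.push('GeeksForGeeks');
inStream.push(null);

// 'data' fires for every chunk that is read
inStream.on('data', (chunk) => {
    console.log(`Chunk received: ${chunk.toString()}`);
});

// 'end' fires once the stream is exhausted
inStream.on('end', () => {
    console.log('No more data');
});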

Implementing a Writable Stream:

In outStream, we simply console.log the chunk as a string and call the callback function to indicate success without any errors. We then pipe process.stdin into outStream, so whatever is typed on standard input is echoed back.

JavaScript
// Sample JavaScript Code for
// Writable Stream
// Accessing Streams
const { Writable } = require('stream');

// Whatever is passed in standard 
// input is out streamed here.
const outStream = new Writable({

    // The write function takes three arguments:
    // chunk is the Buffer (or string) of data,
    // encoding is the character encoding when
    // chunk is a string (ignored here), and
    // callback is called to signal that the
    // chunk was processed successfully
    write(chunk, encoding, callback) {
        console.log(chunk.toString());
        callback();
    }

});

// Echo the data to the standard output
process.stdin.pipe(outStream);

Output:

Hello Geeks
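The two examples above can also be connected directly. Here is a sketch piping the custom Readable stream into the custom Writable stream, so the pushed data ends up on the console:

JavaScript
// A minimal sketch connecting a Readable
// source to a Writable destination
const { Readable, Writable } = require('stream');

const inStream = new Readable({
    read() { }
});

const outStream = new Writable({
    write(chunk, encoding, callback) {
        console.log(chunk.toString());
        callback();
    }
});

// pipe() moves each chunk from source to destination
inStream.pipe(outStream);

inStream.push('Piped from inStream to outStream');
inStream.push(null);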

Benefits

  • Efficiency: Handles large data sets efficiently by processing data in chunks.
  • Memory Usage: Reduces memory consumption as data is processed incrementally rather than loaded all at once.
  • Performance: Improves performance by allowing continuous data flow and processing.

Summary

Node.js streams are a versatile and powerful tool for managing data flow and I/O operations efficiently. By leveraging different types of streams, events, and methods like pipe(), developers can handle large volumes of data effectively, ensuring optimal performance and memory usage in their applications.
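For longer chains, the stream module also provides pipeline(), which forwards errors from any stage to a single callback and cleans up the streams automatically. A minimal sketch that gzip-compresses standard input to standard output (error messages go to stderr so they don't mix with the compressed bytes):

JavaScript
// A minimal sketch of stream.pipeline() with
// centralized error handling
const { pipeline } = require('stream');
const zlib = require('zlib');

pipeline(
    // Source: whatever arrives on standard input
    process.stdin,

    // Transform: gzip-compress the data
    zlib.createGzip(),

    // Destination: write compressed bytes to stdout
    process.stdout,

    // Called once, with an error from any stage
    // or null on success
    (err) => {
        if (err) {
            console.error('Pipeline failed:', err);
        } else {
            console.error('Pipeline succeeded');
        }
    }
);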

