Node.js Streams

Node.js · 9 min read · Updated: Mar 29, 2026 · Beginner

Streams are one of the most powerful features in Node.js. They allow applications to process data in small chunks instead of loading the entire data into memory at once. This makes streams extremely useful when working with large files, video content, network requests, or real-time data processing.

In simple terms, streams help Node.js handle large amounts of data efficiently. Instead of waiting for the full file or response to be available, Node.js can start processing the data piece by piece. This improves performance and reduces memory usage.

Key Concept: Streams let Node.js process data gradually, which is faster and more memory-efficient than reading everything at once.

What are Streams in Node.js?

A stream is a flow of data. In Node.js, streams are objects that allow you to read data from a source or write data to a destination continuously. They are especially useful when the data is too large to handle in one go.

For example, if you want to read a 2 GB file, loading the entire file into memory using a normal file read operation can be expensive. With streams, Node.js reads that file in smaller chunks and processes it step by step.

Why Use Streams?

  • Better performance: Data is processed as it arrives
  • Lower memory usage: No need to load the full file into memory
  • Efficient for large files: Ideal for logs, videos, backups, and uploads
  • Real-time processing: Useful in live applications and network communication

Types of Streams in Node.js

Node.js mainly provides four types of streams:

  • Readable Stream
  • Writable Stream
  • Duplex Stream
  • Transform Stream

1. Readable Stream

A readable stream is used to read data from a source. Common examples include reading a file, receiving an HTTP request body, or consuming network data.

Example:

```javascript
const fs = require("fs");

const readStream = fs.createReadStream("file.txt", "utf8");

readStream.on("data", (chunk) => {
  console.log("Received chunk:", chunk);
});
```

In this example, the file is read in chunks instead of all at once.

2. Writable Stream

A writable stream is used to write data to a destination. For example, writing to a file, sending data in an HTTP response, or saving logs.

```javascript
const fs = require("fs");

const writeStream = fs.createWriteStream("output.txt");

writeStream.write("Hello ");
writeStream.write("Node.js Streams");
writeStream.end();
```

This example writes data into a file step by step.

3. Duplex Stream

A duplex stream can both read and write data. In other words, it works as both a readable and writable stream at the same time.

A common example is a network socket, where data can be received and sent simultaneously.

4. Transform Stream

A transform stream is a special type of duplex stream where the output is modified based on the input. It reads data, transforms it, and then writes the transformed data.

Common examples include compression, encryption, and data formatting.

```javascript
const zlib = require("zlib");
const fs = require("fs");

const readStream = fs.createReadStream("file.txt");
const writeStream = fs.createWriteStream("file.txt.gz");
const gzip = zlib.createGzip();

readStream.pipe(gzip).pipe(writeStream);
```

In this example, the file content is read, compressed, and written into a new file.

Important Stream Events

Streams in Node.js work with events. Some common events are:

  • data – fired when a chunk of data is available
  • end – fired when reading is complete
  • finish – fired when writing is complete
  • error – fired when an error occurs

```javascript
readStream.on("end", () => {
  console.log("Reading completed");
});

readStream.on("error", (err) => {
  console.error("Error:", err);
});
```

Using pipe() with Streams

One of the most useful methods with streams is pipe(). It connects a readable stream directly to a writable stream, making the code cleaner and more efficient.

```javascript
const fs = require("fs");

const readStream = fs.createReadStream("input.txt");
const writeStream = fs.createWriteStream("output.txt");

readStream.pipe(writeStream);
```

This copies the content of one file into another without manually handling each chunk.

Streams vs readFile()

Beginners often compare streams with methods like fs.readFile(). The difference is important:

| Feature | Streams | readFile() |
| --- | --- | --- |
| Memory usage | Low | High for large files |
| Performance | Better for large data | Fine for small files |
| Processing style | Chunk by chunk | Entire file at once |

Real-World Use Cases of Streams

  • Reading large files such as logs or reports
  • Uploading and downloading files
  • Streaming video or audio content
  • Compressing or decompressing data
  • Handling network communication
  • Building real-time applications

Best Practices

  • Use streams for large files and continuous data
  • Always handle error events properly
  • Use pipe() when possible for cleaner code
  • Choose the correct stream type based on your use case

Common Mistakes

  • Using readFile() for very large files
  • Not handling stream errors
  • Forgetting to close writable streams properly
  • Confusing duplex and transform streams

Conclusion

Streams are a core part of Node.js and are essential for building efficient, scalable applications. They help process large amounts of data in a way that is fast and memory-friendly.

Once you understand readable, writable, duplex, and transform streams, you can handle many real-world backend tasks more effectively. Streams are especially valuable in file handling, APIs, media processing, and data pipelines.

Quick Summary: Node.js streams process data in chunks, making them ideal for handling large files and real-time data efficiently.
