Streams Intro

Move Data Through Pipes — Not Buffers

Streams let you process data as it arrives, in chunks, without loading everything into memory.

What you'll learn
  • Understand the stream concept
  • Read a file as a stream
  • Pipe one stream into another

readFile() loads the whole file into memory. For a 10 MB file that's fine. For a 10 GB file, your process runs out of memory and dies.

Streams process data in chunks as it arrives. Constant memory regardless of input size.

A Read Stream

import { createReadStream } from "node:fs";

const stream = createReadStream("huge-log.txt", { encoding: "utf8" });

for await (const chunk of stream) {
  // chunk is ~64KB by default
  console.log("got", chunk.length, "chars");
}

Readable streams have been async iterable since Node 10, so for await just works.

Piping

The classic stream pattern: data flows from a source to a sink, with optional transforms.

import { createReadStream, createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";
import { createGzip } from "node:zlib";

await pipeline(
  createReadStream("huge.txt"),
  createGzip(),
  createWriteStream("huge.txt.gz")
);
console.log("done");

This compresses a file of any size with constant memory. pipeline handles errors and cleanup correctly: if any stage fails, every stream in the chain is destroyed. The older .pipe() chain does neither, which is why it leaks file descriptors on error.

Why Streams Win

Approach               Memory              Latency                        When
readFile then process  All in RAM          Wait for the full read         Small files
Stream chunk-by-chunk  One chunk (~64 KB)  First chunk arrives instantly  Large files, network

Real-World Streams You’ll Meet

  • req / res in an HTTP server — both are streams
  • File reads/writes
  • Zlib compression
  • TLS sockets
  • child_process stdin/stdout/stderr

If it produces or consumes data over time, it’s probably a stream.

Up Next

The two main kinds of streams: Readable and Writable.

Readable & Writable →