Move Data Through Pipes — Not Buffers
Streams Intro
Streams let you process data as it arrives, in chunks, without loading everything into memory.
What you'll learn
- Understand the stream concept
- Read a file as a stream
- Pipe one stream into another
readFile() loads the whole file into memory. For a 10 MB file, no problem. For a 10 GB file, your process runs out of memory and dies.
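For contrast, here is a minimal sketch of that buffered approach (the file name is just a placeholder):

```js
import { readFile } from "node:fs/promises";

// Buffered approach: the entire file must fit in memory before we can touch it.
const contents = await readFile("huge-log.txt", "utf8");
console.log("loaded", contents.length, "chars in one shot");
```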
Streams process data in chunks as it arrives. Constant memory regardless of input size.
A Read Stream
```js
import { createReadStream } from "node:fs";

const stream = createReadStream("huge-log.txt", { encoding: "utf8" });

for await (const chunk of stream) {
  // chunk is ~64KB by default
  console.log("got", chunk.length, "chars");
}
```

Readable streams in Node 22 are async iterables, so for await just works.
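A quick usage sketch, assuming a plain-text log: count lines one chunk at a time, so memory stays flat no matter how big the file is.

```js
import { createReadStream } from "node:fs";

// Count newlines chunk by chunk; only one chunk is ever held in memory.
let lines = 0;
for await (const chunk of createReadStream("huge-log.txt", { encoding: "utf8" })) {
  lines += chunk.split("\n").length - 1;
}
console.log("lines:", lines);
```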
Piping
The classic stream pattern: data flows from a source to a sink, with optional transforms.
```js
import { createReadStream, createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";
import { createGzip } from "node:zlib";

await pipeline(
  createReadStream("huge.txt"),
  createGzip(),
  createWriteStream("huge.txt.gz")
);
console.log("done");
```

This compresses a file of any size with constant memory.
pipeline handles errors and cleanup correctly, unlike the older .pipe() chain, which doesn't propagate errors or destroy the other streams when one of them fails.
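The "optional transforms" mentioned above can be your own code. Here is a minimal sketch of a custom Transform stage dropped into a pipeline; the file names and the uppercasing are just placeholders.

```js
import { createReadStream, createWriteStream } from "node:fs";
import { pipeline } from "node:stream/promises";
import { Transform } from "node:stream";

// A toy transform stage: uppercase each chunk as it flows through.
const upperCase = new Transform({
  transform(chunk, _encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

await pipeline(
  createReadStream("input.txt"),
  upperCase,
  createWriteStream("output.txt")
);
```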
Why Streams Win
| Approach | Memory | Latency | When |
|---|---|---|---|
| readFile then process | Entire file in RAM | Wait for the full read | Small files |
| Stream chunk-by-chunk | One chunk at a time (~64KB) | First chunk arrives almost immediately | Large files, network |
Real-World Streams You’ll Meet
- req/res in an HTTP server (both are streams; see the sketch below)
- File reads/writes
- Zlib compression
- TLS sockets
- child_process stdin/stdout/stderr
If it produces or consumes data over time, it’s probably a stream.
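As a sketch of that first item: an HTTP response is a writable stream, so a file can be piped straight to the client without buffering it. The port and file name below are placeholders.

```js
import { createServer } from "node:http";
import { createReadStream } from "node:fs";
import { pipeline } from "node:stream/promises";

// res is a writable stream: pipe the file to the client chunk by chunk.
createServer(async (req, res) => {
  res.writeHead(200, { "content-type": "application/octet-stream" });
  try {
    await pipeline(createReadStream("big-download.bin"), res);
  } catch {
    // client disconnected or the file couldn't be read; pipeline already cleaned up
  }
}).listen(3000);
```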
Up Next
The two main kinds of streams: Readable and Writable.
Readable & Writable →