Streaming Responses

Pipe a File — Or a Big JSON — Without Buffering

Stream large responses chunk-by-chunk. Constant memory, fast TTFB.

3 min read · Level 2/5 · #express #streaming #performance
What you'll learn
  • Stream a file with res.sendFile or pipeline
  • Stream a large JSON array
  • Handle backpressure

res.json(huge) loads the entire payload into memory before sending. For large responses, stream chunks as they’re ready.

A File

Easiest case — res.sendFile:

import path from "node:path";

app.get("/files/:id", (req, res) => {
  const file = path.resolve("./uploads", `${req.params.id}.bin`);
  res.sendFile(file, (err) => {
    // only set a status if nothing has been sent yet
    if (err && !res.headersSent) res.status(404).end();
  });
});

Or pipe a stream directly:

import path from "node:path";
import { createReadStream } from "node:fs";
import { pipeline } from "node:stream/promises";

app.get("/files/:id", async (req, res) => {
  const file = path.resolve("./uploads", `${req.params.id}.bin`);
  res.setHeader("content-type", "application/octet-stream");
  try {
    await pipeline(createReadStream(file), res);
  } catch (err) {
    // if bytes already went out, we can't change the status anymore
    if (!res.headersSent) res.status(500).end();
  }
});

Constant memory regardless of file size.
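
One refinement: if you send Content-Length up front, download clients can show progress. A minimal sketch using fs.stat, assuming the same ./uploads layout as above:

import path from "node:path";
import { stat } from "node:fs/promises";
import { createReadStream } from "node:fs";
import { pipeline } from "node:stream/promises";

app.get("/files/:id", async (req, res) => {
  const file = path.resolve("./uploads", `${req.params.id}.bin`);
  try {
    const { size } = await stat(file);        // throws if the file doesn't exist
    res.setHeader("content-type", "application/octet-stream");
    res.setHeader("content-length", size);    // lets clients show progress
    await pipeline(createReadStream(file), res);
  } catch (err) {
    if (!res.headersSent) res.status(404).end();
  }
});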

Big JSON

For a 1GB query result, res.json(rows) will OOM. Stream as NDJSON (one JSON object per line):

app.get("/api/export", async (req, res) => {
  res.setHeader("content-type", "application/x-ndjson");

  const cursor = db.users.cursor();   // an async iterable from your DB
  for await (const row of cursor) {
    res.write(JSON.stringify(row) + "\n");
  }
  res.end();
});

The client parses one line at a time — never loads everything into memory either.
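
On the client, consuming it looks roughly like this. A browser-side sketch; handleRow is a placeholder for whatever you do with each record:

// inside an async function (or a module with top-level await)
const res = await fetch("/api/export");
const reader = res.body.pipeThrough(new TextDecoderStream()).getReader();

let buffer = "";
while (true) {
  const { value, done } = await reader.read();
  if (done) break;
  buffer += value;

  const lines = buffer.split("\n");
  buffer = lines.pop();                       // keep the trailing partial line
  for (const line of lines) {
    if (line) handleRow(JSON.parse(line));    // handleRow: your per-record callback
  }
}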

CSV

The same pattern works for CSV. Pipe the cursor through a csv-stringify formatter:
import { stringify } from "csv-stringify";
import { pipeline } from "node:stream/promises";

app.get("/api/export.csv", async (req, res) => {
  res.setHeader("content-type", "text/csv");
  res.setHeader("content-disposition", 'attachment; filename="export.csv"');

  const cursor = db.users.cursor();
  const formatter = stringify({ header: true, columns: ["id", "email", "name"] });

  try {
    await pipeline(cursor, formatter, res);
  } catch (err) {
    if (!res.headersSent) res.status(500).end();
  }
});

Handle Backpressure

If the client reads slowly, unsent data piles up in memory. When you write manually, res.write() returns false once the internal buffer is full; stop writing and wait for the 'drain' event before continuing. pipeline() handles all of this for you.

Use pipeline whenever possible. It also handles errors and cleanup correctly.
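
Applied to the NDJSON export above, that means handing pipeline an async generator instead of calling res.write() in a loop. A sketch, with db.users.cursor() as the same placeholder as before:

import { pipeline } from "node:stream/promises";

app.get("/api/export", async (req, res) => {
  res.setHeader("content-type", "application/x-ndjson");

  // async generators are valid pipeline sources
  async function* toNdjson(rows) {
    for await (const row of rows) {
      yield JSON.stringify(row) + "\n";
    }
  }

  try {
    // pipeline pauses the generator whenever the socket's buffer is full
    await pipeline(toNdjson(db.users.cursor()), res);
  } catch (err) {
    if (!res.headersSent) res.status(500).end();
  }
});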

Client Disconnects

If the client disconnects mid-stream, your producer keeps running and writes into a dead socket. Listen for the request's 'close' event and pass an AbortSignal to your DB cursor:

app.get("/api/export", async (req, res) => {
  const ctrl = new AbortController();
  req.on("close", () => ctrl.abort());

  const cursor = db.users.cursor({ signal: ctrl.signal });
  try {
    for await (const row of cursor) {
      if (!res.write(JSON.stringify(row) + "\n")) {
        await new Promise((r) => res.once("drain", r));   // backpressure: wait for the socket
      }
    }
    res.end();
  } catch (err) {
    // an abort here just means the client went away
  }
});

A small touch that prevents zombie work.
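
If you are already on pipeline, the same idea is one option away: pipeline accepts an AbortSignal and tears the whole chain down when it fires. A sketch reusing the CSV export:

import { stringify } from "csv-stringify";
import { pipeline } from "node:stream/promises";

app.get("/api/export.csv", async (req, res) => {
  res.setHeader("content-type", "text/csv");

  const ctrl = new AbortController();
  req.on("close", () => ctrl.abort());

  const formatter = stringify({ header: true, columns: ["id", "email", "name"] });

  try {
    // the signal aborts the cursor, the formatter, and the response together
    await pipeline(db.users.cursor(), formatter, res, { signal: ctrl.signal });
  } catch (err) {
    // an AbortError here means the client disconnected mid-download
  }
});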

Server-Sent Events →