Streaming Responses
Pipe a Node Stream Into reply.send
For large or generated payloads, return a readable stream from your handler. Fastify pipes it to the client without buffering the whole response.
What you'll learn
- Return a readable stream from a handler
- Set the right content type before sending
- Handle backpressure with pipeline
Buffering a 100 MB CSV into memory before responding is a recipe for OOM kills under load. Fastify lets you stream the response so the bytes flow straight from disk (or a generator) to the socket.
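To see why streaming keeps memory flat, here is a stdlib-only sketch (no Fastify involved; the chunk count and size are made up for illustration). The generator runs lazily, so each chunk exists only while the consumer is pulling it — the full payload is never held in memory at once:

```javascript
import { Readable } from 'node:stream'

let produced = 0

// Lazy source: a chunk is created only when the consumer asks for it.
function* chunks() {
  for (let i = 0; i < 1000; i++) {
    produced++
    yield 'x'.repeat(1024) // 1 KiB per chunk, ~1 MiB total
  }
}

const stream = Readable.from(chunks())

let received = 0
for await (const chunk of stream) {
  received += chunk.length
}
```

The same pull-based flow is what happens between your handler's stream and the client socket.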
Stream a File
```js
import Fastify from 'fastify'
import { createReadStream } from 'node:fs'

const app = Fastify()

app.get('/exports/sales.csv', async (req, reply) => {
  reply.type('text/csv').header('Content-Disposition', 'attachment; filename="sales.csv"')
  return createReadStream('./reports/sales.csv')
})
```

Return the stream — Fastify recognises it and pipes it to the client. Backpressure is handled internally: if the client reads slowly, Node pauses the source.
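The backpressure claim can be checked in isolation with stdlib streams (no Fastify; the slow sink stands in for a slow client). The writable's tiny buffer forces the readable to pause between writes, yet every byte still arrives:

```javascript
import { Readable, Writable } from 'node:stream'
import { pipeline } from 'node:stream/promises'
import { setTimeout as sleep } from 'node:timers/promises'

let written = 0

// A deliberately slow sink: backpressure makes Node pause the readable
// between writes instead of buffering everything in memory.
const slowSink = new Writable({
  highWaterMark: 1, // tiny buffer so backpressure kicks in immediately
  async write(chunk, enc, cb) {
    await sleep(1)
    written += chunk.length
    cb()
  },
})

const source = Readable.from(['a'.repeat(10), 'b'.repeat(10), 'c'.repeat(10)])
await pipeline(source, slowSink)
```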
Generate a Stream On The Fly
You can stream any async iterable by wrapping it with Readable.from.
```js
import { Readable } from 'node:stream'

async function* rows() {
  for await (const row of app.db.bigQuery()) {
    yield JSON.stringify(row) + '\n'
  }
}

app.get('/ndjson', async (req, reply) => {
  reply.type('application/x-ndjson')
  return Readable.from(rows())
})
```

Newline-delimited JSON is a friendly format for clients to consume incrementally.
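On the consuming side, NDJSON parses one line at a time as bytes arrive — no need to wait for the whole body. A minimal client-side sketch, using a stdlib stream as a stand-in for the HTTP response body:

```javascript
import { Readable } from 'node:stream'
import { createInterface } from 'node:readline'

// Stand-in for an HTTP response body streaming NDJSON.
const body = Readable.from('{"id":1}\n{"id":2}\n{"id":3}\n')

const ids = []
for await (const line of createInterface({ input: body, crlfDelay: Infinity })) {
  if (line) ids.push(JSON.parse(line).id) // each line is a complete JSON value
}
```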
Composing Streams Safely
Use pipeline from node:stream/promises when chaining transforms — it propagates errors and aborts upstream if anything fails.
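The error-propagation behaviour is easy to demonstrate with stdlib streams alone (the failing transform is contrived for illustration): when one stage errors, pipeline rejects and tears down the other stages, so the source does not keep producing into a dead pipe.

```javascript
import { Readable, Transform, Writable } from 'node:stream'
import { pipeline } from 'node:stream/promises'

const source = Readable.from(['one', 'two', 'three'])

// A stage that always fails, simulating a mid-stream error.
const failing = new Transform({
  transform(chunk, enc, cb) {
    cb(new Error('boom'))
  },
})

const sink = new Writable({ write(chunk, enc, cb) { cb() } })

let caught = null
try {
  await pipeline(source, failing, sink)
} catch (err) {
  caught = err // pipeline rejects with the stage's error
}
// pipeline has also destroyed the upstream readable
```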
```js
import { createReadStream } from 'node:fs'
import { pipeline } from 'node:stream/promises'
import { createGzip } from 'node:zlib'

app.get('/big.json.gz', async (req, reply) => {
  // Headers set via reply.* are not flushed once we write to reply.raw,
  // so hijack the reply and set them on the raw response instead.
  reply.hijack()
  reply.raw.writeHead(200, {
    'content-type': 'application/json',
    'content-encoding': 'gzip',
  })
  await pipeline(createReadStream('./data/big.json'), createGzip(), reply.raw)
})
```

Note the use of reply.raw here — when you take full control of the response stream, call reply.hijack() so Fastify stops managing the reply, then write directly to the underlying Node response, including its headers.
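reply.raw is just a Node Writable (an http.ServerResponse), so the same pipeline works against any sink. A stdlib-only sketch verifying the gzip round-trip, with an in-memory writable standing in for reply.raw:

```javascript
import { Readable, Writable } from 'node:stream'
import { pipeline } from 'node:stream/promises'
import { createGzip, gunzipSync } from 'node:zlib'

// Collects compressed chunks the way a socket would receive them.
const chunks = []
const sink = new Writable({
  write(chunk, enc, cb) {
    chunks.push(chunk)
    cb()
  },
})

await pipeline(Readable.from('{"hello":"world"}'), createGzip(), sink)

// Decompressing the collected bytes recovers the original payload.
const decoded = gunzipSync(Buffer.concat(chunks)).toString()
```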