@fastify/multipart

File Uploads & Streamed Parsing

Parse multipart/form-data uploads as streams, with configurable size limits and helpers for accessing files and fields.

What you'll learn
  • Install @fastify/multipart
  • Read files via request.file or request.parts
  • Stream uploads to disk or S3 without buffering

File uploads in browsers use multipart/form-data. @fastify/multipart parses that format as streams so you can pipe a 10GB upload straight to S3 without buffering it in memory.

Install & Register

npm install @fastify/multipart

import multipart from '@fastify/multipart'

await app.register(multipart, {
  limits: {
    fileSize: 10 * 1024 * 1024,
    files: 5,
    fields: 20,
  },
})

fileSize is enforced as the stream flows. Once the limit is hit, the plugin stops consuming the part: with the default throwFileSizeLimit: true it rejects with a RequestFileTooLargeError, and with throwFileSizeLimit: false it truncates the stream and sets file.truncated so you can check it yourself. Either way, oversized data is never buffered in memory.
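When a limit trips, you usually want a 413 rather than a generic 500. A minimal sketch that maps the plugin's documented error codes to status codes; the helper name is ours, and the exact codes are worth verifying against your installed major version:

```javascript
// Hedged sketch: translate @fastify/multipart limit errors into HTTP
// status codes. The codes below match the plugin's documented error
// classes (RequestFileTooLargeError, FilesLimitError, FieldsLimitError).
function multipartErrorStatus(err) {
  switch (err.code) {
    case 'FST_REQ_FILE_TOO_LARGE': // fileSize limit tripped mid-stream
    case 'FST_FILES_LIMIT':        // more files than limits.files
    case 'FST_FIELDS_LIMIT':       // more fields than limits.fields
      return 413
    default:
      return 500
  }
}

// Wire it up once on the instance, e.g.:
// app.setErrorHandler((err, req, reply) => {
//   reply.code(multipartErrorStatus(err)).send({ error: err.message })
// })
```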

A Single File

import { pipeline } from 'node:stream/promises'
import { createWriteStream } from 'node:fs'
import { mkdir } from 'node:fs/promises'
import { join, basename } from 'node:path'

app.post('/upload', async (req, reply) => {
  const data = await req.file()
  if (!data) return reply.code(400).send({ error: 'no file' })

  // data.filename is client-controlled: basename() strips directory
  // components so '../../etc/passwd' can't escape the upload directory
  await mkdir('/tmp/uploads', { recursive: true })
  const dest = join('/tmp/uploads', basename(data.filename))
  await pipeline(data.file, createWriteStream(dest))

  return { ok: true, path: dest }
})
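Client-supplied filenames deserve extra suspicion. A small sketch (the helper name is ours, not part of the plugin) that strips directory components and prefixes a random UUID, so two concurrent uploads of cat.jpg don't overwrite each other:

```javascript
import { basename } from 'node:path'
import { randomUUID } from 'node:crypto'

// '../../etc/passwd' becomes '<uuid>-passwd': no traversal, no collisions
function safeUploadName(filename) {
  return `${randomUUID()}-${basename(filename)}`
}
```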

Multiple Parts (Files + Fields)

app.post('/profile', async (req) => {
  for await (const part of req.parts()) {
    if (part.type === 'file') {
      // consume every file stream, even ones you don't want, or the
      // request stalls; basename() guards against path traversal
      await pipeline(part.file, createWriteStream(join('/tmp', basename(part.filename))))
    } else {
      console.log('field', part.fieldname, '=', part.value)
    }
  }
  return { ok: true }
})
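The loop above generalizes nicely. A sketch of a reusable helper (our naming, not a plugin API) that splits a parts iterable into saved files and a plain fields object; because it only expects an async iterable, you can unit test it with fake parts and no HTTP request at all:

```javascript
// Splits a parts iterable (req.parts(), or any async iterable with the
// same shape) into a fields object and a list of saved-file results.
async function collectParts(parts, saveFile) {
  const fields = {}
  const saved = []
  for await (const part of parts) {
    if (part.type === 'file') {
      saved.push(await saveFile(part)) // caller decides where the bytes go
    } else {
      fields[part.fieldname] = part.value
    }
  }
  return { fields, saved }
}
```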

Always stream straight to your storage backend. Saving to local disk is a useful default during development, but in production the disk is often ephemeral and isn't shared across instances, so it's rarely the right answer.

@fastify/static →