File Uploads

Accept Uploads — and Store Them Somewhere Sensible

Use multer for multipart parsing, then store to S3 (or similar) — not on local disk.

What you'll learn
  • Accept a file upload via Express
  • Stream to S3-compatible storage
  • Validate size and mime-type

Browsers POST files as multipart/form-data. Node’s stdlib doesn’t parse it — you need middleware.
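To see what that middleware is actually parsing, here's a hand-built multipart body. This is purely illustrative — the boundary string is an arbitrary client-chosen marker, and real bodies carry raw binary bytes, not a placeholder string:

```javascript
// Hand-build the multipart/form-data body a browser would send for
// <input type="file" name="file">. The boundary separates the parts.
const boundary = "----NodeUploadDemo";

const body = [
  `--${boundary}`,
  'Content-Disposition: form-data; name="file"; filename="avatar.png"',
  "Content-Type: image/png",
  "", // blank line separates part headers from part body
  "<binary image bytes would go here>",
  `--${boundary}--`, // trailing -- marks the final boundary
  "",
].join("\r\n");

// The request header tells the parser which boundary to look for:
const contentType = `multipart/form-data; boundary=${boundary}`;
```

Each part carries its own headers (field name, filename, content type), which is where multer gets `originalname` and `mimetype` from.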

Multer

npm install multer

import multer from "multer";
import express from "express";

const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 5 * 1024 * 1024 },   // 5MB cap
  fileFilter: (req, file, cb) => {
    if (!file.mimetype.startsWith("image/")) {
      return cb(new Error("Only images allowed"));
    }
    cb(null, true);
  },
});

const app = express();

app.post("/avatar", upload.single("file"), (req, res) => {
  console.log(req.file);
  // { fieldname, originalname, mimetype, size, buffer }
  res.json({ size: req.file.size, mime: req.file.mimetype });
});

Memory vs Disk Storage

Storage                   When
multer.memoryStorage()    Small files; you'll upload to S3/cloud immediately
multer.diskStorage()      Local processing, large files

Memory storage is faster but limits you to whatever fits in RAM.
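A quick back-of-envelope check makes the RAM cost concrete (the concurrency figure here is a made-up example):

```javascript
// Worst case, every in-flight upload is fully buffered, so peak RAM
// spent on buffers is roughly (concurrent uploads) × (fileSize cap).
const fileSizeCap  = 5 * 1024 * 1024; // the 5 MB limit from the config above
const concurrent   = 200;             // hypothetical peak simultaneous uploads
const worstCaseMiB = (concurrent * fileSizeCap) / (1024 * 1024);

console.log(`${worstCaseMiB} MiB of upload buffers at peak`); // 1000 MiB
```

If that number doesn't fit comfortably in your instance's memory, switch to disk storage or presigned URLs (below).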

Always Validate

  • Size: enforce limits or you’ll OOM (multer’s limits.fileSize)
  • MIME: don’t trust file.mimetype blindly — it comes from the client. For security-sensitive flows, sniff the actual bytes with a lib like file-type.
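The file-type package does this properly across many formats; to show the idea, here's a minimal hand-rolled sniff for two common image types (signature bytes per the PNG and JPEG specs):

```javascript
// Determine the real file type from leading "magic bytes" instead of
// trusting the client-supplied mimetype.
function sniffImageType(buffer) {
  const png  = [0x89, 0x50, 0x4e, 0x47]; // \x89PNG
  const jpeg = [0xff, 0xd8, 0xff];       // JPEG SOI marker
  const startsWith = (sig) => sig.every((byte, i) => buffer[i] === byte);
  if (startsWith(png))  return "image/png";
  if (startsWith(jpeg)) return "image/jpeg";
  return null; // unknown — reject in security-sensitive flows
}

console.log(sniffImageType(Buffer.from([0x89, 0x50, 0x4e, 0x47, 0x0d, 0x0a]))); // "image/png"
```

With memory storage you can run this on `req.file.buffer` before uploading anywhere.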

Store Off-Server

Don’t save user uploads to the Node server’s disk. It doesn’t scale (ephemeral containers, multiple instances) and risks filling the disk.

Use object storage:

npm install @aws-sdk/client-s3

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

app.post("/avatar", upload.single("file"), async (req, res) => {
  await s3.send(new PutObjectCommand({
    Bucket: "my-app-uploads",
    Key:    `avatars/${req.user.id}/${Date.now()}.png`,
    Body:   req.file.buffer,
    ContentType: req.file.mimetype,
  }));
  res.json({ ok: true });
});
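One wrinkle: the Key above always ends in .png, even though the filter accepts any image/*. A small helper can derive the extension from the validated MIME type — extensionFor is a hypothetical helper for illustration, not part of multer or the SDK:

```javascript
// Map a validated image MIME type to a file extension for the S3 key.
const EXTENSIONS = {
  "image/png":  "png",
  "image/jpeg": "jpg",
  "image/webp": "webp",
  "image/gif":  "gif",
};

function extensionFor(mimetype) {
  const ext = EXTENSIONS[mimetype];
  if (!ext) throw new Error(`Unsupported type: ${mimetype}`);
  return ext; // e.g. Key: `avatars/${userId}/${Date.now()}.${extensionFor(req.file.mimetype)}`
}

console.log(extensionFor("image/jpeg")); // "jpg"
```

An explicit allowlist also doubles as a second validation layer: anything not in the map is rejected.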

Works with AWS S3, Cloudflare R2, Backblaze B2 — any S3-compatible provider.

Presigned URLs — The Better Pattern

For large files, skip the Node middleman: have the client upload directly to S3 with a presigned URL.

import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

app.post("/upload-url", async (req, res) => {
  // assumes express.json() is mounted so req.body.filename is parsed
  const url = await getSignedUrl(s3, new PutObjectCommand({
    Bucket: "my-app-uploads",
    Key: `${req.user.id}/${req.body.filename}`,
  }), { expiresIn: 300 });   // URL is valid for 5 minutes
  res.json({ url });
});

Client gets url, PUTs the file directly. Node never touches the bytes — your servers stay snappy regardless of file size.
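One caveat: req.body.filename goes into the object key as-is, so a crafted name with path separators could wander outside the user's prefix. A sketch of a sanitizer — safeObjectName is an illustrative helper, not a library function:

```javascript
// Reduce a client-supplied filename to a safe object-key segment:
// drop any directory components, then whitelist the remaining characters.
function safeObjectName(filename) {
  const base  = String(filename).split(/[\\/]/).pop(); // strip path parts
  const clean = base.replace(/[^a-zA-Z0-9._-]/g, "_"); // replace anything unusual
  if (!clean || clean.replace(/\./g, "") === "") {
    throw new Error("Invalid filename"); // empty or dots-only name
  }
  return clean;
}

console.log(safeObjectName("../../etc/passwd")); // "passwd"
console.log(safeObjectName("my photo (1).png")); // "my_photo__1_.png"
```

Run this before building the Key, and consider pinning ContentType in the presigned command as well so the client can't upload an arbitrary type.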
