Accept Files Safely — Then Send Them Elsewhere
File Uploads with Multer
Multer parses multipart/form-data. Limit size, validate type, and push to object storage.
What you'll learn
- Accept a file upload
- Validate size and MIME type
- Stream to S3
Browsers send file uploads as multipart/form-data. Express doesn’t
parse that natively — use Multer.
Install
npm install multer
Memory Storage
For files you’ll push elsewhere (S3, image processing):
import multer from "multer";

const upload = multer({
  storage: multer.memoryStorage(),
  limits: { fileSize: 5 * 1024 * 1024 }, // 5MB cap
  fileFilter: (req, file, cb) => {
    if (!file.mimetype.startsWith("image/")) {
      return cb(new Error("only images allowed"));
    }
    cb(null, true);
  },
});
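When the 5MB cap trips or the fileFilter rejects a file, Multer hands an error to Express; without an error-handling middleware the client gets an opaque 500. A minimal sketch, assuming it is registered after the upload routes (uploadErrorHandler is our name, not a Multer API; Multer's own errors carry a code such as "LIMIT_FILE_SIZE"):

```javascript
// Sketch: turn Multer failures into clean JSON error responses.
function uploadErrorHandler(err, req, res, next) {
  if (err && err.code === "LIMIT_FILE_SIZE") {
    // Multer's size-limit error
    return res.status(413).json({ error: { code: "file_too_large" } });
  }
  if (err) {
    // includes the fileFilter's "only images allowed" Error
    return res.status(400).json({ error: { code: "bad_upload", message: err.message } });
  }
  next();
}

// register after the routes: app.use(uploadErrorHandler);
```

Note the four-argument signature: that is how Express recognizes an error-handling middleware.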
app.post("/avatar", upload.single("file"), (req, res) => {
  console.log(req.file);
  // { fieldname, originalname, mimetype, size, buffer }
  res.json({ size: req.file.size });
});
Disk Storage
For larger files or local processing:
const upload = multer({
  dest: "./uploads/",
  limits: { fileSize: 100 * 1024 * 1024 },
});
app.post("/upload", upload.single("file"), (req, res) => {
  // req.file.path is the saved path
  res.json({ path: req.file.path });
});
Multiple Files
// up to 10 files in field "photos"
app.post("/photos", upload.array("photos", 10), (req, res) => {
  console.log(req.files); // array
  res.json({ count: req.files.length });
});
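One caveat: limits.fileSize applies per file, so a ten-file batch can still be ten times the cap. If the combined size matters, check it yourself; a sketch, where totalSize is our helper and the 20MB ceiling is an arbitrary example:

```javascript
// Sketch: sum the sizes Multer reports for a batch of uploads.
function totalSize(files) {
  return (files || []).reduce((sum, f) => sum + f.size, 0);
}

// inside the /photos handler:
// if (totalSize(req.files) > 20 * 1024 * 1024) {
//   return res.status(413).json({ error: { code: "payload_too_large" } });
// }
```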
// mixed fields
const fields = upload.fields([
  { name: "avatar", maxCount: 1 },
  { name: "gallery", maxCount: 5 },
]);

app.post("/profile", fields, (req, res) => {
  console.log(req.files.avatar);  // array (maxCount 1)
  console.log(req.files.gallery); // array (maxCount 5)
  res.json({ ok: true });
});
Push to S3
The right pattern for production:
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";

const s3 = new S3Client({ region: "us-east-1" });

app.post("/avatar", upload.single("file"), async (req, res) => {
  const key = `avatars/${req.user.id}/${Date.now()}-${req.file.originalname}`;
  await s3.send(new PutObjectCommand({
    Bucket: process.env.S3_BUCKET,
    Key: key,
    Body: req.file.buffer,
    ContentType: req.file.mimetype,
  }));
  res.json({ url: `https://cdn.example.com/${key}` });
});
Don’t store user uploads on the Node server — it doesn’t scale horizontally and risks filling the disk.
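One refinement: the route above drops originalname straight into the S3 key, and originalname is client-supplied, so it can contain slashes, "..", or control characters. A sketch of a sanitizer (safeKey is a hypothetical helper; the 80-character cap is an arbitrary choice):

```javascript
// Sketch: reduce a client-supplied filename to a safe slug before it
// becomes part of an object key.
function safeKey(userId, originalname) {
  const base = originalname
    .toLowerCase()
    .replace(/[^a-z0-9.]+/g, "-")    // collapse anything unsafe to "-"
    .replace(/^[-.]+|[-.]+$/g, "")   // trim leading/trailing dots and dashes
    .slice(0, 80);
  return `avatars/${userId}/${Date.now()}-${base || "file"}`;
}
```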
Presigned URLs (Better for Big Files)
For files > 10MB, skip the Node middle-man entirely:
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

app.post("/uploads/url", async (req, res) => {
  const key = `${req.user.id}/${req.body.filename}`;
  const url = await getSignedUrl(s3, new PutObjectCommand({
    Bucket: process.env.S3_BUCKET,
    Key: key,
    ContentType: req.body.contentType,
  }), { expiresIn: 300 });
  res.json({ url, key });
});
The frontend PUTs directly to S3 using url. Node never touches the bytes — it handles thousands of concurrent uploads without breaking a sweat.
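The browser half of that flow can be sketched with fetch, assuming the /uploads/url endpoint above and a File from an input element (uploadDirect is our name; error handling omitted):

```javascript
// Sketch: ask our server for a presigned URL, then PUT the file's
// bytes straight to S3.
async function uploadDirect(file) {
  const res = await fetch("/uploads/url", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ filename: file.name, contentType: file.type }),
  });
  const { url, key } = await res.json();

  // the Content-Type must match what the URL was signed for
  await fetch(url, { method: "PUT", headers: { "Content-Type": file.type }, body: file });
  return key;
}
```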
Validate Real Content
file.mimetype is set by the client — not trustworthy. Sniff actual
content with file-type:
import { fileTypeFromBuffer } from "file-type";

// inside the upload handler, before pushing to storage:
const real = await fileTypeFromBuffer(req.file.buffer);
if (!real || !real.mime.startsWith("image/")) {
  return res.status(400).json({ error: { code: "not_an_image" } });
}
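The same idea without a dependency, as a sketch: check the magic bytes yourself for the handful of formats you accept. file-type covers far more formats, so prefer it in practice; looksLikeImage is our illustrative name.

```javascript
// Sketch: recognize PNG, JPEG, and GIF by their leading magic bytes.
function looksLikeImage(buf) {
  if (!buf || buf.length < 4) return false;
  const png = buf[0] === 0x89 && buf[1] === 0x50 && buf[2] === 0x4e && buf[3] === 0x47;
  const jpeg = buf[0] === 0xff && buf[1] === 0xd8 && buf[2] === 0xff;
  const gif = buf[0] === 0x47 && buf[1] === 0x49 && buf[2] === 0x46; // "GIF"
  return png || jpeg || gif;
}
```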