Creating Multi-Stage Builds with NestJS: Reducing Build Time and Size

Multi-stage Docker builds optimize NestJS images, reducing size and build times. They separate the build and production stages, keep only the files the app needs at runtime, and leverage layer caching for faster incremental builds.

Alright, let’s dive into the world of multi-stage builds with NestJS. As a developer who’s been around the block a few times, I can tell you that optimizing your build process is like finding a secret shortcut on your daily commute – it might not seem like much at first, but boy, does it make a difference in the long run!

When I first started working with NestJS, I was blown away by its powerful features and modular architecture. But as my projects grew in complexity, I noticed that my build times were starting to creep up, and my Docker images were getting a bit… well, chunky. That’s when I discovered the magic of multi-stage builds.

Multi-stage builds are like a Marie Kondo approach to your Docker images – they help you keep only what sparks joy (or in this case, what’s absolutely necessary for your app to run). The basic idea is to use multiple stages in your Dockerfile, where each stage builds upon the previous one, but only carries forward what’s essential.

Let’s start with a simple example. Here’s what a basic Dockerfile for a NestJS app might look like:

FROM node:14

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build

CMD ["npm", "run", "start:prod"]

This works, but it’s not very efficient. We’re carrying around all our dev dependencies and source code in the final image. Let’s see how we can improve this with a multi-stage build:

# Build stage
FROM node:14 AS builder

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build

# Drop dev dependencies so only production packages are carried into the final image
RUN npm prune --production

# Production stage
FROM node:14-alpine

WORKDIR /app

COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules

CMD ["node", "dist/main"]

Now we’re cooking with gas! This Dockerfile uses two stages: a build stage and a production stage. The build stage does all the heavy lifting – installing dependencies, copying the source code, and building the app. The production stage then copies only the built assets and production dependencies from the build stage.

The result? A much slimmer final image that contains only what’s needed to run your app in production. It’s like going from a bulky winter coat to a sleek windbreaker – you’re still protected from the elements, but you can move a lot faster!
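
If you want to see the payoff for yourself, build both versions and compare them. I’m assuming here that you keep the single-stage version around as Dockerfile.single; the tags are just examples:

# Build the single-stage version
docker build -f Dockerfile.single -t myapp:fat .

# Build the multi-stage version
docker build -t myapp:slim .

# Compare the SIZE column for the two tags
docker images myapp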

But wait, there’s more! We can take this a step further with the NestJS CLI itself. The nest build command supports webpack bundling, and if your webpack configuration bundles your dependencies into the output (rather than leaving them external), you end up with a self-contained build that can run without node_modules. Here’s how we can leverage that:

# Build stage
FROM node:14 AS builder

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build:prod

# Production stage
FROM node:14-alpine

WORKDIR /app

COPY --from=builder /app/dist ./dist

CMD ["node", "dist/main"]

In this version, we’re using npm run build:prod (which you’d set up in your package.json to run nest build --webpack) to produce a bundled application. As long as the webpack config actually bundles your dependencies into the output (and you’re not relying on native addons that have to live in node_modules), the production stage doesn’t even need a node_modules directory!
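
For reference, here’s a minimal sketch of the scripts block this setup assumes (the script names are just a convention, not anything the Nest CLI mandates):

{
  "scripts": {
    "build": "nest build",
    "build:prod": "nest build --webpack",
    "start:prod": "node dist/main"
  }
}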

Now, you might be thinking, “That’s all well and good, but what about my development workflow?” Great question! One of the things I love about multi-stage builds is that they play nicely with development environments too. You can use the same Dockerfile for both production and development by leveraging build arguments:

# Build stage
FROM node:14 AS builder

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build

# Development stage
FROM builder AS development

CMD ["npm", "run", "start:dev"]

# Production stage
FROM node:14-alpine AS production

WORKDIR /app

COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules

CMD ["node", "dist/main"]

With this setup, you can build for development with docker build --target development -t myapp:dev . and for production with docker build --target production -t myapp:prod .. It’s like having your cake and eating it too!
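
To make that concrete, here’s roughly what the round trip looks like (assuming your app listens on NestJS’s default port 3000):

# Development image: runs the watcher on top of the full builder stage
docker build --target development -t myapp:dev .

# Production image: slim Alpine image with just the compiled output
docker build --target production -t myapp:prod .
docker run -p 3000:3000 myapp:prod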

But what about build time, you ask? Multi-stage builds can help there too. By leveraging Docker’s build cache effectively, you can significantly reduce build times for incremental changes. Here’s a pro tip: structure your Dockerfile to copy and install dependencies before copying your source code. This way, if your dependencies haven’t changed, Docker can use the cached layer:

FROM node:14 AS builder

WORKDIR /app

COPY package*.json ./
RUN npm install

# Copy source code after installing dependencies
COPY . .
RUN npm run build

This simple change can shave minutes off your build time when you’re iterating quickly on your code.
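
You can watch the cache do its thing with two back-to-back builds:

# First build: every layer runs, including npm install
docker build -t myapp .

# Touch a file under src/ (but not package.json) and build again:
# the COPY package*.json and npm install layers come straight from cache,
# so only the source copy and npm run build steps re-run
docker build -t myapp .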

Now, let’s talk about some real-world optimizations. In one project I worked on, we were dealing with a monorepo structure where our NestJS app was just one part of a larger ecosystem. We used Lerna to manage our packages, which added an extra layer of complexity to our builds.

To tackle this, we created a custom build script that only built the packages our NestJS app depended on, rather than building the entire monorepo. We then used this script in our Dockerfile:

FROM node:14 AS builder

WORKDIR /app

COPY package*.json ./
COPY lerna.json ./
RUN npm install

COPY . .
RUN npm run build:nest-app

# ... rest of the Dockerfile
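
The build script itself isn’t shown here, but with Lerna it might boil down to something like this in the root package.json (the package name is hypothetical, and on older Lerna versions the dependency filter flag is --include-filtered-dependencies instead):

{
  "scripts": {
    "build:nest-app": "lerna run build --scope @myorg/nest-app --include-dependencies"
  }
}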

This approach, combined with careful use of .dockerignore to exclude unnecessary files, cut our build times by almost 40%!
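
The .dockerignore is worth a quick look too; a reasonable starting point for a setup like this might be:

node_modules
dist
coverage
.git
**/*.spec.ts
*.md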

Another technique I’ve found useful is to use build arguments to control what gets included in the final image. For example, you might want to include some debugging tools in a staging environment but not in production. Here’s how you can do that:

# Production stage (picks up from the builder stage shown earlier)
FROM node:14-alpine

# Declare the build argument inside the stage that uses it: an ARG placed
# before the first FROM is only visible to FROM instructions, not to RUN
ARG NODE_ENV=production

WORKDIR /app

COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules

RUN if [ "$NODE_ENV" = "staging" ] ; then npm install some-debug-tool ; fi

CMD ["node", "dist/main"]

You can then build your staging image with docker build --build-arg NODE_ENV=staging -t myapp:staging ..

One last tip: don’t forget about your npm scripts! They can be a powerful tool in your build optimization arsenal. For example, you might have a script that generates your API documentation. In development, you might want to regenerate this on every build, but for production, you could generate it once and copy it into your image:

FROM node:14 AS builder

WORKDIR /app

COPY package*.json ./
RUN npm install

COPY . .
RUN npm run build
RUN npm run generate-docs

FROM node:14-alpine AS production

WORKDIR /app

COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
COPY --from=builder /app/docs ./docs

CMD ["node", "dist/main"]

In conclusion, multi-stage builds in NestJS are like a Swiss Army knife for your Docker images – they help you create lean, mean, production-ready machines while still maintaining a flexible development environment. By carefully structuring your Dockerfile, leveraging build cache, and using techniques like build arguments and custom npm scripts, you can significantly reduce both your build times and final image sizes.

Remember, optimizing your build process is an ongoing journey. As your app evolves, so too should your build strategy. Keep experimenting, measuring, and refining. And most importantly, have fun with it! After all, there’s something deeply satisfying about watching those build times drop and image sizes shrink. Happy building!

Keywords: NestJS, Docker, multi-stage builds, optimization, containerization, DevOps, Node.js, microservices, CI/CD, performance


