Background Jobs With BullMQ

Off-Load Slow Work to a Worker

BullMQ on top of Redis is the canonical Node queue. Wrap the producer Queue and the consumer Worker in plugins so they share a lifecycle with the app.

What you'll learn
  • Install bullmq
  • Create a Queue plugin for producers
  • Run a Worker process for consumers

A fast HTTP API should not send emails or generate PDFs inside the request lifecycle. Move slow or unreliable work to a queue and let a worker process pick it up.

Install

npm install bullmq ioredis
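The producer plugin in the next section reads the Redis URL from `app.config`, which assumes a config plugin such as `@fastify/env` is already registered — the article does not show that setup. A minimal sketch without one is to read the environment directly:

```typescript
// The producer plugin reads app.config.REDIS_URL; that assumes a config
// plugin such as @fastify/env is registered (not shown in this article).
// The simplest alternative is to read the environment directly:
const REDIS_URL = process.env.REDIS_URL ?? 'redis://localhost:6379'

// Pass it wherever a BullMQ connection is built, e.g. { url: REDIS_URL }.
```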

Producer Plugin

Decorate the Fastify app with a Queue so handlers can enqueue jobs.

import fp from 'fastify-plugin'
import { Queue } from 'bullmq'

declare module 'fastify' {
  interface FastifyInstance {
    emailQueue: Queue<{ to: string; subject: string }>
  }
}

export default fp(async (app) => {
  // BullMQ accepts ioredis-style connection options, including a url.
  const connection = { url: app.config.REDIS_URL }
  const queue = new Queue('emails', { connection })
  app.decorate('emailQueue', queue)
  // Close the queue's Redis connection when the app shuts down.
  app.addHook('onClose', async () => queue.close())
})

Enqueue from a handler:

app.post<{ Body: { email: string } }>('/signup', async (req, reply) => {
  await app.db.user.create({ data: { email: req.body.email } })
  await app.emailQueue.add('welcome', { to: req.body.email, subject: 'Welcome' })
  // 202 Accepted: the email is sent later by the worker.
  return reply.code(202).send()
})
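One refinement worth considering here (not in the handler above): BullMQ ignores an `add()` whose `jobId` matches a job still present in the queue, so a stable id prevents a double-submitted signup from enqueuing two welcome emails. The helper below is hypothetical:

```typescript
// Hypothetical helper: derive a stable job id per address. Because
// BullMQ deduplicates on jobId while a job with that id still exists,
// repeated signups will not enqueue duplicate welcome emails.
function welcomeJobId(email: string): string {
  return `welcome:${email.toLowerCase()}`
}

// Usage in the handler:
// await app.emailQueue.add('welcome', payload, { jobId: welcomeJobId(req.body.email) })
```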

Worker Process

Workers run in their own process so a CPU-heavy job does not stall the web server.

// worker.ts
import { Worker } from 'bullmq'

import { sendEmail } from './email' // your mail-sending function

// BullMQ workers require maxRetriesPerRequest: null so that blocking
// Redis commands are never cut short by ioredis retry limits.
const connection = { url: process.env.REDIS_URL!, maxRetriesPerRequest: null }

// Type the payload so job.data matches what the producer enqueues.
const worker = new Worker<{ to: string; subject: string }>(
  'emails',
  async (job) => {
    await sendEmail(job.data.to, job.data.subject)
  },
  { connection, concurrency: 10 }, // up to 10 jobs in parallel per worker
)

worker.on('failed', (job, err) => {
  console.error(`job ${job?.id} failed:`, err)
})

Deploy the worker as a separate service (or replica set) that shares the same Redis instance as the web app. For resilience, set BullMQ’s retry and backoff options on the job: attempts: 5, backoff: { type: 'exponential', delay: 1000 }.
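Those retry options belong to the job, either per `add()` call or as queue-wide defaults via `defaultJobOptions`. A sketch of the defaults object, using the values above (tune them for your workload):

```typescript
// Queue-wide defaults: every job gets up to 5 attempts, with
// exponential backoff starting at 1s between retries.
const defaultJobOptions = {
  attempts: 5,
  backoff: { type: 'exponential', delay: 1000 },
}

// Applied at queue construction time:
// const queue = new Queue('emails', { connection, defaultJobOptions })
```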
