A Connection Pool, Reused Across Requests
Database Integration
Connect to a real database (Postgres) from Express. Pool, don't reconnect per request.
What you'll learn
- Configure a pool
- Wire it into Express
- Test the connection on boot
Express isn’t a database tool; it’s an HTTP framework. You bring a database client and wire it in. Postgres with the pg driver is the de facto standard.
The Pool Pattern
A pool keeps a small number of DB connections open and reuses them across requests. Don’t open a new connection per request — each one costs a handshake plus authentication (roughly 10ms) and quickly exhausts the server’s connection slots.
```sh
npm install pg
```

```js
// src/db/client.js
import pg from "pg";
const { Pool } = pg;

export const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  max: 10,
});

// Idle clients can emit errors (e.g. the backend restarts); without a
// handler, the unhandled 'error' event crashes the process.
pool.on("error", (err) => {
  console.error("pg pool error", err);
});
```

Import this from wherever you need DB access.
A Health Check on Boot
Catch misconfigurations at startup, not under traffic:
```js
// src/index.js
import { buildApp } from "./app.js";
import { pool } from "./db/client.js";
import { env } from "./config/env.js";

try {
  await pool.query("SELECT 1");
  console.log("db ok");
} catch (err) {
  console.error("db not reachable", err);
  process.exit(1);
}

const app = buildApp();
app.listen(env.PORT, () => console.log(`up on :${env.PORT}`));
```

Use in a Handler
```js
// src/controllers/users.js
import { pool } from "../db/client.js";

export async function get(req, res) {
  // $1 is a parameterized placeholder: pg sends the value separately,
  // so user input can't be injected into the SQL.
  const { rows } = await pool.query(
    "SELECT id, email, name FROM users WHERE id = $1",
    [req.validParams.id]
  );
  if (!rows.length) {
    return res.status(404).json({ error: { code: "user_not_found" } });
  }
  res.json({ data: rows[0] });
}
```

Transactions
For multi-step writes, get a dedicated client:
```js
export async function transfer(req, res, next) {
  // Assumes a validated body; adapt to your validation layer.
  const { amount, fromId, toId } = req.body;
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    await client.query(
      "UPDATE accounts SET balance = balance - $1 WHERE id = $2",
      [amount, fromId]
    );
    await client.query(
      "UPDATE accounts SET balance = balance + $1 WHERE id = $2",
      [amount, toId]
    );
    await client.query("COMMIT");
    res.json({ ok: true });
  } catch (err) {
    await client.query("ROLLBACK");
    next(err);
  } finally {
    client.release();
  }
}
```

client.release() returns the connection to the pool. Always call it in a finally block; otherwise a thrown query leaks that connection forever.
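The BEGIN/COMMIT/ROLLBACK/release dance is easy to get wrong, so it's worth centralizing. A sketch of such a helper — `withTransaction` is hypothetical, not part of pg:

```js
// Hypothetical helper (not part of pg): runs a callback inside a
// transaction on a dedicated client, guaranteeing ROLLBACK on error
// and release() in every case.
export async function withTransaction(pool, fn) {
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    const result = await fn(client); // the caller's queries run on this client
    await client.query("COMMIT");
    return result;
  } catch (err) {
    await client.query("ROLLBACK");
    throw err; // rethrow so the caller's error handling still fires
  } finally {
    client.release(); // always return the connection to the pool
  }
}
```

A handler then shrinks to `await withTransaction(pool, async (client) => { /* queries */ })`, and any thrown error still reaches `next(err)` through your usual async error handling.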
Pool Size
max: 10 per process is a reasonable default. Don’t go higher without doing the math:
- Postgres has a server-wide connection limit (max_connections, typically 100)
- 4 instances × 50 connections each = 200, double a 100-connection limit
Coordinate pool size across instances. PgBouncer (a connection pooler that sits in front of Postgres) helps at scale.
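That sizing arithmetic fits in a tiny function. A sketch — `poolSize` and its parameter names are hypothetical, not from any library:

```js
// Hypothetical helper: pick a per-process pool size that keeps
// instances * size under the server's connection limit, reserving
// some headroom for admin sessions and migrations.
export function poolSize({ instances, serverLimit, reserve = 10, cap = 10 }) {
  const budget = serverLimit - reserve;           // connections the apps may use
  const perInstance = Math.floor(budget / instances);
  return Math.max(1, Math.min(cap, perInstance)); // never exceed the default cap
}
```

With a 100-connection server and 4 instances this yields the default 10; at 20 instances it drops to 4 per process, which is exactly the cross-instance coordination the bullets above call for.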
Graceful Shutdown
On SIGTERM, close the pool:
```js
process.on("SIGTERM", async () => {
  await pool.end(); // waits for checked-out clients to be returned, then closes
  process.exit(0);
});
```

Don’t leave half-open connections.
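A fuller shutdown stops accepting new requests before draining the pool. A sketch, assuming `server` is the value returned by app.listen(); the `shutdown` helper is hypothetical:

```js
// Hypothetical helper: close the HTTP server first so no new requests
// arrive, then drain the pool once in-flight work has finished.
export function shutdown(server, pool) {
  return new Promise((resolve, reject) => {
    server.close(async (err) => { // fires after existing connections finish
      if (err) return reject(err);
      await pool.end();           // release and close pooled connections
      resolve();
    });
  });
}

// process.on("SIGTERM", async () => { await shutdown(server, pool); process.exit(0); });
```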
Using an ORM →