# Deploying to the Edge

Cloudflare Workers, Vercel Edge, Deno Deploy

Nitro presets compile your Nuxt server for edge runtimes: global low latency and no cold starts, but a smaller API surface than Node.
## What you'll learn

- Pick the right edge preset for your host
- Account for runtime limitations like no `fs` and limited Node APIs
- Use edge-compatible database drivers
Edge runtimes run your server on hundreds of POPs worldwide. Requests hit the nearest one, so the same code can deliver a ~30 ms TTFB in Tokyo, Sydney, and São Paulo alike. The trade-off: you're not in Node anymore.
## Pick a Preset

Set `nitro.preset` to match your host:
```ts
export default defineNuxtConfig({
  nitro: {
    preset: 'cloudflare-pages',
    // 'cloudflare-module' for Workers with a wrangler.toml
    // 'vercel-edge'       for Vercel Edge Functions
    // 'deno-deploy'       for Deno Deploy
    // 'netlify-edge'      for Netlify Edge Functions
  },
})
```

Build and deploy:

```shell
nuxi build

# Cloudflare Pages
npx wrangler pages deploy dist
```

## What's Missing
Edge runtimes ship a subset of Node: essentially the Web Standards API.

Available: `fetch`, `Request`, `Response`, `URL`, `crypto.subtle`, `TextEncoder`, `ReadableStream`. Missing: `fs`, `path`, `child_process`, `net`, and most npm packages that wrap them.
The build will fail if any imported package uses a missing API. Move that work to a separate Node service or skip the package.
## Edge-Friendly Databases

The standard Postgres driver, `pg`, opens a TCP socket, which isn't allowed at the edge. Use HTTP-based drivers instead:
- Neon: serverless Postgres with an HTTP driver
- Turso / libSQL: SQLite over HTTP
- Cloudflare D1: built-in SQLite, native to Workers
- Drizzle ORM: has HTTP adapters for all three
```ts
import { neon } from '@neondatabase/serverless'

const sql = neon(process.env.DATABASE_URL!)
const users = await sql`SELECT * FROM users LIMIT 10`
```

## Env & Secrets
Each host has its own way to bind secrets: `wrangler secret put` for Cloudflare, the Vercel dashboard for Edge Functions. Access them through `useRuntimeConfig()` exactly as you would in Node.
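A minimal sketch of wiring a secret through runtime config (the key name `databaseUrl` and the route path are illustrative, not from the original):

```ts
// nuxt.config.ts: declare the key with an empty default; an env var
// named NUXT_DATABASE_URL overrides it at runtime on any host.
export default defineNuxtConfig({
  runtimeConfig: {
    databaseUrl: '', // server-only; never shipped to the client
  },
})
```

```ts
// server/api/health.get.ts: read the bound secret in a handler.
export default defineEventHandler(() => {
  const { databaseUrl } = useRuntimeConfig()
  return { dbConfigured: databaseUrl.length > 0 }
})
```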
## When to Choose Edge

Pick edge when:

- You need globally low latency on read-heavy routes
- Your code only does `fetch`, JSON, and JWT validation
- Cold starts on Node serverless are hurting you
Stay on Node when:

- You need filesystem access, native modules, or long-running background jobs
- Your DB needs persistent TCP connections without an HTTP proxy
- You hit edge memory or CPU limits (roughly 128 MB memory and 10 ms CPU per request on the Cloudflare Workers free plan)
That’s the production tour — go ship something.