# Sitemap & robots.txt

Generate both with code, not files.
`sitemap.ts` and `robots.ts` each export a function whose return value Next serializes into the correct XML or plain-text response.
## What you'll learn
- Add `app/sitemap.ts` returning a list of URLs
- Add `app/robots.ts` with crawl rules and a sitemap link
- Pair both with the Metadata API for full SEO coverage
Static `sitemap.xml` and `robots.txt` files become awkward once your site grows past a few
URLs. Next replaces them with TypeScript files that generate the same output on the server,
including dynamic URLs pulled from your database.
## Sitemap
```ts
// app/sitemap.ts
import type { MetadataRoute } from 'next'
import { db } from '@/lib/db'

const BASE = 'https://example.com'

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts = await db.post.findMany({ select: { slug: true, updatedAt: true } })
  return [
    { url: BASE, lastModified: new Date(), priority: 1 },
    { url: `${BASE}/about`, priority: 0.7 },
    ...posts.map((p) => ({
      url: `${BASE}/posts/${p.slug}`,
      lastModified: p.updatedAt,
      priority: 0.5,
    })),
  ]
}
```

Visit `/sitemap.xml` and Next serves valid XML produced from the array.
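For reference, the generated `/sitemap.xml` looks roughly like this. The URLs and dates come from the returned array; the exact formatting can vary by Next version:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com</loc>
    <lastmod>2024-01-01T00:00:00.000Z</lastmod>
    <priority>1</priority>
  </url>
  <url>
    <loc>https://example.com/about</loc>
    <priority>0.7</priority>
  </url>
</urlset>
```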
## robots.txt
```ts
// app/robots.ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      { userAgent: '*', allow: '/' },
      { userAgent: '*', disallow: '/admin' },
    ],
    sitemap: 'https://example.com/sitemap.xml',
  }
}
```

Reachable at `/robots.txt`. Linking the sitemap from robots.txt is what crawlers
expect.
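The rules object serializes to plain text along these lines (exact spacing and capitalization may differ slightly between Next versions):

```
User-Agent: *
Allow: /

User-Agent: *
Disallow: /admin

Sitemap: https://example.com/sitemap.xml
```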
## SEO Together

These two files plus the Metadata API cover the SEO basics: discoverable URLs, titles and descriptions, and social previews. The rest is content.
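The Metadata API half of that coverage can be as small as a static export in your root layout. A minimal sketch, with placeholder titles and descriptions:

```ts
// app/layout.tsx
import type { Metadata } from 'next'

export const metadata: Metadata = {
  // Base for resolving relative URLs in Open Graph tags, etc.
  metadataBase: new URL('https://example.com'),
  title: { default: 'Example', template: '%s | Example' },
  description: 'Short, human-readable summary for search results.',
  openGraph: {
    title: 'Example',
    description: 'Social preview description.',
    url: '/',
  },
}
```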
Internationalization →