Streaming Data with Async Iterators

Stream Long Responses Without Buffering

Route Handlers can return a streaming Response, which is useful for AI tokens, long-running server work, and real-time updates.

5 min read · Level 3/5 · #nextjs #streaming #edge
What you'll learn
  • Return a `ReadableStream` from a Route Handler
  • Build the stream from an async iterator
  • Consume it with the Streams API on the client

A streaming Route Handler flushes bytes as soon as they are available instead of buffering the whole response. That is exactly the model an LLM token stream needs.

A Basic Stream

// app/api/stream/route.ts
// Opt into the Edge runtime, which flushes chunks as they are enqueued
export const runtime = 'edge'

export async function GET() {
  const encoder = new TextEncoder()
  const stream = new ReadableStream({
    async start(controller) {
      // Enqueue each word as its own chunk, 200 ms apart
      for (const word of ['hello', ' ', 'world']) {
        controller.enqueue(encoder.encode(word))
        await new Promise((r) => setTimeout(r, 200))
      }
      controller.close()
    },
  })

  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  })
}

The browser sees “hello”, “ ”, and “world” arrive as three separate chunks, not all at once after a 600 ms wait.
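
One detail worth handling: if the consumer disconnects mid-stream, the Streams API calls your `cancel()` callback, so you can stop doing work. A minimal sketch of that, where the `cancelled` flag is our own bookkeeping, not a platform feature:

export async function GET() {
  const encoder = new TextEncoder()
  let cancelled = false

  const stream = new ReadableStream({
    async start(controller) {
      for (const word of ['hello', ' ', 'world']) {
        if (cancelled) return // stop producing once the reader is gone
        controller.enqueue(encoder.encode(word))
        await new Promise((r) => setTimeout(r, 200))
      }
      if (!cancelled) controller.close()
    },
    cancel() {
      // Runs when the consumer goes away, e.g. the tab closes mid-stream
      cancelled = true
    },
  })

  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  })
}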

From an Async Iterator

async function* tokens() {
  // Yield three tokens, 100 ms apart
  for (const t of ['one', 'two', 'three']) {
    yield t
    await new Promise((r) => setTimeout(r, 100))
  }
}

export async function GET() {
  const encoder = new TextEncoder()
  const stream = new ReadableStream({
    async start(controller) {
      // Drain the async iterator into the stream, chunk by chunk
      for await (const t of tokens()) controller.enqueue(encoder.encode(t))
      controller.close()
    },
  })
  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  })
}

Anything sync- or async-iterable works: database cursors, file reads, LLM token streams.
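
Since you will write that iterator-to-stream bridge repeatedly, it can be worth factoring into a small helper. A sketch of one generic version; `iteratorToStream` is our own name, not a built-in API:

// Turn any async iterable of strings into a byte ReadableStream.
// Pull-based, so the iterator is only advanced when the consumer reads.
function iteratorToStream(iterable: AsyncIterable<string>): ReadableStream<Uint8Array> {
  const encoder = new TextEncoder()
  const iterator = iterable[Symbol.asyncIterator]()
  return new ReadableStream<Uint8Array>({
    async pull(controller) {
      const { value, done } = await iterator.next()
      if (done) {
        controller.close()
      } else {
        controller.enqueue(encoder.encode(value))
      }
    },
    async cancel() {
      // Let the iterator run its finally blocks if the client disconnects
      await iterator.return?.()
    },
  })
}

export async function GET() {
  return new Response(iteratorToStream(tokens()), {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  })
}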

Consume on the Client

'use client'

export function Streamer() {
  async function run() {
    const res = await fetch('/api/stream')
    const reader = res.body!.getReader()
    const decoder = new TextDecoder()
    while (true) {
      const { value, done } = await reader.read()
      if (done) break
      // stream: true keeps multi-byte characters intact across chunk boundaries
      console.log(decoder.decode(value, { stream: true }))
    }
  }
  return <button onClick={run}>Stream</button>
}
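
In a real component you would typically append each chunk to state rather than logging it. A sketch of that variant; the component name and markup are illustrative:

'use client'

import { useState } from 'react'

export function StreamingText() {
  const [text, setText] = useState('')

  async function run() {
    setText('')
    const res = await fetch('/api/stream')
    const reader = res.body!.getReader()
    const decoder = new TextDecoder()
    while (true) {
      const { value, done } = await reader.read()
      if (done) break
      const chunk = decoder.decode(value, { stream: true })
      // Functional update so rapid chunks don't clobber each other
      setText((prev) => prev + chunk)
    }
  }

  return (
    <>
      <button onClick={run}>Stream</button>
      <p>{text}</p>
    </>
  )
}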

For AI apps, pair this with Vercel’s AI SDK — it does the encoding, parsing, and React state updates for you.
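
For orientation, here is roughly what that looks like with the SDK's `streamText` helper. This is a sketch against the AI SDK 4 API surface, so check the current docs before copying; the route path and model id are placeholders:

// app/api/chat/route.ts (sketch: assumes the `ai` and `@ai-sdk/openai` packages)
import { streamText } from 'ai'
import { openai } from '@ai-sdk/openai'

export async function POST(req: Request) {
  const { prompt } = await req.json()

  const result = streamText({
    // Model id is a placeholder; use whichever provider/model you prefer
    model: openai('gpt-4o-mini'),
    prompt,
  })

  // The SDK builds the streaming Response for you
  return result.toTextStreamResponse()
}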
