Streaming text over HTTP with NextJS

AI chatbots often show you text coming in as it’s being generated.

Your browser makes a request to an API endpoint, and the endpoint streams the text back to you as it becomes available from the AI.
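
If you’ve never seen the browser side of that, it’s a small loop: fetch the endpoint, grab a reader from the response body, and handle each chunk as it arrives. Here’s a rough sketch of that pattern (not part of today’s code; readStream and onChunk are just names I made up):

// Rough sketch: read a streamed response chunk by chunk in the browser.
async function readStream(url: string, onChunk: (text: string) => void) {
    const response = await fetch(url);
    if (!response.body) throw new Error('No response body to stream');

    const reader = response.body.getReader();
    const decoder = new TextDecoder();

    while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        // Decode this chunk of bytes and hand it to the caller right away.
        onChunk(decoder.decode(value, { stream: true }));
    }
}

// e.g. readStream('/lorem', chunk => document.body.append(chunk));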

There are lots of tools that handle this kind of thing for you, but I think it’s fun and worthwhile to understand how you can do things yourself.

So today’s small learning objective was to stream Lorem Ipsum text over HTTP.

I used the Vercel AI SDK and NextJS. (and curl to test)

Woah there!

Hey, big shot — I thought you said you were building this yourself.

What’s with using the Vercel AI SDK?

Okay — yes, you’re right. I’m not fully building this myself. (are we ever?)

But my goal was to be able to take any arbitrary text and stream it back to the caller.

For example, I’m interested in running models locally with Ollama and streaming the responses to clients. (there’s a rough sketch of that at the end of this post)

Let’s get right to it — here’s the code.

It generates and streams a bunch of Lorem Ipsum text. I added a sleep function so that the text would come in slower — making it more obvious that it was, in fact, streaming and not just coming in all at once.

To test it, you can use curl and add the -N flag (short for --no-buffer) to see the text immediately as it arrives. (otherwise curl will hold it all in a buffer and won’t show you anything until the connection closes)

My test command was: curl -N http://localhost:3000/lorem

// This is all in a file called: app/lorem/route.ts

import { StreamingTextResponse } from 'ai';

// Pause for the given number of milliseconds.
function sleep(ms: number): Promise<void> {
    return new Promise(resolve => setTimeout(resolve, ms));
}

// Builds a stream that emits `wordCount` lorem ipsum words, one word at a time.
function createLoremIpsumStream(wordCount: number): ReadableStream<string> {
    const loremIpsumText = "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit anim id est laborum.";

    // Split the text into words; the modulo below lets us wrap around if
    // wordCount is larger than the number of words in the text.
    const words = loremIpsumText.split(' ');
    let currentIndex = 0;

    return new ReadableStream<string>({
        start(controller) {
            // Emit one word, wait a little so the streaming is visible,
            // then recurse until we've sent wordCount words and can close.
            async function push() {
                if (currentIndex < wordCount) {
                    controller.enqueue(words[currentIndex % words.length] + ' ');
                    currentIndex++;
                    await sleep(15);
                    push();
                } else {
                    controller.close();
                }
            }

            push();
        }
    });
}

export async function GET() {
    const count = 100;
    const stream = createLoremIpsumStream(count);

    // StreamingTextResponse is a regular Response with its headers set up for
    // streaming text, so the words reach the caller as they're enqueued.
    return new StreamingTextResponse(stream);
}
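
And for the Ollama idea from earlier, the same pattern applies: get text from somewhere, wrap it in a ReadableStream, and hand it to StreamingTextResponse. Here’s a rough, untested sketch. It assumes Ollama is running locally on its default port, uses its /api/generate endpoint (which streams back one JSON object per line, with the generated text in a response field), and llama2 is just a placeholder model name:

// A sketch of something like: app/ollama/route.ts (untested)
import { StreamingTextResponse } from 'ai';

export async function GET() {
    // Ask a locally running Ollama instance to generate text as a stream.
    const ollamaResponse = await fetch('http://localhost:11434/api/generate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ model: 'llama2', prompt: 'Tell me about streaming.' }),
    });

    const reader = ollamaResponse.body!.getReader();
    const decoder = new TextDecoder();

    const stream = new ReadableStream<string>({
        async start(controller) {
            let buffered = '';

            while (true) {
                const { done, value } = await reader.read();
                if (done) break;

                // Chunks can split a JSON line in half, so keep the last
                // (possibly incomplete) line around for the next read.
                buffered += decoder.decode(value, { stream: true });
                const lines = buffered.split('\n');
                buffered = lines.pop() ?? '';

                for (const line of lines) {
                    if (!line.trim()) continue;
                    controller.enqueue(JSON.parse(line).response ?? '');
                }
            }

            controller.close();
        }
    });

    return new StreamingTextResponse(stream);
}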