API reference for the Next.js integration package. Provides a server-side handler that turns any Next.js App Router API route into a streaming chat endpoint compatible with glove-react.
Factory function that returns a Next.js App Router POST handler. The handler accepts incoming chat requests, forwards them to the configured language model, and streams the response back as Server-Sent Events.
The appropriate SDK (openai or @anthropic-ai/sdk) is lazy-loaded based on the provider's format, so only the SDK you use needs to be installed.
```ts
import { createChatHandler } from "glove-next";

export const POST = createChatHandler({
  provider: "anthropic",
  model: "claude-sonnet-4-20250514",
  maxTokens: 4096,
});
```

```ts
function createChatHandler(
  config: ChatHandlerConfig
): (req: Request) => Promise<Response>
```

Returns a function with the signature `(req: Request) => Promise<Response>`, which is the shape Next.js expects for route handlers.
| Property | Type | Description |
|---|---|---|
| provider | string | The provider ID. Required. One of: "openai", "anthropic", "openrouter", "gemini", "minimax", "kimi", "glm". |
| model? | string | The model name to use. Defaults to the provider's default model (e.g., "gpt-4o" for openai, "claude-sonnet-4-20250514" for anthropic). |
| apiKey? | string | API key for the provider. Defaults to the provider's environment variable (see Environment Variables below). |
| maxTokens? | number | Maximum number of output tokens per response. Defaults to the provider's default max tokens. |
Any provider registered in glove-core can be used. Each provider maps to an SDK format (either OpenAI-compatible or Anthropic-compatible), which determines which SDK is loaded at runtime.
| Provider | SDK Format | Default Model |
|---|---|---|
| openai | openai | gpt-4o |
| anthropic | anthropic | claude-sonnet-4-20250514 |
| openrouter | openai | openai/gpt-4o |
| gemini | openai | gemini-2.0-flash |
| minimax | openai | MiniMax-Text-01 |
| kimi | openai | moonshot-v1-auto |
| glm | openai | glm-4-plus |
Providers with the openai format use the openai npm package; providers with the anthropic format use @anthropic-ai/sdk. Both are declared as peer dependencies, so you only need to install the one your provider's format requires.
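As an illustration, format-based lazy loading can be sketched like this. The provider-to-format map mirrors the table above; `loadSdk` itself is hypothetical, not the actual glove-next internals:

```typescript
// Provider-to-SDK-format map, mirroring the table above.
type SdkFormat = "openai" | "anthropic";

const PROVIDER_FORMATS: Record<string, SdkFormat> = {
  openai: "openai",
  anthropic: "anthropic",
  openrouter: "openai",
  gemini: "openai",
  minimax: "openai",
  kimi: "openai",
  glm: "openai",
};

// Illustrative sketch: dynamic import() means the other SDK is never
// resolved, so it does not need to be installed.
async function loadSdk(provider: string): Promise<unknown> {
  const format = PROVIDER_FORMATS[provider] ?? "openai";
  const pkg = format === "anthropic" ? "@anthropic-ai/sdk" : "openai";
  return import(pkg); // resolved only when this code path runs
}
```
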
Each provider reads its API key from a default environment variable. Override with the apiKey config option if needed.
| Provider | Environment Variable | Description |
|---|---|---|
| openai | OPENAI_API_KEY | OpenAI API key. |
| anthropic | ANTHROPIC_API_KEY | Anthropic API key. |
| openrouter | OPENROUTER_API_KEY | OpenRouter API key. |
| gemini | GEMINI_API_KEY | Google Gemini API key. |
| minimax | MINIMAX_API_KEY | MiniMax API key. |
| kimi | MOONSHOT_API_KEY | Moonshot (Kimi) API key. |
| glm | ZHIPUAI_API_KEY | ZhipuAI (GLM) API key. |
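For example, to read the key from a non-default variable, pass apiKey explicitly (the variable name below is illustrative):

```typescript
import { createChatHandler } from "glove-next";

// MY_OPENROUTER_KEY is an illustrative variable name; by default the
// openrouter provider would read OPENROUTER_API_KEY instead.
export const POST = createChatHandler({
  provider: "openrouter",
  apiKey: process.env.MY_OPENROUTER_KEY,
});
```
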
The handler expects a JSON POST body matching the RemotePromptRequest shape. This is what glove-react's createEndpointModel sends automatically.
| Property | Type | Description |
|---|---|---|
| systemPrompt | string | The system prompt for this request. |
| messages | Message[] | The conversation history. |
| tools? | SerializedTool[] | Tool definitions serialized as JSON Schema objects. |
Each entry in `tools` has the `SerializedTool` shape:

| Property | Type | Description |
|---|---|---|
| name | string | The tool name. |
| description | string | The tool description. |
| parameters | Record<string, unknown> | JSON Schema representation of the tool's input parameters. |
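Put together as TypeScript, the request body looks roughly like this. The shapes are reconstructed from the tables above (the actual exported types live in the glove packages), and `Message` is a minimal stand-in based on the `done` event example later in this document:

```typescript
// Minimal stand-in for the Message type (sender/text fields taken
// from the "done" event example below).
interface Message {
  sender: "user" | "agent";
  text: string;
}

interface SerializedTool {
  name: string;
  description: string;
  parameters: Record<string, unknown>; // JSON Schema for the tool's input
}

interface RemotePromptRequest {
  systemPrompt: string;
  messages: Message[];
  tools?: SerializedTool[];
}
```
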
The handler streams responses back as Server-Sent Events. Each event is a RemoteStreamEvent, sent as a JSON-encoded data: line. The client-side parseSSEStream utility in glove-react deserializes these events automatically.
```ts
type RemoteStreamEvent =
  | { type: "text_delta"; text: string }
  | { type: "tool_use"; id: string; name: string; input: unknown }
  | { type: "done"; message: Message; tokens_in: number; tokens_out: number };
```

| Event Type | Fields | Description |
|---|---|---|
| text_delta | text: string | A chunk of streaming text from the model. Sent as the model generates tokens. |
| tool_use | id: string, name: string, input: unknown | The model wants to invoke a tool. Contains the call ID, tool name, and input arguments. |
| done | message: Message, tokens_in: number, tokens_out: number | The stream is complete. Contains the final Message object and token usage counts. |
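As a rough illustration of what parseSSEStream does on the client, here is a hand-rolled sketch (not the actual glove-react implementation):

```typescript
// Each SSE event arrives as a "data: <json>" line; parsing a buffered
// chunk means filtering those lines and JSON-decoding their payloads.
type StreamEvent = { type: string } & Record<string, unknown>;

function parseSSEChunk(chunk: string): StreamEvent[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => JSON.parse(line.slice("data: ".length)) as StreamEvent);
}
```

A real parser must also buffer partial lines across network chunks, which the sketch omits for brevity.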
The raw SSE wire format looks like this (each event is a `data:` line followed by a blank line):

```text
data: {"type":"text_delta","text":"Hello"}

data: {"type":"text_delta","text":", how"}

data: {"type":"text_delta","text":" can I help?"}

data: {"type":"done","message":{"sender":"agent","text":"Hello, how can I help?"},"tokens_in":42,"tokens_out":8}
```

Below is a complete setup with glove-next on the server and glove-react on the client.
First, the server route handler (in the App Router, `app/api/chat/route.ts` serves the `/api/chat` endpoint used below):

```ts
import { createChatHandler } from "glove-next";

export const POST = createChatHandler({
  provider: "openai",
  model: "gpt-4o",
});
```

Next, the client configuration (imported below as `@/lib/glove`):

```ts
import { GloveClient } from "glove-react";
import { z } from "zod";

export const gloveClient = new GloveClient({
  endpoint: "/api/chat",
  systemPrompt: "You are a helpful coding assistant.",
  tools: [
    {
      name: "search_docs",
      description: "Search the documentation for a query.",
      inputSchema: z.object({
        query: z.string().describe("The search query."),
      }),
      async do(input) {
        const res = await fetch(
          `/api/docs/search?q=${encodeURIComponent(input.query)}`
        );
        return res.json();
      },
    },
  ],
  compaction: {
    compaction_instructions:
      "Summarize the conversation, preserving any code snippets discussed.",
    max_turns: 40,
  },
});
```

A client component wires the client into the React tree:

```tsx
"use client";

import { GloveProvider } from "glove-react";
import { gloveClient } from "@/lib/glove";

export function Providers({ children }: { children: React.ReactNode }) {
  return <GloveProvider client={gloveClient}>{children}</GloveProvider>;
}
```

Finally, a chat component consumes the client through the `useGlove` hook:

```tsx
"use client";

import { useGlove } from "glove-react";
import { useRef, FormEvent } from "react";

export default function Chat() {
  const { timeline, streamingText, busy, sendMessage } = useGlove();
  const inputRef = useRef<HTMLInputElement>(null);

  function handleSubmit(e: FormEvent) {
    e.preventDefault();
    const text = inputRef.current?.value?.trim();
    if (!text || busy) return;
    sendMessage(text);
    if (inputRef.current) inputRef.current.value = "";
  }

  return (
    <div>
      <div>
        {timeline.map((entry, i) => (
          <div key={i}>
            {entry.kind === "user" && <p><strong>You:</strong> {entry.text}</p>}
            {entry.kind === "agent_text" && <p><strong>Agent:</strong> {entry.text}</p>}
            {entry.kind === "tool" && (
              <p><em>Tool: {entry.name} ({entry.status})</em></p>
            )}
          </div>
        ))}
        {streamingText && <p><strong>Agent:</strong> {streamingText}</p>}
      </div>
      <form onSubmit={handleSubmit}>
        <input ref={inputRef} disabled={busy} placeholder="Type a message..." />
        <button type="submit" disabled={busy}>Send</button>
      </form>
    </div>
  );
}
```

Set the provider's API key in the environment (for example in `.env.local`):

```text
OPENAI_API_KEY=sk-your-api-key-here
```

The `createEndpointModel` adapter inside `GloveClient` handles SSE parsing, streaming text aggregation, and tool-call deserialization. The `createChatHandler` on the server handles SDK initialization, request translation, and SSE encoding. Together they form a complete client-server pipeline with no manual plumbing required.