glove-next

API reference for the Next.js integration package. Provides a server-side handler that turns any Next.js App Router API route into a streaming chat endpoint compatible with glove-react.

createChatHandler

Factory function that returns a Next.js App Router POST handler. The handler accepts incoming chat requests, forwards them to the configured language model, and streams the response back as Server-Sent Events.

The appropriate SDK (openai or @anthropic-ai/sdk) is lazy-loaded based on the provider's format, so only the SDK you use needs to be installed.

```typescript
// app/api/chat/route.ts
import { createChatHandler } from "glove-next";

export const POST = createChatHandler({
  provider: "anthropic",
  model: "claude-sonnet-4-20250514",
  maxTokens: 4096,
});
```

Signature

```typescript
function createChatHandler(
  config: ChatHandlerConfig
): (req: Request) => Promise<Response>
```

Returns a function with the signature (req: Request) => Promise<Response>, which is the shape Next.js expects for route handlers.

ChatHandlerConfig

| Property | Type | Description |
| --- | --- | --- |
| `provider` | `string` | Required. The provider ID. One of: `"openai"`, `"anthropic"`, `"openrouter"`, `"gemini"`, `"minimax"`, `"kimi"`, `"glm"`. |
| `model?` | `string` | The model name to use. Defaults to the provider's default model (e.g., `"gpt-4o"` for openai, `"claude-sonnet-4-20250514"` for anthropic). |
| `apiKey?` | `string` | API key for the provider. Defaults to the provider's environment variable (see Environment Variables below). |
| `maxTokens?` | `number` | Maximum number of output tokens per response. Defaults to the provider's default max tokens. |

Supported Providers

Any provider registered in glove-core can be used. Each provider maps to an SDK format (either OpenAI-compatible or Anthropic-compatible), which determines which SDK is loaded at runtime.

| Provider | SDK Format | Default Model |
| --- | --- | --- |
| `openai` | openai | `gpt-4o` |
| `anthropic` | anthropic | `claude-sonnet-4-20250514` |
| `openrouter` | openai | `openai/gpt-4o` |
| `gemini` | openai | `gemini-2.0-flash` |
| `minimax` | openai | `MiniMax-Text-01` |
| `kimi` | openai | `moonshot-v1-auto` |
| `glm` | openai | `glm-4-plus` |

Providers with openai format use the openai npm package. Providers with anthropic format use @anthropic-ai/sdk. Install only the SDK you need as a peer dependency.
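Based on the format mapping above, installation looks like this (package names are the ones named in this section; install whichever matches your provider's format):

```shell
# OpenAI-compatible providers (openai, openrouter, gemini, minimax, kimi, glm):
npm install openai

# Anthropic-format providers (anthropic):
npm install @anthropic-ai/sdk
```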

Environment Variables

Each provider reads its API key from a default environment variable. Override with the apiKey config option if needed.

| Provider | Environment Variable | Description |
| --- | --- | --- |
| `openai` | `OPENAI_API_KEY` | OpenAI API key. |
| `anthropic` | `ANTHROPIC_API_KEY` | Anthropic API key. |
| `openrouter` | `OPENROUTER_API_KEY` | OpenRouter API key. |
| `gemini` | `GEMINI_API_KEY` | Google Gemini API key. |
| `minimax` | `MINIMAX_API_KEY` | MiniMax API key. |
| `kimi` | `MOONSHOT_API_KEY` | Moonshot (Kimi) API key. |
| `glm` | `ZHIPUAI_API_KEY` | ZhipuAI (GLM) API key. |
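If your deployment stores keys under a different name, pass the value through the apiKey option instead of the default lookup. A minimal sketch, where MY_CUSTOM_OPENAI_KEY is a hypothetical variable name chosen for illustration:

```typescript
// app/api/chat/route.ts
import { createChatHandler } from "glove-next";

export const POST = createChatHandler({
  provider: "openai",
  // Hypothetical env var name; replaces the default OPENAI_API_KEY lookup.
  apiKey: process.env.MY_CUSTOM_OPENAI_KEY,
});
```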

Request Format

The handler expects a JSON POST body matching the RemotePromptRequest shape. This is what glove-react's createEndpointModel sends automatically.

RemotePromptRequest

| Property | Type | Description |
| --- | --- | --- |
| `systemPrompt` | `string` | The system prompt for this request. |
| `messages` | `Message[]` | The conversation history. |
| `tools?` | `SerializedTool[]` | Tool definitions serialized as JSON Schema objects. |

SerializedTool

| Property | Type | Description |
| --- | --- | --- |
| `name` | `string` | The tool name. |
| `description` | `string` | The tool description. |
| `parameters` | `Record<string, unknown>` | JSON Schema representation of the tool's input parameters. |
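Putting the two shapes together, a request body might look like the sketch below. The field values are illustrative, and the message shape (sender/text) is inferred from the done-event example later in this document; the exact Message type is defined by glove-core.

```typescript
// Illustrative RemotePromptRequest payload, including one serialized tool.
const body = {
  systemPrompt: "You are a helpful coding assistant.",
  messages: [{ sender: "user", text: "Search the docs for streaming." }],
  tools: [
    {
      name: "search_docs",
      description: "Search the documentation for a query.",
      // JSON Schema for the tool's input parameters.
      parameters: {
        type: "object",
        properties: {
          query: { type: "string", description: "The search query." },
        },
        required: ["query"],
      },
    },
  ],
};
```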

SSE Response Protocol

The handler streams responses back as Server-Sent Events. Each event is a RemoteStreamEvent, sent as a JSON-encoded data: line. The client-side parseSSEStream utility in glove-react deserializes these events automatically.

RemoteStreamEvent

```typescript
type RemoteStreamEvent =
  | { type: "text_delta"; text: string }
  | { type: "tool_use"; id: string; name: string; input: unknown }
  | { type: "done"; message: Message; tokens_in: number; tokens_out: number };
```

| Event Type | Fields | Description |
| --- | --- | --- |
| `text_delta` | `text: string` | A chunk of streaming text from the model. Sent as the model generates tokens. |
| `tool_use` | `id: string`, `name: string`, `input: unknown` | The model wants to invoke a tool. Contains the call ID, tool name, and input arguments. |
| `done` | `message: Message`, `tokens_in: number`, `tokens_out: number` | The stream is complete. Contains the final Message object and token usage counts. |

The raw SSE wire format looks like this:

```text
data: {"type":"text_delta","text":"Hello"}

data: {"type":"text_delta","text":", how"}

data: {"type":"text_delta","text":" can I help?"}

data: {"type":"done","message":{"sender":"agent","text":"Hello, how can I help?"},"tokens_in":42,"tokens_out":8}
```
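To illustrate the protocol, here is a minimal sketch of turning a chunk of that wire format into event objects. This is not glove-react's implementation (its parseSSEStream handles this for you); the Message type is simplified to unknown to keep the sketch self-contained.

```typescript
type RemoteStreamEvent =
  | { type: "text_delta"; text: string }
  | { type: "tool_use"; id: string; name: string; input: unknown }
  | { type: "done"; message: unknown; tokens_in: number; tokens_out: number };

// Split a chunk into lines, keep only `data:` lines, and JSON-parse each payload.
function parseSSEChunk(chunk: string): RemoteStreamEvent[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => JSON.parse(line.slice("data: ".length)) as RemoteStreamEvent);
}
```

A real client must also buffer partial lines across network chunks, since an event can be split mid-line between reads.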

Full Working Example

A complete setup with glove-next on the server and glove-react on the client.

Server: API Route

```typescript
// app/api/chat/route.ts
import { createChatHandler } from "glove-next";

export const POST = createChatHandler({
  provider: "openai",
  model: "gpt-4o",
});
```

Client: GloveClient Setup

```typescript
// lib/glove.ts
import { GloveClient } from "glove-react";
import { z } from "zod";

export const gloveClient = new GloveClient({
  endpoint: "/api/chat",
  systemPrompt: "You are a helpful coding assistant.",
  tools: [
    {
      name: "search_docs",
      description: "Search the documentation for a query.",
      inputSchema: z.object({
        query: z.string().describe("The search query."),
      }),
      async do(input) {
        const res = await fetch(`/api/docs/search?q=${encodeURIComponent(input.query)}`);
        return res.json();
      },
    },
  ],
  compaction: {
    compaction_instructions: "Summarize the conversation, preserving any code snippets discussed.",
    max_turns: 40,
  },
});
```

Client: Provider and Chat Component

```tsx
// app/providers.tsx
"use client";

import { GloveProvider } from "glove-react";
import { gloveClient } from "@/lib/glove";

export function Providers({ children }: { children: React.ReactNode }) {
  return <GloveProvider client={gloveClient}>{children}</GloveProvider>;
}
```
```tsx
// app/chat.tsx
"use client";

import { useGlove } from "glove-react";
import { useRef, FormEvent } from "react";

export default function Chat() {
  const { timeline, streamingText, busy, sendMessage } = useGlove();
  const inputRef = useRef<HTMLInputElement>(null);

  function handleSubmit(e: FormEvent) {
    e.preventDefault();
    const text = inputRef.current?.value?.trim();
    if (!text || busy) return;
    sendMessage(text);
    if (inputRef.current) inputRef.current.value = "";
  }

  return (
    <div>
      <div>
        {timeline.map((entry, i) => (
          <div key={i}>
            {entry.kind === "user" && <p><strong>You:</strong> {entry.text}</p>}
            {entry.kind === "agent_text" && <p><strong>Agent:</strong> {entry.text}</p>}
            {entry.kind === "tool" && (
              <p><em>Tool: {entry.name} ({entry.status})</em></p>
            )}
          </div>
        ))}
        {streamingText && <p><strong>Agent:</strong> {streamingText}</p>}
      </div>
      <form onSubmit={handleSubmit}>
        <input ref={inputRef} disabled={busy} placeholder="Type a message..." />
        <button type="submit" disabled={busy}>Send</button>
      </form>
    </div>
  );
}
```

Environment

```text
# .env.local
OPENAI_API_KEY=sk-your-api-key-here
```

The createEndpointModel adapter inside GloveClient handles SSE parsing, streaming text aggregation, and tool call deserialization. The createChatHandler on the server handles SDK initialization, request translation, and SSE encoding. Together they form a complete client-server pipeline with no manual plumbing required.