Vercel AI SDK

Streaming chat UIs with HiWay as the provider.

The Vercel AI SDK ships an OpenAI-compatible provider that you can point at any compatible endpoint. Stream responses, use the React hooks, and keep the SSE protocol; HiWay proxies all of it.

app/api/chat/route.ts

```ts
import { createOpenAI } from "@ai-sdk/openai";
import { streamText } from "ai";

// Point the SDK's OpenAI-compatible provider at HiWay's endpoint.
const hiway = createOpenAI({
  baseURL: "https://www.hiway2llm.com/v1",
  apiKey: process.env.HIWAY_API_KEY,
});

export async function POST(req: Request) {
  const { messages } = await req.json();

  // streamText returns immediately; tokens stream as they arrive.
  const result = streamText({
    model: hiway("auto"), // "auto" lets HiWay pick the routed tier
    messages,
  });

  return result.toDataStreamResponse();
}
```

Streaming works end-to-end

HiWay supports SSE streaming natively. Time to first token is typically 80-250 ms depending on the routed tier; HiWay's routing decision adds less than 5 ms on top.
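Because HiWay is OpenAI-compatible, the stream it proxies uses the same `data:`-prefixed SSE framing as the OpenAI API: one JSON frame per event, with token deltas under `choices[0].delta.content` and a `[DONE]` sentinel at the end. The AI SDK parses all of this for you; the sketch below is purely illustrative, showing what a buffered chunk of that wire format decodes to (the `parseSSE` helper is ours, not part of the SDK).

```typescript
// Illustrative parser for OpenAI-style SSE frames -- the wire format
// the AI SDK consumes under the hood. Not part of any SDK.
type SSEChunk = { content: string; done: boolean };

function parseSSE(raw: string): SSEChunk {
  const tokens: string[] = [];
  let done = false;
  for (const line of raw.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip blank lines and comments
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") {
      done = true; // end-of-stream sentinel
      break;
    }
    const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
    if (typeof delta === "string") tokens.push(delta);
  }
  return { content: tokens.join(""), done };
}

// Frames as they appear on the wire, separated by blank lines:
const frames = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
].join("\n\n");

console.log(parseSSE(frames)); // { content: "Hello", done: true }
```

Note that a real network chunk can split a frame mid-line; the SDK buffers partial frames for you, which is one reason to let it handle the stream rather than parsing by hand.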