From Vercel AI Gateway

Stay on Vercel if you want. Just swap the upstream from Vercel's gateway to HiWay.

This guide covers moving a Vercel AI Gateway integration to HiWay2LLM. The Vercel AI SDK itself keeps working — HiWay is OpenAI-compatible, so the @ai-sdk/openai provider (or createOpenAI) plugs straight in. You're only replacing the upstream gateway, not your framework. End-to-end: under 10 minutes.

Prerequisites

A HiWay account and at least one provider key configured in Dashboard → Settings → Providers. You keep your existing Vercel project and the AI SDK — nothing changes on that side.

Step 1: Get your HiWay API key

Open Dashboard → Keys and click New key. Copy the generated hw_live_ value into your Vercel project's environment variables (e.g. as HIWAY_API_KEY) and redeploy so the new variable is picked up.
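For local development, the same key can live in a `.env.local` file, which Next.js loads during `next dev` (a sketch; the variable name HIWAY_API_KEY just matches the one used in the Step 2 snippet):

```
# .env.local — read by `next dev`; production values stay in Vercel's env vars
HIWAY_API_KEY=hw_live_...
```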

Step 2: Swap your configuration

You were likely using @ai-sdk/openai with baseURL pointing at Vercel's gateway endpoint and an AI_GATEWAY_API_KEY. Swap the baseURL to HiWay and the key to HIWAY_API_KEY. The generateText / streamText / generateObject calls in the rest of your code stay identical.

import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

// Point the stock OpenAI provider at HiWay instead of Vercel's gateway
const hiway = createOpenAI({
  baseURL: "https://app.hiway2llm.com/v1",
  apiKey: process.env.HIWAY_API_KEY,
});

// "auto" lets HiWay pick the model; the AI SDK call itself is unchanged
const { text } = await generateText({
  model: hiway.chat("auto"),
  prompt: "Write a two-sentence product pitch for a logistics SaaS",
});

console.log(text);

Step 3: Verify

Hit one of your Vercel routes that calls the model, then check Dashboard → Usage on HiWay to confirm the request landed. The _hiway.routed_model field on the response body tells you which model was picked — or inspect the X-HiWay-Routed-Model header if you have access to the raw response.
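If you want to check the routed model programmatically rather than in the dashboard, a small helper can read the body field with a header fallback. A minimal sketch: the `_hiway.routed_model` field and `X-HiWay-Routed-Model` header names come from this guide, and `routedModel` is a hypothetical helper, not part of any SDK.

```typescript
// Extracts the routed model from a HiWay chat completion response.
// Prefers the _hiway body field; falls back to the X-HiWay-Routed-Model header.
function routedModel(body: any, headers?: Headers): string | undefined {
  return body?._hiway?.routed_model ?? headers?.get("x-hiway-routed-model") ?? undefined;
}

// Example with a mocked response body:
const body = { _hiway: { routed_model: "gpt-4.1-mini" } };
console.log(routedModel(body)); // → "gpt-4.1-mini"
```

Handy in a smoke test that asserts cheap prompts are not landing on your most expensive model.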

Gotchas specific to Vercel AI Gateway

  • AI SDK still works: HiWay is OpenAI-compatible, so the stock @ai-sdk/openai provider with createOpenAI({ baseURL, apiKey }) is the whole integration. No SDK swap.
  • Provider metadata fields may differ: Vercel's gateway exposes providerMetadata with gateway-specific fields. HiWay returns its own _hiway object on the response body and X-HiWay-Routed-Model / X-HiWay-Routed-Tier headers. If you were reading providerMetadata, update those spots.
  • Observability moves out of Vercel: Vercel Gateway logs show up in the Vercel dashboard. HiWay's routing logs and usage timeline live in the HiWay dashboard (Usage, Analytics). Keep Vercel's request logs for the Next.js / edge layer — that part is unchanged.
  • Usage caps move too: if you configured a spend cap on Vercel's gateway, redefine it in HiWay's Dashboard → Budget Control (monthly upstream BYOK cap with three verdicts: DOWNGRADE, LIGHT_ONLY, BLOCK).
  • You keep BYOK: HiWay is BYOK, so inference is still billed directly by OpenAI, Anthropic, and the rest on the card you keep on file with them, at wholesale. HiWay adds no per-token markup.
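For code that previously branched on Vercel's providerMetadata, the equivalent signal on HiWay is the response headers. A sketch using the header names this guide documents (X-HiWay-Routed-Model, X-HiWay-Routed-Tier); `routingInfo` and `RoutingInfo` are hypothetical names, not part of any SDK:

```typescript
interface RoutingInfo {
  model?: string;
  tier?: string;
}

// Pulls HiWay routing metadata off a raw Response's headers.
// Header lookups are case-insensitive per the Fetch spec.
function routingInfo(headers: Headers): RoutingInfo {
  return {
    model: headers.get("x-hiway-routed-model") ?? undefined,
    tier: headers.get("x-hiway-routed-tier") ?? undefined,
  };
}

// Example with mocked headers:
const mocked = new Headers({
  "x-hiway-routed-model": "claude-haiku",
  "x-hiway-routed-tier": "light",
});
console.log(routingInfo(mocked)); // model "claude-haiku", tier "light"
```

Requires a runtime with the global Headers class (Node 18+, edge runtimes, browsers).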

You're done

Should take less than 10 minutes. Ping support@hiway2llm.com if you're stuck.