Drop-in with your existing SDK

OpenAI, Anthropic, LangChain, Vercel AI SDK, n8n — change one line.

HiWay implements the standard OpenAI chat-completions API. You don't have to install anything new — keep the SDK you already use and just override the base URL.
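Because the API is the standard chat-completions shape, every snippet below ultimately sends the same HTTP request. A minimal standard-library sketch of that payload (the request is built but not sent; the key is a placeholder and `auto` is the router model id used in the examples below):

```python
import json
import urllib.request

# Placeholder key; HiWay keys start with hw_live_.
req = urllib.request.Request(
    "https://app.hiway2llm.com/v1/chat/completions",
    data=json.dumps({
        "model": "auto",
        "messages": [{"role": "user", "content": "Hello"}],
    }).encode(),
    headers={
        "Authorization": "Bearer hw_live_YOUR_KEY",
        "Content-Type": "application/json",
    },
)
# urllib.request.urlopen(req) would send it; omitted here.
```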

OpenAI Python

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://app.hiway2llm.com/v1",
    api_key="hw_live_YOUR_KEY",
)
```

OpenAI Node

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://app.hiway2llm.com/v1",
  apiKey: process.env.HIWAY_API_KEY,
});
```

LangChain

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://app.hiway2llm.com/v1",
    api_key="hw_live_YOUR_KEY",
    model="auto",
)
```

Vercel AI SDK

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const hiway = createOpenAI({
  baseURL: "https://app.hiway2llm.com/v1",
  apiKey: process.env.HIWAY_API_KEY,
});

// Use the public generateText helper rather than the
// provider-internal doGenerate method.
const { text } = await generateText({
  model: hiway.chat("auto"),
  prompt: "Hello!",
});
```

n8n

Use the OpenAI Chat Model node. In the credential config, set *Base URL* to https://app.hiway2llm.com/v1 and paste your hw_live_ key as the API key. Every AI node in your workflow now benefits from smart routing, Guardian, and provider fallback.

Streaming works too

Pass "stream": true in the request body. HiWay forwards SSE chunks byte-for-byte and accounts for token usage at the end of the stream.
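Since the chunks follow the OpenAI SSE wire format, any standard client-side parser works. A minimal sketch of reassembling the streamed deltas; the chunk payloads here are illustrative, not captured HiWay output:

```python
import json

# Sample SSE lines as an OpenAI-compatible endpoint emits them
# (illustrative payloads, not captured from HiWay).
sse_lines = [
    'data: {"choices":[{"delta":{"content":"Hel"}}]}',
    'data: {"choices":[{"delta":{"content":"lo"}}]}',
    "data: [DONE]",
]

def collect_text(lines):
    """Join the content deltas from a chat-completions SSE stream."""
    parts = []
    for line in lines:
        if not line.startswith("data: "):
            continue  # skip blank keep-alives and comments
        payload = line[len("data: "):]
        if payload == "[DONE]":  # end-of-stream sentinel
            break
        delta = json.loads(payload)["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)

print(collect_text(sse_lines))  # → Hello
```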