How to migrate from OpenRouter to HiWay in 5 minutes
A concrete step-by-step with full before/after code
Migrating from OpenRouter to HiWay2LLM takes 5 minutes. Full before/after code in Python and Node.js, what breaks, what stays the same.
Migrating from OpenRouter to HiWay2LLM is one of the smallest migrations in modern infra. You're swapping one OpenAI-compatible endpoint for another. Your SDK doesn't change. Your streaming code doesn't change. Your tool use doesn't change.
What does change: you bring your own provider keys (BYOK), your billing becomes transparent, and your model names go from OpenRouter's author/model format to direct model IDs.
Here's the full path, timed.
The 5 steps
- Sign up at HiWay2LLM (60 seconds).
- Add your provider keys — Anthropic, OpenAI, Google, Mistral, whichever you use (2 minutes).
- Generate your HiWay API key (30 seconds).
- Change base_url and API key in your code, update model names (1 minute).
- Run your existing test suite, verify nothing broke (1 minute).
That's it. The rest of this post is the code.
Step 1: Sign up
Head to app.hiway2llm.com, create an account. No credit card for the Free tier (2,500 requests/month). Confirm your email.
Step 2: Add your provider keys
In the dashboard, go to Providers and paste in whichever keys you already have:
- Anthropic (for Claude models)
- OpenAI (for GPT and o-series models)
- Google (for Gemini)
- Mistral, Groq, DeepSeek, xAI, Cerebras — optional
Your keys are stored encrypted. They never leave our EU-hosted infrastructure. HiWay uses them to call providers on your behalf; providers bill you directly at their wholesale rate.
This is the BYOK step. If you're coming from OpenRouter, you haven't needed provider accounts before — OpenRouter was the one paying. Create them now if needed; it's a 30-second signup per provider.
Step 3: Generate your HiWay API key
In the dashboard, go to API Keys → Create new key. Give it a name (e.g. production-web), copy it. It looks like:
hw_live_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Store it in your secret manager or .env. Never commit it.
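If you want your app to fail loudly at startup instead of at the first request, a minimal guard is enough. A sketch (require_hiway_key is our own helper, not part of any SDK):

```python
import os

def require_hiway_key() -> str:
    """Fail fast at startup if the HiWay key is missing from the environment."""
    key = os.environ.get("HIWAY_API_KEY")
    if not key:
        raise RuntimeError("HIWAY_API_KEY is not set; add it to your secret manager or .env")
    return key
```

Call it once at boot, before constructing your client.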
Step 4: Change your code
This is the entire code change. Because HiWay is OpenAI-compatible, any SDK that talks to OpenAI works unchanged — you only swap the base_url and the API key.
Python (OpenAI SDK)
Before (OpenRouter):
import os
from openai import OpenAI
client = OpenAI(
base_url="https://openrouter.ai/api/v1",
api_key=os.environ["OPENROUTER_API_KEY"],
)
response = client.chat.completions.create(
model="anthropic/claude-sonnet-4.6",
messages=[{"role": "user", "content": "Hello"}],
)
After (HiWay):
import os
from openai import OpenAI
client = OpenAI(
base_url="https://app.hiway2llm.com/v1",
api_key=os.environ["HIWAY_API_KEY"],
)
response = client.chat.completions.create(
model="claude-sonnet-4-6", # direct model ID, no author/ prefix
messages=[{"role": "user", "content": "Hello"}],
)
Two lines changed. SDK imports, streaming, tool use, structured outputs — all identical.
Node.js (OpenAI SDK)
Before:
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://openrouter.ai/api/v1",
apiKey: process.env.OPENROUTER_API_KEY,
});
const response = await client.chat.completions.create({
model: "openai/gpt-4.1",
messages: [{ role: "user", content: "Hello" }],
});
After:
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://app.hiway2llm.com/v1",
apiKey: process.env.HIWAY_API_KEY,
});
const response = await client.chat.completions.create({
model: "gpt-4-1",
messages: [{ role: "user", content: "Hello" }],
});
Vercel AI SDK
Before:
import { createOpenAI } from "@ai-sdk/openai";
const openrouter = createOpenAI({
baseURL: "https://openrouter.ai/api/v1",
apiKey: process.env.OPENROUTER_API_KEY,
});
const result = await streamText({
model: openrouter("anthropic/claude-sonnet-4.6"),
prompt: "Hello",
});
After:
import { createOpenAI } from "@ai-sdk/openai";
const hiway = createOpenAI({
baseURL: "https://app.hiway2llm.com/v1",
apiKey: process.env.HIWAY_API_KEY,
});
const result = await streamText({
model: hiway("claude-sonnet-4-6"),
prompt: "Hello",
});
LangChain (Python)
import os
from langchain_openai import ChatOpenAI
llm = ChatOpenAI(
base_url="https://app.hiway2llm.com/v1",
api_key=os.environ["HIWAY_API_KEY"],
model="claude-sonnet-4-6",
)
curl
curl https://app.hiway2llm.com/v1/chat/completions \
-H "Authorization: Bearer $HIWAY_API_KEY" \
-H "Content-Type: application/json" \
-d '{
"model": "claude-sonnet-4-6",
"messages": [{"role": "user", "content": "Hello"}]
}'
What stays the same
- SDK: keep using whatever OpenAI-compatible SDK you have. No rewrites.
- Streaming: stream: true works identically.
- Tool use / function calling: same schema.
- Structured outputs / JSON mode: same.
- Usage object: response.usage.prompt_tokens and completion_tokens are returned as expected.
- Error codes: 429, 500, 502, 504 mean the same thing.
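Tool definitions illustrate the point: the payload you already send is the payload you keep sending. A sketch using the standard OpenAI-style tool schema (get_weather is a made-up example function, not something HiWay provides):

```python
# Standard OpenAI-style tool schema; get_weather is hypothetical.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Pass it exactly as before:
# client.chat.completions.create(model="claude-sonnet-4-6", messages=..., tools=tools)
```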
What changes
Model names
OpenRouter uses author/model (e.g. anthropic/claude-sonnet-4.6). HiWay uses direct model IDs (e.g. claude-sonnet-4-6). You can find the exact ID for every supported model in the dashboard's model catalog.
A quick mapping table for the most common ones:
| OpenRouter | HiWay |
|---|---|
| anthropic/claude-opus-4.7 | claude-opus-4-7 |
| anthropic/claude-sonnet-4.6 | claude-sonnet-4-6 |
| anthropic/claude-haiku-4.6 | claude-haiku-4-6 |
| openai/gpt-4.1 | gpt-4-1 |
| openai/gpt-4.1-mini | gpt-4-1-mini |
| google/gemini-2.5-pro | gemini-2-5-pro |
| mistralai/mistral-large | mistral-large |
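If you have many model strings scattered through a codebase, the rename is mechanical. A throwaway helper applying the pattern from the table above (to_hiway_id is ours, not part of any SDK; spot-check its output against the dashboard's model catalog, since we're assuming every ID follows this pattern):

```python
def to_hiway_id(openrouter_id: str) -> str:
    """Convert an OpenRouter author/model ID to a HiWay direct model ID:
    drop the author prefix, replace dots with dashes."""
    model = openrouter_id.split("/", 1)[-1]  # strip "anthropic/", "openai/", etc.
    return model.replace(".", "-")

print(to_hiway_id("anthropic/claude-sonnet-4.6"))  # claude-sonnet-4-6
print(to_hiway_id("openai/gpt-4.1-mini"))          # gpt-4-1-mini
```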
Billing
OpenRouter charged you directly, inclusive of a markup. HiWay doesn't charge for inference — your providers bill you directly at their wholesale rate, and you pay HiWay a flat subscription (Free, Build at $15, Scale at $39, or Business at $249/month depending on volume). Two invoices instead of one. Slightly more setup, significantly more transparency — and smart routing typically cuts the inference side of the bill by 40-85% on top.
Smart routing (optional)
OpenRouter routes to whichever model you specify. HiWay can do that too — or, if you pass the magic model ID auto, it picks the optimal model per request based on complexity in under 1ms:
response = client.chat.completions.create(
model="auto", # HiWay picks the right tier
messages=[{"role": "user", "content": "Hello"}],
)
Short greetings get Haiku. Hard reasoning gets Opus. Everything in between gets Sonnet. You see which model actually ran in the response metadata.
This is optional. If you want full control, specify model names explicitly.
Step 5: Verify
Run your existing test suite against the new base_url. The assertions that passed with OpenRouter should pass identically with HiWay — same API surface, same semantics.
A quick smoke test:
import os
from openai import OpenAI
client = OpenAI(
base_url="https://app.hiway2llm.com/v1",
api_key=os.environ["HIWAY_API_KEY"],
)
r = client.chat.completions.create(
model="claude-haiku-4-6",
messages=[{"role": "user", "content": "Reply with OK."}],
)
assert "ok" in r.choices[0].message.content.lower()
print("HiWay works.")
If that prints "HiWay works.", your migration is done.
Rollback plan
If anything goes wrong, point base_url back to https://openrouter.ai/api/v1 and your app is back on OpenRouter instantly. HiWay doesn't lock you in — the whole point of BYOK is that your provider keys are yours, and any OpenAI-compatible router can use them.
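One way to make the rollback a config change rather than a code change is to key the endpoint off a single environment variable. A sketch (LLM_ROUTER and router_config are our own made-up names):

```python
import os

# Map one env var to the matching endpoint and API-key variable name.
ROUTERS = {
    "hiway":      ("https://app.hiway2llm.com/v1", "HIWAY_API_KEY"),
    "openrouter": ("https://openrouter.ai/api/v1", "OPENROUTER_API_KEY"),
}

def router_config() -> tuple:
    """Return (base_url, api_key_env_name) for the active router."""
    name = os.environ.get("LLM_ROUTER", "hiway")
    return ROUTERS[name]

base_url, key_env = router_config()
# client = OpenAI(base_url=base_url, api_key=os.environ[key_env])
```

Flipping LLM_ROUTER to openrouter then becomes your entire rollback.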
Common gotchas
- Model ID typos: the format is claude-sonnet-4-6 (dashes, no dots, no author prefix). The dashboard has a copy button next to each.
- Missing provider keys: if you call claude-sonnet-4-6 but haven't added your Anthropic key, HiWay returns a clear error message telling you which provider is missing.
- Rate limits: your provider's rate limits apply, not HiWay's. If you were hitting OpenRouter's pooled quota, you'll now hit your own Anthropic quota. Usually this is more headroom, not less.
- Usage stats: OpenRouter's per-request usage UI is in their dashboard. HiWay's is in ours. If you had BI dashboards pointing at OpenRouter, repoint them.
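Since your own provider quota now sets the ceiling, it's worth having retry-with-backoff around calls that can 429. The OpenAI SDK already retries rate-limit errors for you, but if you want the pattern explicit, here is a generic sketch (with_backoff is our own helper; in real code you'd catch your SDK's rate-limit exception rather than RuntimeError):

```python
import time

def with_backoff(call, retries=4, base_delay=0.5):
    """Retry `call` on rate-limit errors with exponential backoff.
    Generic sketch: substitute your SDK's RateLimitError for RuntimeError."""
    for attempt in range(retries):
        try:
            return call()
        except RuntimeError:  # e.g. openai.RateLimitError in real code
            if attempt == retries - 1:
                raise  # out of retries, surface the error
            time.sleep(base_delay * 2 ** attempt)
```

Usage: `with_backoff(lambda: client.chat.completions.create(...))`.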
The takeaway
Five minutes. One base_url change. Same SDK. If the migration takes longer than that, something's wrong — usually a missed provider key or a model ID typo.
The bigger win is what happens after: no markup on inference, transparent billing from each provider, optional smart routing, and BYOK keys that follow you if you ever leave.
Next: why we built HiWay in the first place — the founder's story.