From OpenRouter

Swap two lines, keep every other part of your integration.

This guide walks you through moving an OpenRouter client to HiWay2LLM. Both endpoints speak the OpenAI chat-completions wire format, so the code diff is minimal: you swap the base_url and the API key, and everything else (messages, streaming, tools, retries) stays the same. End-to-end: under 10 minutes.

Prerequisites

You need a HiWay account (sign up at /auth) and at least one provider API key configured in Dashboard → Settings → Providers. HiWay is BYOK (bring your own key): unlike OpenRouter's hosted wallet, you call providers with your own keys and they bill you directly at wholesale rates.

Step 1: Get your HiWay API key

Open Dashboard → Keys and click New key. Your key is shown once and starts with hw_live_. Copy it into your secret manager immediately — we store only the SHA-256 hash and cannot recover it.
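Since the key can't be recovered, it's worth failing fast when it's missing or stale. A minimal sketch of loading it from an environment variable; the HIWAY_API_KEY name is our assumption — use whatever your secret manager injects:

```python
import os

def load_hiway_key(env_var: str = "HIWAY_API_KEY") -> str:
    """Read the key from the environment and fail fast on a bad value.

    The env var name is an assumption; pick whatever your deployment uses.
    """
    key = os.environ.get(env_var, "")
    if not key.startswith("hw_live_"):
        raise RuntimeError(f"{env_var} is missing or does not look like a hw_live_ key")
    return key
```

This catches the common migration mistake of leaving an old sk-or-v1-... OpenRouter key in the environment.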

Step 2: Swap your configuration

On OpenRouter you were pointing at https://openrouter.ai/api/v1 with an sk-or-v1-... key, and likely using fully-qualified model slugs like anthropic/claude-3.5-sonnet. On HiWay you point at https://app.hiway2llm.com/v1 with a hw_live_... key, and you can keep specific model ids or switch to model: "auto" to let the router pick the cheapest capable model per request.

from openai import OpenAI

client = OpenAI(
    base_url="https://app.hiway2llm.com/v1",
    api_key="hw_live_YOUR_KEY",
)

response = client.chat.completions.create(
    model="auto",  # or "anthropic/claude-sonnet-4-5", "openai/gpt-4o", ...
    messages=[{"role": "user", "content": "Explain entropy in one sentence"}],
)

print(response.choices[0].message.content)
print("Routed to:", response.model)

Step 3: Verify

Send one request and inspect the response. Look at the X-HiWay-Routed-Model and X-HiWay-Routed-Tier headers (or the _hiway object on the JSON body) to confirm the router picked a model. Open Dashboard → Usage to see the live timeline — if the request shows up there, you're wired up correctly.
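With the openai Python SDK you can reach those response headers through with_raw_response. A sketch — the routing_info helper is ours, not part of the SDK, and the header names follow this guide:

```python
def routing_info(headers) -> dict:
    """Case-insensitively pull HiWay's routing headers out of a response."""
    lowered = {k.lower(): v for k, v in dict(headers).items()}
    return {
        "model": lowered.get("x-hiway-routed-model"),
        "tier": lowered.get("x-hiway-routed-tier"),
    }

# Against a live client (network call, shown for context):
# raw = client.chat.completions.with_raw_response.create(
#     model="auto",
#     messages=[{"role": "user", "content": "ping"}],
# )
# print(routing_info(raw.headers))  # which model/tier the router picked
# completion = raw.parse()          # the usual ChatCompletion object
```

If you'd rather not touch headers, the same information is on the _hiway object in the JSON body.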

Gotchas specific to OpenRouter

  • Model naming: OpenRouter uses slugs like anthropic/claude-3.5-sonnet with the date suffix baked in. HiWay uses our own canonical ids (anthropic/claude-sonnet-4-5, openai/gpt-4o, etc.) — or just pass "auto" and let us pick. Check Dashboard → Models for the full list of what your enabled providers expose.
  • Streaming: identical on both — pass stream: true and consume SSE chunks the same way. No client-side change needed.
  • Tools / function calling: identical via the OpenAI tools / tool_choice fields. No rewrite.
  • Model catalog: OpenRouter aggregates community providers (Together, Fireworks, DeepInfra, open-source finetunes). HiWay routes through official providers you bring via BYOK (OpenAI, Anthropic, Google, Mistral, DeepSeek). If you relied on a niche community model on OpenRouter, check our catalog before migrating that workload.
  • Accounting: OpenRouter shows a credit balance in their dashboard. HiWay separates concerns — your provider charges hit your OpenAI/Anthropic card directly at wholesale rates, and HiWay bills you a flat monthly subscription for the routing layer. No per-token markup.
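If you'd rather translate old OpenRouter slugs explicitly than pass "auto" everywhere, a small lookup table at the call site works. The mapping below is illustrative only (confirm the real ids in Dashboard → Models), and the fall-back-to-auto behavior is our suggestion, not a HiWay feature:

```python
# Illustrative mapping, not an official table; confirm ids in Dashboard → Models.
SLUG_MAP = {
    "anthropic/claude-3.5-sonnet": "anthropic/claude-sonnet-4-5",
    "openai/gpt-4o": "openai/gpt-4o",
}

def to_hiway_model(openrouter_slug: str) -> str:
    """Map an OpenRouter slug to a HiWay id; unknown slugs fall back to the router."""
    return SLUG_MAP.get(openrouter_slug, "auto")
```

Falling back to "auto" means a niche community model with no HiWay equivalent degrades gracefully to whatever the router deems capable, rather than erroring.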

You're done

The whole migration should take less than 10 minutes end-to-end. Ping support@hiway2llm.com if you get stuck.