Gateway quickstart (without @routerbrain/sdk)
You can call your RouterBrain OpenAI-compatible HTTP API directly, without the SDK. Authoritative behavior is documented in the HTTP service README shipped with your deployment; this page covers the minimal path and common pitfalls.
Base URL and paths
Public routes live under /v1/*, for example:
- https://your-host:8080/v1/chat/completions
- https://your-host:8080/v1/models
Point the base URL at a reachable gateway process; the /v1 prefix matches the OpenAI ecosystem, so a client's baseURL can end at …/v1.
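Because the /v1 prefix is part of the public path, endpoint URLs are just the base joined with the route names above. A minimal sketch (the host and port are placeholders for your deployment):

```python
# Hypothetical base URL; replace with your deployment's reachable host.
base_url = "https://your-host:8080/v1"

# Route names from this page, joined onto the base.
endpoints = {name: f"{base_url.rstrip('/')}/{name}"
             for name in ("chat/completions", "models", "ping")}

print(endpoints["chat/completions"])  # -> https://your-host:8080/v1/chat/completions
print(endpoints["models"])            # -> https://your-host:8080/v1/models
```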
Authentication
Authorization: Bearer <tenant API key plaintext>
- The key must match an API key created for this HTTP surface in the tenant console (often with an sk- prefix).
- The data plane checks the database: only enabled, non-deleted keys work; disabled or deleted keys fail auth (status code per implementation and README).
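A minimal authed request composes the Bearer header as shown above. A sketch using the standard library (the key value is a placeholder; the request is built but not sent):

```python
import urllib.request

# Placeholder tenant key; use the plaintext key from your tenant console.
api_key = "sk-your-tenant-key"

req = urllib.request.Request(
    "https://your-host:8080/v1/models",
    headers={"Authorization": f"Bearer {api_key}"},
)
# The request now carries the Bearer header; nothing has been sent yet.
# urllib.request.urlopen(req) would perform the actual call.
print(req.get_header("Authorization"))  # -> Bearer sk-your-tenant-key
```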
Upstream vs tenant keys (one line)
Upstream models and providers come from DB configuration; platform BYOK resolves per provider. In production, also lock down database access and secret storage—see the deployment README “Upstream” section.
Liveness probe
GET /v1/ping returns JSON (e.g. ok, ts), requires no auth, and is suitable for load balancers and Kubernetes probes.
curl -sS "$GATEWAY_BASE_URL/v1/ping"
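The exact JSON shape is deployment-defined; assuming the ok and ts fields mentioned above, a probe-side check might look like this (the body here is a sample, not a live response):

```python
import json

def is_alive(body: str) -> bool:
    """Return True when the ping body reports ok; the field name is an assumption."""
    try:
        return json.loads(body).get("ok") is True
    except json.JSONDecodeError:
        return False

# Sample body in the shape this page describes (ok, ts).
print(is_alive('{"ok": true, "ts": 1700000000}'))  # -> True
print(is_alive("not json"))                        # -> False
```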
Relation to the SDK
The same routes are used by @routerbrain/sdk: the RouterBrain serverURL is the API root (empty, /, or /v1), and the SDK appends chat/completions, etc.
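Since the SDK accepts an API root with or without the /v1 suffix and appends route paths itself, the variants should resolve to the same endpoint. A sketch of that normalization (illustrative logic, not the SDK's actual code):

```python
def normalize(server_url: str) -> str:
    """Normalize an API root so it ends in /v1; illustrative, not the SDK's code."""
    root = server_url.rstrip("/")
    return root if root.endswith("/v1") else root + "/v1"

# A bare host, a trailing slash, and an explicit /v1 all land on the same route.
for variant in ("https://your-host:8080",
                "https://your-host:8080/",
                "https://your-host:8080/v1"):
    print(normalize(variant) + "/chat/completions")
    # -> https://your-host:8080/v1/chat/completions (for all three)
```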
Next steps
- HTTP routes overview
- Auth, tenancy, and headers
- Models and routing
- Chat Completions and input_file
- Tool calls and streaming
- Troubleshooting
- Editors and IDEs
- Agent runtimes (OpenClaw, Hermes, …)
- cURL / fetch examples
- HTTP service README shipped with your deployment