client.chat.send
`chat.send` issues `POST …/v1/chat/completions` against the `serverURL` passed to `RouterBrain`. Build `chatRequest` as you would an OpenAI Chat Completions request (snake_case `messages`, content parts, etc.); the SDK normalizes it first (including mapping the legacy `file` / `video_url` part types to `input_file`), then serializes it to wire JSON.
Streaming vs non-streaming
| Mode | chatRequest | Return value | Consumption |
|---|---|---|---|
| Streaming | stream: true | Async iterable SSE event stream | for await + extractTextDelta(chunk) for text deltas |
| Non-streaming | stream omitted or false | Full ChatResult (loose typing) | Parse choices[0].message, etc. yourself |
In streaming mode, do not run `extractTextDelta` on the whole response; in non-streaming mode, do not assume a narrow TypeScript type; assert or validate in your app if needed.
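The streaming consumption pattern can be sketched as follows. This is a self-contained illustration: `fakeStream` stands in for the async iterable a real `client.chat.send({ …, stream: true })` call returns, and `extractTextDeltaLike` is a hypothetical local helper mirroring the documented behavior of `extractTextDelta` (read the first choice's `delta.content`).

```typescript
// Minimal sketch: consuming a streaming response chunk by chunk.
type Chunk = { choices: { delta: { content?: string } }[] };

// Stand-in for the SSE stream returned by client.chat.send with stream: true.
async function* fakeStream(): AsyncGenerator<Chunk> {
  yield { choices: [{ delta: { content: "Hello" } }] };
  yield { choices: [{ delta: {} }] }; // e.g. a role-only chunk with no text
  yield { choices: [{ delta: { content: " world" } }] };
}

// Hypothetical stand-in for the SDK's extractTextDelta: returns the text
// delta of the first choice, or "" when the chunk carries none.
function extractTextDeltaLike(chunk: Chunk): string {
  return chunk.choices[0]?.delta.content ?? "";
}

export async function collectText(): Promise<string> {
  let text = "";
  for await (const chunk of fakeStream()) {
    text += extractTextDeltaLike(chunk);
  }
  return text;
}
```

In real code, replace `fakeStream()` with the value awaited from `client.chat.send` and the local helper with the SDK's `extractTextDelta`.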
Cancellation and per-request options
The second argument is RequestOptions, commonly:
- `signal`: an `AbortController.signal`, for timeouts or user cancellation.
- `serverURL`: override the client base URL for this request only (pathname rules still apply).
```ts
const ac = new AbortController();
setTimeout(() => ac.abort(), 30_000);

const stream = await client.chat.send(
  {
    chatRequest: {
      model: "openai/gpt-4o-mini",
      messages: [{ role: "user", content: "long task…" }],
      stream: true,
    },
  },
  { signal: ac.signal },
);
```
Streaming errors
| Situation | SDK throws | Fields |
|---|---|---|
| HTTP status not 2xx | GatewayHttpError | status, body (raw-ish) |
| SSE upstream error event | GatewaySseError | payload (parsed object) |
Wrap the `for await` loop in `try/catch` when streaming; catch `GatewayHttpError` for non-streaming calls too. More: Error types.
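The branching pattern can be sketched as below. The two error classes are defined locally as illustrative stand-ins so the sketch is self-contained; in real code, import `GatewayHttpError` and `GatewaySseError` from the SDK and pass your actual `for await` loop as the callback.

```typescript
// Illustrative stand-ins for the SDK's error classes.
class GatewayHttpError extends Error {
  constructor(public status: number, public body: string) {
    super(`HTTP ${status}`);
  }
}
class GatewaySseError extends Error {
  constructor(public payload: Record<string, unknown>) {
    super("SSE error event");
  }
}

// The documented pattern: wrap the whole consumption loop, then branch on type.
export async function classify(run: () => Promise<void>): Promise<string> {
  try {
    await run(); // e.g. `for await (const chunk of stream) { … }`
    return "ok";
  } catch (err) {
    if (err instanceof GatewayHttpError) return `http ${err.status}`;
    if (err instanceof GatewaySseError) return "sse";
    throw err; // network failures, AbortError, etc. stay unhandled here
  }
}
```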
plugins.pdf vs root pdf_preprocess
The server expects the PDF preprocessing config at the JSON root as `pdf_preprocess`, but the OpenAI-style `plugins.pdf` is more ergonomic to write. On serialization, the SDK:
- Normalizes `chatRequest` (snake → camelCase → wire);
- Strips `plugins` from the wire JSON;
- If `chatRequest.plugins?.pdf` exists, `mergePdfPreprocessIntoBody` merges it into the root `pdf_preprocess`.
If `plugins.pdf` is unset, no root `pdf_preprocess` is added; for the defaults applied when a PDF is present but no config is given, see PDF preprocessing.
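The rewrite step can be sketched as below; `mergePdfIntoRoot` is a simplified hypothetical version of `mergePdfPreprocessIntoBody` operating on an already-normalized wire body, and the `dpi` config key in the test is made up for illustration.

```typescript
// Simplified sketch of the plugins.pdf → root pdf_preprocess rewrite.
// Assumes `body` is the wire JSON after normalization.
type WireBody = Record<string, unknown> & {
  plugins?: { pdf?: Record<string, unknown> };
};

export function mergePdfIntoRoot(body: WireBody): Record<string, unknown> {
  const { plugins, ...rest } = body; // plugins never reaches the wire
  if (plugins?.pdf) {
    return { ...rest, pdf_preprocess: plugins.pdf };
  }
  return rest; // no plugins.pdf: no root key added
}
```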
extractTextDelta scope
For streaming chunks: reads `choices[0].delta.content` (a string or an array of text content parts), optionally falling back to `choices[0].delta.reasoning`.
Not for non-streaming `ChatResult`: read `choices[0].message.content` (or your wrapper) instead.
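For the non-streaming side, a minimal parsing sketch, assuming the OpenAI-style response shape where `content` may be a plain string, an array of `{ type: "text", text }` parts, or `null`; `messageText` is a hypothetical helper name:

```typescript
// Reads the assistant text out of a non-streaming result, handling both
// a plain string and an array of typed content parts.
type Part = { type: string; text?: string };
type ChatResultLike = {
  choices: { message: { content: string | Part[] | null } }[];
};

export function messageText(result: ChatResultLike): string {
  const content = result.choices[0]?.message.content;
  if (typeof content === "string") return content;
  if (Array.isArray(content)) {
    // Keep only text parts; ignore images, tool payloads, etc.
    return content
      .filter((p) => p.type === "text")
      .map((p) => p.text ?? "")
      .join("");
  }
  return ""; // null or missing content
}
```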
See also
- Node: `chat.messages` to build `messages` and `plugins` before `chat.send`
- Chat Completions and `input_file` (HTTP)
- packages/sdk/README.md: `remapOpenAi…`, `stringifyChatRequestOpenAiWire`, exported types