client.chat.send

chat.send issues POST …/v1/chat/completions against the serverURL passed to RouterBrain. Build chatRequest as you would an OpenAI Chat Completions request (snake_case messages, content parts, etc.); the SDK first normalizes it (including mapping legacy file / video_url parts to input_file), then serializes to wire JSON.
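The legacy-part mapping can be pictured with a small sketch. The function name and exact coverage here are illustrative only; the SDK's own remap helpers (remapOpenAi…, see the README) are authoritative:

```typescript
// Illustrative only: retype legacy "file" / "video_url" content parts
// as "input_file" before serialization. The real SDK helper may also
// reshape the part's payload, which this sketch does not attempt.
type ContentPart = { type: string; [k: string]: unknown };

function remapLegacyParts(parts: ContentPart[]): ContentPart[] {
  return parts.map((p) =>
    p.type === "file" || p.type === "video_url"
      ? { ...p, type: "input_file" }
      : p,
  );
}
```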

Streaming vs non-streaming

Mode            chatRequest                Return value                       Consumption
Streaming       stream: true               Async iterable SSE event stream    for await + extractTextDelta(chunk) for text deltas
Non-streaming   stream omitted or false    Full ChatResult (loose typing)     Parse choices[0].message, etc. yourself

In streaming mode, do not run extractTextDelta on the whole response; in non-streaming mode, do not assume a narrow TS type: assert or validate in your app if needed.
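For the non-streaming case, a minimal runtime guard might look like the sketch below. The type and function names here are illustrative, not SDK exports; your app may prefer a schema validator instead:

```typescript
// Illustrative only: validate the loosely typed ChatResult shape
// before trusting it, instead of asserting a narrow TS type.
interface LooseChatResult {
  choices?: Array<{ message?: { content?: unknown } }>;
}

function firstMessageContent(result: LooseChatResult): string {
  const content = result.choices?.[0]?.message?.content;
  if (typeof content !== "string") {
    throw new Error("unexpected ChatResult shape: missing string content");
  }
  return content;
}
```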

Cancellation and per-request options

The second argument is RequestOptions, commonly:

  • signal: AbortController.signal for timeout or user cancel.
  • serverURL: override the client base for this request only (pathname rules still apply).

For example, to abort a streaming request after 30 seconds:
const ac = new AbortController();
setTimeout(() => ac.abort(), 30_000);

const stream = await client.chat.send(
  {
    chatRequest: {
      model: "openai/gpt-4o-mini",
      messages: [{ role: "user", content: "long task…" }],
      stream: true,
    },
  },
  { signal: ac.signal },
);

Streaming errors

Situation                  SDK throws          Fields
HTTP status not 2xx        GatewayHttpError    status, body (raw-ish)
SSE upstream error event   GatewaySseError     payload (parsed object)

Wrap the for await loop in try/catch when streaming; GatewayHttpError can also be thrown by non-streaming requests, so catch it there too. More: Error types.
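A sketch of that pattern, using stand-in error classes that mirror the documented fields rather than the real SDK exports (the chunk shape is also simplified):

```typescript
// Stand-ins for the SDK's error types, carrying the documented fields.
class GatewayHttpError extends Error {
  constructor(public status: number, public body: string) {
    super(`HTTP ${status}`);
  }
}
class GatewaySseError extends Error {
  constructor(public payload: object) {
    super("upstream SSE error");
  }
}

// Consume a stream, distinguishing the two documented failure modes.
async function consume(stream: AsyncIterable<{ text?: string }>): Promise<string> {
  let out = "";
  try {
    for await (const chunk of stream) {
      out += chunk.text ?? "";
    }
  } catch (err) {
    if (err instanceof GatewayHttpError) {
      console.error("gateway HTTP error", err.status, err.body);
    } else if (err instanceof GatewaySseError) {
      console.error("upstream SSE error", err.payload);
    }
    throw err; // rethrow after logging
  }
  return out;
}
```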

plugins.pdf vs root pdf_preprocess

The server expects PDF preprocess config on the JSON root as pdf_preprocess; the OpenAI-style plugins.pdf is more ergonomic to write. On serialization, the SDK:

  1. Normalizes chatRequest (snake → camelCase → wire);
  2. Strips plugins from wire JSON;
  3. If chatRequest.plugins?.pdf exists, mergePdfPreprocessIntoBody merges it to root pdf_preprocess.

If plugins.pdf is unset, no root pdf_preprocess is added; for the defaults applied when a PDF is present but no config is given, see PDF preprocessing.
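The merge step can be sketched as follows. The function name follows the doc, but the body is illustrative, not the SDK implementation:

```typescript
// Illustrative only: lift plugins.pdf to root pdf_preprocess and
// strip plugins, which never reaches the wire JSON.
type WireBody = Record<string, unknown> & { plugins?: { pdf?: object } };

function mergePdfPreprocessIntoBody(body: WireBody): Record<string, unknown> {
  const { plugins, ...rest } = body;            // plugins is always stripped
  return plugins?.pdf !== undefined
    ? { ...rest, pdf_preprocess: plugins.pdf }  // lift pdf config to the root
    : rest;                                     // no pdf config: no root key added
}
```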

extractTextDelta scope

For streaming chunks: reads choices[0].delta.content (a string or an array of text content parts), optionally falling back to choices[0].delta.reasoning.

Not for non-streaming ChatResult; there, read choices[0].message.content (or your wrapper).
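As an illustration of that scope, a simplified stand-in (the SDK's exported extractTextDelta is the source of truth; the chunk type here is assumed):

```typescript
// Illustrative only: what extractTextDelta covers for streaming chunks.
type DeltaChunk = {
  choices?: Array<{
    delta?: {
      content?: string | Array<{ text?: string }>;
      reasoning?: string;
    };
  }>;
};

function extractTextDeltaSketch(chunk: DeltaChunk): string {
  const delta = chunk.choices?.[0]?.delta;
  if (!delta) return "";
  if (typeof delta.content === "string") return delta.content;
  if (Array.isArray(delta.content)) {
    return delta.content.map((p) => p.text ?? "").join("");
  }
  return delta.reasoning ?? ""; // optional fallback to reasoning deltas
}
```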

See also

  • Node: chat.messages to build messages and plugins before chat.send
  • Chat Completions and input_file (HTTP)
  • packages/sdk/README.md: remapOpenAi…, stringifyChatRequestOpenAiWire, exported types

Back to docs home