## `client.chat.messages` (Node only)
Injected on the `RouterBrain` prototype via `@routerbrain/sdk/node`: reads local files, fetches remote URLs, and turns images / audio / video / PDFs into OpenAI-shaped `user.content` arrays (file-like parts use `type: "input_file"`, matching server Chat validation).
### Required side-effect import
```ts
import "@routerbrain/sdk/node";
```

Alternatively, make a value import from the subpath (e.g. `import { createRouterBrainFromEnv } from "@routerbrain/sdk/node"`) so the module actually executes. `import type { … } from "@routerbrain/sdk/node"` alone does NOT attach `messages`.
After mount:
`await client.chat.messages({ ... })` is equivalent to `await client.chat.messages.fromPaths({ ... })`; use `await client.chat.messages.fromTurns(turns, options?)` for multi-turn.
### `fromPaths`: return value
Typically `{ content, plugins }`:
- `content`: use as `{ role: "user", content }` in `chat.send`.
- `plugins`: may include `{ pdf: … }` when a PDF is present; otherwise `{}` (an incoming `plugins.pdf` may be ignored if unused).
If the input is PDF-only with almost no prompt text, the SDK inserts a short Chinese placeholder (the same one used in the repo samples) to avoid an empty user message.
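As a rough sketch of the single-turn return shape, the following hand-builds an equivalent `content` array. The part shapes follow this doc; `buildContent` itself is a hypothetical helper, not the SDK's implementation:

```ts
// Hypothetical sketch: hand-build the OpenAI-shaped content array that
// fromPaths would return for a prompt plus one local PDF. The part
// shapes follow the doc; this builder is illustrative, not SDK code.
type TextPart = { type: "text"; text: string };
type InputFilePart = { type: "input_file"; filename: string; file_data: string };
type ContentPart = TextPart | InputFilePart;

function buildContent(
  prompt: string,
  pdf?: { filename: string; b64: string },
): ContentPart[] {
  // The prompt string is trimmed and prefixed as a type: "text" part.
  const parts: ContentPart[] = [{ type: "text", text: prompt.trim() }];
  if (pdf) {
    // A local PDF becomes an input_file part with filename + data URL.
    parts.push({
      type: "input_file",
      filename: pdf.filename,
      file_data: `data:application/pdf;base64,${pdf.b64}`,
    });
  }
  return parts;
}

const content = buildContent("Summarize the PDF.", {
  filename: "doc.pdf",
  b64: "JVBERi0=",
});
// content[0] is the text part; content[1] is the input_file part
```

The result plugs straight into `{ role: "user", content }`, just like the real `fromPaths` output.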
### Common fields (summary)
| Field | Role |
|---|---|
| `prompt` | String: trimmed and prefixed as a `type: "text"` part. Array: shallow-copied OpenAI content parts; path-like media are appended per the rules below. |
| `image` / `audio` / `video` / `pdfPath` | Each accepts `https?://` URLs or local paths; `string` or `string[]`. Local PDF/video → `input_file` (`filename` + `file_data` data URL); remote PDF/video URL → `input_file` + `file_url`. |
| `plugins` | Optional `RouterBrainChatPlugins` (mainly `pdf`). With a PDF present and `plugins` omitted, the SDK may default to the native engine (the typical deployment default). |
### Order when combined with `prompt`
When `prompt` and path fields coexist, the append order is: all images → all audio → all video → all PDFs. For strict interleaving, pass a `prompt` array of hand-built parts, or hand-write the `messages` yourself.
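The fixed append order can be sketched as a small merge helper (illustrative only; `appendOrder` is not an SDK export):

```ts
// Illustrative sketch of the documented append order:
// all images → all audio → all video → all PDFs,
// regardless of the order the fields were supplied in.
type MediaKind = "image" | "audio" | "video" | "pdf";

function appendOrder(
  paths: Partial<Record<MediaKind, string[]>>,
): Array<{ kind: MediaKind; path: string }> {
  const order: MediaKind[] = ["image", "audio", "video", "pdf"];
  // Walk the fixed order and flatten each kind's paths in turn.
  return order.flatMap((kind) =>
    (paths[kind] ?? []).map((path) => ({ kind, path })),
  );
}

const parts = appendOrder({
  pdf: ["a.pdf"],
  image: ["x.png", "y.png"],
  audio: ["s.wav"],
});
// parts order: x.png, y.png, s.wav, a.pdf
```

This is why interleaving (image, then PDF, then another image) requires a hand-built `prompt` array instead.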
### Remote URLs and network
Remote image / audio / video / PDF inputs require `fetch`. If the network or auth is flaky, prefer local paths, or `POST /v1/files` and reference the `file_id` in a hand-built `input_file`.
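Hand-building such an `input_file` part might look like this. It is a sketch of the local-vs-remote rule above; `toInputFilePart` is a hypothetical helper (the SDK does this internally):

```ts
import { readFileSync } from "node:fs";
import { basename } from "node:path";

// Hypothetical helper: build an input_file part per the doc's rule.
// Remote URL → file_url (server fetches it); local path → inline
// filename + file_data data URL.
type InputFilePart =
  | { type: "input_file"; file_url: string }
  | { type: "input_file"; filename: string; file_data: string };

function toInputFilePart(
  pathOrUrl: string,
  mime = "application/pdf",
): InputFilePart {
  if (/^https?:\/\//i.test(pathOrUrl)) {
    // Remote: pass the URL through instead of downloading it here.
    return { type: "input_file", file_url: pathOrUrl };
  }
  // Local: inline the bytes as a base64 data URL.
  const b64 = readFileSync(pathOrUrl).toString("base64");
  return {
    type: "input_file",
    filename: basename(pathOrUrl),
    file_data: `data:${mime};base64,${b64}`,
  };
}

const remote = toInputFilePart("https://example.com/doc.pdf");
// remote: { type: "input_file", file_url: "https://example.com/doc.pdf" }
```

A `file_id` from `POST /v1/files` would replace the `file_data`/`file_url` field in the same part shape, keeping large payloads off the chat request.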
### `fromTurns`: multi-turn rules
- Each `user` turn may still attach paths per the `fromPaths` rules; `assistant`/`system` turns are text-only (a `prompt` string or equivalent). Attaching paths to them throws `ContentBuildError`.
- Returns `{ messages, plugins }` for `chatRequest.messages` (a different shape from the single-turn `content`).
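The turn rules can be sketched as a validator. The `ContentBuildError` name comes from this doc; the turn shape and the checker itself are illustrative assumptions:

```ts
// Sketch of the documented turn rules: user turns may carry path
// attachments, assistant/system turns are text-only. Not SDK code.
class ContentBuildError extends Error {}

type Turn = {
  role: "user" | "assistant" | "system";
  prompt: string;
  pdfPath?: string;
  image?: string;
};

function checkTurn(turn: Turn): void {
  // Only user turns may attach media paths.
  if (turn.role !== "user" && (turn.pdfPath || turn.image)) {
    throw new ContentBuildError(`${turn.role} turns cannot attach paths`);
  }
}

checkTurn({ role: "user", prompt: "See the file.", pdfPath: "/tmp/doc.pdf" }); // ok

let threw = false;
try {
  checkTurn({ role: "assistant", prompt: "ok", image: "/tmp/x.png" });
} catch (e) {
  threw = e instanceof ContentBuildError;
}
// threw === true
```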
### With `chat.send` (streaming)
```ts
import { RouterBrain, extractTextDelta } from "@routerbrain/sdk";
import "@routerbrain/sdk/node";

const client = new RouterBrain({
  serverURL: process.env.GATEWAY_BASE_URL!,
  ...(process.env.GATEWAY_API_KEY ? { apiKey: process.env.GATEWAY_API_KEY } : {}),
  defaultHeaders: { "x-trace-id": "my-trace", "x-agent-name": "my-app" },
});

const { content, plugins } = await client.chat.messages({
  prompt: "Describe the image and summarize the PDF.",
  image: "/path/to/photo.png",
  pdfPath: "/path/to/doc.pdf",
  plugins: { pdf: { engine: "ocr", max_pages: 10 } },
});

const stream = await client.chat.send({
  chatRequest: {
    model: process.env.GATEWAY_MODEL!,
    messages: [{ role: "user", content }],
    stream: true,
    plugins,
  },
});

for await (const chunk of stream) {
  process.stdout.write(extractTextDelta(chunk));
}
```
### Helpers without `fromPaths`
The main export still offers `fetchAudioAsInputAudioParts`, `mimeFromExt`, `isHttpUrl`, etc. (see the README). `fetchAudioAsInputAudioParts` uses Node `Buffer`, so it is Node-first.
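For orientation, here are plausible shapes for two of these helpers. Their behavior is assumed from the names alone, not copied from the SDK source:

```ts
// ASSUMED behavior, inferred from the helper names; the real SDK
// exports may differ in signature and supported extensions.
function isHttpUrl(s: string): boolean {
  return /^https?:\/\//i.test(s);
}

const MIME_BY_EXT: Record<string, string> = {
  ".png": "image/png",
  ".jpg": "image/jpeg",
  ".wav": "audio/wav",
  ".mp4": "video/mp4",
  ".pdf": "application/pdf",
};

function mimeFromExt(path: string): string | undefined {
  const dot = path.lastIndexOf(".");
  // No extension → no MIME type; lookup is case-insensitive.
  return dot >= 0 ? MIME_BY_EXT[path.slice(dot).toLowerCase()] : undefined;
}

// isHttpUrl("https://a/b.pdf") → true
// mimeFromExt("photo.PNG") → "image/png"
```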
### Browsers (one line)
`client.chat.messages` depends on `fs` and other Node APIs, so it is Node-only. In the browser, use `fetch` plus hand-built `messages`, or your own file/base64 pipeline.
### See also
- `chat.send`: PDF preprocessing (HTTP)
- `packages/sdk/README.md`: full `fromPaths`/`fromTurns` tables and sample scripts