
OpenAI-Compatible Provider Primitives Specification

Scope

This package owns reusable OpenAI-compatible Chat Completions transport primitives for Robota provider packages. It provides message conversion, tool schema conversion, Chat Completions response parsing, streaming assembly, and endpoint probing that can be composed by branded or model-family providers.

Boundaries

  • Does not own an end-user provider class. Provider classes belong to packages such as agent-provider-openai and agent-provider-gemma.
  • Does not own OpenAI account semantics, API-key defaults, payload logging products, or OpenAI-branded public compatibility.
  • Does not own Gemma-specific reasoning marker or native tool-call text policy. Gemma projection strategies belong to agent-provider-gemma.
  • Does not own generic agent orchestration, universal message contracts, or executor contracts. Those belong to agent-core.

Architecture Overview

src/
  index.ts                 # public exports
  types.ts                 # OpenAI-compatible transport types
  message-converter.ts     # universal message and tool conversion
  response-parser.ts       # Chat Completions response and chunk parsing
  stream-assembler.ts      # streaming chunk assembly with optional injected projection
  endpoint-probe.ts        # OpenAI-compatible /models endpoint probe

The package is a functional core for provider transport logic. Concrete providers remain the imperative shell that creates SDK clients, handles credentials, logs payloads, and chooses model-family projection behavior.
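This functional-core / imperative-shell split can be sketched as follows. All names below are illustrative stand-ins rather than the package's real exports, and the payload shape is an assumption:

```typescript
// Pure core: a deterministic data transformation, easy to unit test
// in isolation (stand-in for the package's conversion primitives).
function toWireMessages(
  history: Array<{ role: string; content: string }>,
): Array<{ role: string; content: string }> {
  return history.map((m) => ({ role: m.role, content: m.content }));
}

// Imperative shell: a concrete provider creates the SDK client, attaches
// credentials, and logs payloads. Here the network call is an injected
// dependency, so the shell stays testable.
async function chat(
  history: Array<{ role: string; content: string }>,
  send: (payload: object) => Promise<string>,
): Promise<string> {
  const payload = { model: "some-model", messages: toWireMessages(history) };
  return send(payload); // all I/O lives in the injected `send`
}
```

The core stays free of credentials, clients, and logging, which is what lets multiple branded providers compose the same primitives.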

Type Ownership

| Type | Location | Purpose |
| --- | --- | --- |
| `IOpenAICompatibleChatRequestParams` | `src/types.ts` | Chat Completions request shape used by provider shells. |
| `IOpenAICompatibleStreamRequestParams` | `src/types.ts` | Streaming Chat Completions request shape. |
| `IOpenAICompatibleError` | `src/types.ts` | Minimal upstream error contract used for message wrapping. |
| `IOpenAICompatibleLogData` | `src/types.ts` | Model-neutral payload log summary contract. |
| `TOpenAICompatibleTextProjector` | `src/types.ts` | Stateful or stateless text projection hook for model-family providers. |
| `TOpenAICompatibleTextProjectorFlush` | `src/types.ts` | Flush hook for stateful streaming projectors that hold partial marker prefixes. |
| `IOpenAICompatibleToolCallTextProjection` | `src/types.ts` | Result from an injected provider-owned text-to-tool-call projector. |
| `IOpenAICompatibleToolCallTextProjector` | `src/types.ts` | Provider-owned projector that converts known native tool-call text into tool calls. |
| `IOpenAICompatibleStreamAssemblyOptions` | `src/types.ts` | Options for assembling streamed chunks into one universal assistant message. |
| `IOpenAICompatibleModelsResponse` | `src/endpoint-probe.ts` | Minimal /models response contract used for endpoint probes. |
| `TOpenAICompatibleFetch` | `src/endpoint-probe.ts` | Fetch adapter contract for testable OpenAI-compatible endpoint probes. |
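The projector and flush hooks above can be illustrated with a toy stateful projector. The type aliases below are simplified assumptions about the real contracts in `src/types.ts`, and the `<thought>…</thought>` marker pair and helper names are hypothetical:

```typescript
// Simplified assumptions about the projector hook shapes.
type TTextProjector = (delta: string) => string;
type TTextProjectorFlush = () => string;

// Returns how many trailing characters of `text` could be the start of
// `marker` (so they must be held back until the next chunk arrives).
function partialSuffix(text: string, marker: string): number {
  const max = Math.min(text.length, marker.length - 1);
  for (let k = max; k > 0; k--) {
    if (marker.startsWith(text.slice(text.length - k))) return k;
  }
  return 0;
}

// Toy stateful projector that strips a hypothetical <thought>…</thought>
// marker pair from streamed text, buffering partial marker prefixes
// across chunk boundaries; flush() releases any held-back tail.
function createMarkerStrippingProjector(): {
  project: TTextProjector;
  flush: TTextProjectorFlush;
} {
  const OPEN = "<thought>";
  const CLOSE = "</thought>";
  let pending = "";
  let inMarker = false;

  const project: TTextProjector = (delta) => {
    let text = pending + delta;
    pending = "";
    let out = "";
    while (text.length > 0) {
      const marker = inMarker ? CLOSE : OPEN;
      const i = text.indexOf(marker);
      if (i >= 0) {
        if (!inMarker) out += text.slice(0, i);
        text = text.slice(i + marker.length);
        inMarker = !inMarker;
      } else {
        const hold = partialSuffix(text, marker);
        if (!inMarker) out += text.slice(0, text.length - hold);
        pending = inMarker ? "" : text.slice(text.length - hold);
        text = "";
      }
    }
    return out;
  };

  const flush: TTextProjectorFlush = () => {
    const tail = pending;
    pending = "";
    return tail;
  };

  return { project, flush };
}
```

The flush hook matters exactly when a stream ends while the projector is still holding a prefix that might have become a marker.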

Public API Surface

| Export | Kind | Description |
| --- | --- | --- |
| `convertToOpenAICompatibleMessages` | function | Converts `TUniversalMessage[]` to OpenAI Chat Completions messages. |
| `convertToOpenAICompatibleTools` | function | Converts Robota `IToolSchema[]` to OpenAI function tools. |
| `OpenAICompatibleResponseParser` | class | Parses full responses and streaming chunks into `TUniversalMessage`. |
| `assembleOpenAICompatibleStream` | function | Assembles streaming chunks into one assistant message, emits projected text deltas, and exits deterministically on abort. |
| `probeOpenAICompatibleProfile` | function | Probes a profile's /models endpoint without making concrete provider assumptions. |
| types from `types.ts` | type exports | Transport request, logging, error, and projection contracts. |
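As a rough illustration of what `convertToOpenAICompatibleMessages` does, the sketch below maps a simplified universal message shape onto Chat Completions messages and raises the error patterns listed under Error Taxonomy. The `TUniversalMessage` fields shown are assumptions; the real contract is owned by agent-core:

```typescript
// Simplified assumption of the universal message contract (agent-core owns
// the real TUniversalMessage shape).
type TUniversalMessage =
  | { role: "user" | "system" | "assistant"; content: string }
  | { role: "tool"; content: string; toolCallId?: string };

type TChatCompletionMessage =
  | { role: "user" | "system" | "assistant"; content: string }
  | { role: "tool"; content: string; tool_call_id: string };

// Illustrative conversion: map roles onto the wire shape and fail loudly
// on the conditions listed in the Error Taxonomy.
function convertMessages(messages: TUniversalMessage[]): TChatCompletionMessage[] {
  return messages.map((m) => {
    switch (m.role) {
      case "user":
      case "system":
      case "assistant":
        return { role: m.role, content: m.content };
      case "tool":
        if (!m.toolCallId) {
          throw new Error(`Tool message missing toolCallId: ${JSON.stringify(m)}`);
        }
        return { role: "tool", content: m.content, tool_call_id: m.toolCallId };
      default:
        throw new Error(`Unsupported message role: ${(m as { role: string }).role}`);
    }
  });
}
```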

Extension Points

  • Providers may pass a TOpenAICompatibleTextProjector and optional flush hook into response parsing or stream assembly to transform model-family output before user-facing rendering.
  • Providers may pass an IOpenAICompatibleToolCallTextProjector when a documented provider-owned serving template emits native tool-call text instead of OpenAI tool_calls. The shared package only calls the injected strategy; it must not infer model names, tool names, or prompt directives.
  • Providers own client creation and may use any OpenAI-compatible endpoint that the OpenAI SDK can target.
  • Providers own payload logging and diagnostic raw-data retention policy.
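A provider-owned tool-call text projector might look like the following sketch, which assumes a hypothetical serving template that emits each tool call as a single JSON line such as `{"tool": "get_weather", "args": {…}}`. The shape and function names are assumptions, not the package's real contracts:

```typescript
// Simplified assumption of the projection result contract.
interface IToolCallTextProjection {
  toolCalls: Array<{ name: string; arguments: string }>;
  remainingText: string;
}

// Hypothetical provider-owned strategy: the shared package would only
// call this; it never infers model names, tool names, or prompt
// directives itself.
function projectToolCallText(text: string): IToolCallTextProjection {
  const toolCalls: Array<{ name: string; arguments: string }> = [];
  const remaining: string[] = [];
  for (const line of text.split("\n")) {
    const trimmed = line.trim();
    if (trimmed.startsWith("{")) {
      try {
        const parsed = JSON.parse(trimmed) as { tool?: unknown; args?: unknown };
        if (typeof parsed.tool === "string") {
          toolCalls.push({
            name: parsed.tool,
            arguments: JSON.stringify(parsed.args ?? {}),
          });
          continue; // consumed as a tool call; keep it out of user text
        }
      } catch {
        // Not valid JSON: treat as ordinary text.
      }
    }
    remaining.push(line);
  }
  return { toolCalls, remainingText: remaining.join("\n") };
}
```

Keeping this strategy in the concrete provider package is what lets the shared primitives stay free of model-family policy.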

Error Taxonomy

| Condition | Error pattern | Source |
| --- | --- | --- |
| Unsupported universal message role | `Unsupported message role: <role>` | `message-converter.ts` |
| Missing tool message id | `Tool message missing toolCallId: <json>` | `message-converter.ts` |
| Missing response choice | `OpenAI-compatible response parsing failed: No choices found...` | `response-parser.ts` |
| Chunk parsing failure | `OpenAI-compatible chunk parsing failed: <message>` | `response-parser.ts` |

Test Strategy

  • Unit tests cover message conversion for user, assistant, system, tool, and function tools.
  • Unit tests cover full response parsing and streaming chunk parsing.
  • Unit tests cover stream assembly with text delta callbacks, native tool-call assembly, injected text-to-tool-call projection, abort handling while awaiting the next chunk, optional projection, and projector flush behavior.
  • Unit tests cover endpoint probe skip, success, and HTTP failure behavior.
  • Provider packages must add integration tests proving they compose this package without changing their public behavior.
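Endpoint probe tests can stub the network through a fetch adapter in the spirit of `TOpenAICompatibleFetch`. The sketch below is a simplified assumption of the probe's shape, not the real `probeOpenAICompatibleProfile` signature:

```typescript
// Minimal fetch adapter contract (simplified assumption modeled on the
// spec's TOpenAICompatibleFetch idea).
type TFetchLike = (url: string) => Promise<{
  ok: boolean;
  status: number;
  json(): Promise<unknown>;
}>;

// Probe a /models endpoint through the injected adapter so unit tests
// never touch a real network endpoint.
async function probeModels(baseUrl: string, fetchImpl: TFetchLike): Promise<string[]> {
  const res = await fetchImpl(`${baseUrl}/models`);
  if (!res.ok) {
    throw new Error(`Endpoint probe failed: HTTP ${res.status}`);
  }
  const body = (await res.json()) as { data?: Array<{ id: string }> };
  return (body.data ?? []).map((model) => model.id);
}
```

A unit test then passes a stub adapter returning a canned /models body, covering the success and HTTP-failure branches without any live endpoint.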

Class Contract Registry

Interface Implementations

None. This package exposes pure functions and a parser utility class.

Inheritance Chains

None. Provider inheritance remains in concrete provider packages.

Cross-Package Port Consumers

| Port (Owner) | Consumer | Location |
| --- | --- | --- |
| `TUniversalMessage` (agent-core) | converter/parser functions | `src/message-converter.ts`, `src/response-parser.ts`, `src/stream-assembler.ts` |
| `IToolSchema` (agent-core) | `convertToOpenAICompatibleTools` | `src/message-converter.ts` |
| `IProviderProfileConfig` (agent-core) | `probeOpenAICompatibleProfile` | `src/endpoint-probe.ts` |

Released under the MIT License.