OpenAI Package Specification

Scope

  • Owns the OpenAI provider integration for Robota, including GPT model access, streaming support, tool/function calling, and provider-bound request/response adaptation.
  • Owns the OpenAI-branded provider shell, provider definition, payload logging, and public compatibility wrappers.
  • Composes @robota-sdk/agent-provider-openai-compatible for reusable Chat Completions message conversion, response parsing, stream assembly, and OpenAI-compatible endpoint probing.
  • Owns payload logging infrastructure with environment-specific implementations (Node.js file-based, browser console-based).
  • Owns OpenAI-specific API type definitions for request parameters, streaming chunks, tool calls, and error structures.
  • Supports OpenAI-compatible Chat Completions endpoints through baseURL, including local endpoints such as LM Studio.
  • Does not own Gemma-family chat-template marker projection; Gemma local models must use agent-provider-gemma.

Boundaries

  • Does not own generic agent orchestration contracts, executor abstractions, or universal message types -- those belong to @robota-sdk/agent-core.
  • Does not own session management, conversation history, or tool execution logic.
  • Keeps OpenAI-specific transport behavior explicit and provider-scoped.
  • Relies on AbstractAIProvider from @robota-sdk/agent-core as the base class for provider implementation.
  • Logger and executor interfaces (ILogger, IExecutor) are imported from @robota-sdk/agent-core, not redefined.

Architecture Overview

Layer Structure

src/
  index.ts                          # Public API surface (re-exports)
  provider.ts                       # OpenAIProvider (extends AbstractAIProvider)
  provider-definition.ts            # provider definition for CLI/runtime composition
  adapter.ts                        # OpenAIConversationAdapter (static message conversion)
  types.ts                          # Provider options and option value types
  types/
    api-types.ts                    # OpenAI-specific API type definitions
  interfaces/
    payload-logger.ts               # IPayloadLogger / IPayloadLoggerOptions contracts
  parsers/
    response-parser.ts              # OpenAIResponseParser (completion + streaming chunk parsing)
  streaming/
    stream-handler.ts               # OpenAIStreamHandler (modular streaming logic)
    stream-assembler.ts             # Assembles Chat Completions stream chunks into one TUniversalMessage
  loggers/
    index.ts                        # Logger barrel exports
    console.ts                      # Subpath entry: ConsolePayloadLogger
    file.ts                         # Subpath entry: FilePayloadLogger
    console-payload-logger.ts       # Browser console logger implementation
    file-payload-logger.ts          # Node.js file logger implementation
    sanitize-openai-log-data.ts     # SSOT sanitization utility for log data

Design Patterns

  • Adapter pattern: OpenAIConversationAdapter provides static methods for bidirectional conversion between TUniversalMessage and OpenAI.Chat.ChatCompletionMessageParam.
  • Strategy pattern (logging): IPayloadLogger interface with two built-in implementations (FilePayloadLogger, ConsolePayloadLogger) and support for custom implementations.
  • Template method: OpenAIProvider extends AbstractAIProvider, overriding chat, chatStream, validateMessages, supportsTools, validateConfig, and dispose.
  • Executor delegation: When an IExecutor is provided, the provider delegates all chat operations to the executor instead of making direct OpenAI API calls, enabling remote execution.
  • Dependency injection: Logger and payload logger are injected via constructor options. Defaults to SilentLogger when no logger is provided.
  • Provider definition: createOpenAIProviderDefinition() exposes defaults, setup prompts, shared OpenAI-compatible probe behavior, and provider construction through the common IProviderDefinition contract so consumers do not branch on type: "openai".
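
The adapter pattern above can be sketched with deliberately reduced message shapes (the real TUniversalMessage lives in @robota-sdk/agent-core and the wire type comes from the openai SDK; the field sets below are simplified for illustration):

```typescript
// Simplified stand-ins for the real types; illustrative only.
interface IUniversalToolCall { id: string; name: string; arguments: string }
interface IUniversalMessageSketch {
  role: 'user' | 'assistant';
  content: string | null;
  toolCalls?: IUniversalToolCall[];
}
interface IOpenAIParamSketch {
  role: 'user' | 'assistant';
  content: string | null;
  tool_calls?: { id: string; type: 'function'; function: { name: string; arguments: string } }[];
}

// Static adapter mirroring the shape of OpenAIConversationAdapter:
// bidirectional, stateless conversion between the two formats.
class ConversationAdapterSketch {
  static toOpenAI(msg: IUniversalMessageSketch): IOpenAIParamSketch {
    return {
      role: msg.role,
      content: msg.content,
      tool_calls: msg.toolCalls?.map((t) => ({
        id: t.id,
        type: 'function',
        function: { name: t.name, arguments: t.arguments },
      })),
    };
  }

  static toUniversal(param: IOpenAIParamSketch): IUniversalMessageSketch {
    return {
      role: param.role,
      content: param.content,
      toolCalls: param.tool_calls?.map((t) => ({
        id: t.id,
        name: t.function.name,
        arguments: t.function.arguments,
      })),
    };
  }
}
```

Because both methods are static and pure, conversion can be round-tripped without constructing a provider.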

Type Ownership

Types owned by this package (SSOT):

| Type | Kind | File | Description |
| --- | --- | --- | --- |
| IOpenAIProviderOptions | Interface | types.ts | Constructor options for OpenAIProvider |
| TOpenAIProviderOptionValue | Type alias | types.ts | Union of valid provider option value types |
| IOpenAIChatRequestParams | Interface | types/api-types.ts | OpenAI chat completion request parameters |
| IOpenAIStreamRequestParams | Interface | types/api-types.ts | OpenAI streaming request parameters (extends chat params) |
| IOpenAIToolCall | Interface | types/api-types.ts | OpenAI tool call structure |
| IOpenAIAssistantMessage | Interface | types/api-types.ts | OpenAI assistant message with optional tool calls |
| IOpenAIToolMessage | Interface | types/api-types.ts | OpenAI tool response message |
| IOpenAIStreamDelta | Interface | types/api-types.ts | Streaming chunk delta structure |
| IOpenAIStreamChunk | Interface | types/api-types.ts | Full streaming chunk structure |
| IOpenAIError | Interface | types/api-types.ts | OpenAI error structure for type-safe error handling |
| IOpenAILogData | Interface | types/api-types.ts | Payload logging data structure |
| IPayloadLogger | Interface | interfaces/payload-logger.ts | Contract for payload logger implementations |
| IPayloadLoggerOptions | Interface | interfaces/payload-logger.ts | Configuration options for payload loggers |
| DEFAULT_OPENAI_PROVIDER_MODEL | Constant | provider-definition.ts | Optional setup default model; currently undefined to keep this provider model-family neutral |
| DEFAULT_OPENAI_COMPATIBLE_PROVIDER_API_KEY | Constant | provider-definition.ts | Package-owned local OpenAI-compatible API-key default |
| DEFAULT_OPENAI_COMPATIBLE_PROVIDER_BASE_URL | Constant | provider-definition.ts | Package-owned local OpenAI-compatible base URL default |

Types imported from @robota-sdk/agent-core (not owned here):

| Type | Usage |
| --- | --- |
| TUniversalMessage | Message format for chat/chatStream input and output |
| IAssistantMessage | Narrowed assistant message type with toolCalls |
| IChatOptions | Chat method options (model, temperature, maxTokens, tools) |
| IToolCall | Universal tool call structure |
| IToolSchema | Tool definition schema for function calling |
| IExecutor | Executor interface for delegated execution |
| ILogger | Logger interface for dependency-injected logging |
| TProviderOptionValueBase | Base type for provider option values |
| AbstractAIProvider | Base class for all AI providers |
| SilentLogger | Default no-op logger |
| LocalExecutor | Local executor implementation (used in tests) |

Public API Surface

Main entry point (@robota-sdk/agent-provider-openai)

| Export | Kind | Description |
| --- | --- | --- |
| OpenAIProvider | Class | Primary provider class; extends AbstractAIProvider |
| OpenAIConversationAdapter | Class | Static utility for message format conversion |
| createOpenAIProviderDefinition | Function | Returns IProviderDefinition for branch-free composition |
| DEFAULT_OPENAI_PROVIDER_MODEL | Constant | Optional setup default model; currently undefined |
| DEFAULT_OPENAI_COMPATIBLE_PROVIDER_API_KEY | Constant | Default local endpoint API key owned by this package |
| DEFAULT_OPENAI_COMPATIBLE_PROVIDER_BASE_URL | Constant | Default local endpoint base URL owned by this package |
| IOpenAIProviderOptions | Interface | Provider constructor options |
| TOpenAIProviderOptionValue | Type alias | Valid option value types |
| IPayloadLogger | Interface (type-only) | Payload logger contract |
| IPayloadLoggerOptions | Interface (type-only) | Payload logger configuration |
| All exports from types.ts | Mixed | Provider options and value types |
| All exports from adapter.ts | Class | Conversation adapter |

Subpath entry points

| Subpath | Export | Description |
| --- | --- | --- |
| @robota-sdk/agent-provider-openai/loggers/file | FilePayloadLogger | Node.js file-based payload logger |
| @robota-sdk/agent-provider-openai/loggers/console | ConsolePayloadLogger | Browser console-based payload logger |

Internal (not exported from main entry)

| Name | File | Description |
| --- | --- | --- |
| OpenAIResponseParser | parsers/response-parser.ts | Parses completions and streaming chunks into TUniversalMessage |
| OpenAIStreamHandler | streaming/stream-handler.ts | Modular streaming generator for raw streaming APIs |
| assembleOpenAIStream | streaming/stream-assembler.ts | Assembles streamed Chat Completions chunks for OpenAIProvider.chat() |
| sanitizeOpenAILogData | loggers/sanitize-openai-log-data.ts | Deep-copy sanitization for log payloads |

Extension Points

Custom Payload Logger

Consumers can implement the IPayloadLogger interface to create custom logging backends:

```typescript
interface IPayloadLogger {
  isEnabled(): boolean;
  logPayload(payload: IOpenAILogData, type: 'chat' | 'stream'): Promise<void>;
}
```

Pass the implementation via IOpenAIProviderOptions.payloadLogger.
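
A minimal in-memory backend might look like the sketch below (useful for tests or a custom telemetry sink). IOpenAILogData is reduced to a two-field stand-in here; the real interface lives in types/api-types.ts:

```typescript
// Simplified stand-ins for IOpenAILogData and IPayloadLogger; illustrative only.
interface ILogDataSketch { model: string; messages: unknown[] }
interface IPayloadLoggerSketch {
  isEnabled(): boolean;
  logPayload(payload: ILogDataSketch, type: 'chat' | 'stream'): Promise<void>;
}

// In-memory payload logger: records entries instead of writing to disk or console.
class MemoryPayloadLogger implements IPayloadLoggerSketch {
  readonly entries: { payload: ILogDataSketch; type: 'chat' | 'stream' }[] = [];

  constructor(private enabled = true) {}

  isEnabled(): boolean {
    return this.enabled;
  }

  async logPayload(payload: ILogDataSketch, type: 'chat' | 'stream'): Promise<void> {
    if (!this.enabled) return; // disabled loggers are a cheap no-op
    this.entries.push({ payload, type });
  }
}
```

An instance would then be passed as the payloadLogger constructor option in place of the built-in file or console loggers.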

Executor Delegation

Consumers can provide an IExecutor implementation (e.g., LocalExecutor, RemoteExecutor) via IOpenAIProviderOptions.executor to delegate all chat operations. When an executor is set, no API key or client instance is required.
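
The branching a provider performs can be sketched as follows; the contracts are heavily simplified stand-ins (the real IExecutor comes from @robota-sdk/agent-core, and the direct path is a stub rather than a real OpenAI SDK call):

```typescript
// Simplified executor contract; illustrative only.
interface IExecutorSketch {
  execute(request: { model: string; messages: string[] }): Promise<string>;
}

// Provider-style delegation: when an executor is injected, all chat
// operations go through it and no API key or client is needed.
class ProviderSketch {
  constructor(private executor?: IExecutorSketch) {}

  async chat(model: string, messages: string[]): Promise<string> {
    if (this.executor) {
      return this.executor.execute({ model, messages }); // delegated path
    }
    return `direct:${model}`; // stand-in for a direct OpenAI API call
  }
}
```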

Custom OpenAI Client

Consumers can pass a pre-configured OpenAI client instance via IOpenAIProviderOptions.client to control SDK configuration (custom base URLs, timeouts, organization settings).

Base URL Override

The baseURL option in IOpenAIProviderOptions allows consumers to point the provider at OpenAI-compatible APIs (e.g., Azure OpenAI, local proxies).
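
As a sketch, resolution might look like this; the options slice below is an assumption (only apiKey and baseURL are shown), and the LM Studio URL reflects its default port of 1234:

```typescript
// Simplified slice of IOpenAIProviderOptions; illustrative only.
interface IProviderOptionsSketch {
  apiKey?: string;
  baseURL?: string;
}

const OPENAI_DEFAULT_BASE_URL = 'https://api.openai.com/v1';

// Resolve the endpoint the provider would actually talk to:
// an explicit baseURL wins, otherwise the hosted OpenAI API is used.
function resolveBaseURL(options: IProviderOptionsSketch): string {
  return options.baseURL ?? OPENAI_DEFAULT_BASE_URL;
}

// Pointing at a local OpenAI-compatible server (LM Studio listens on 1234 by default).
const localOptions: IProviderOptionsSketch = {
  apiKey: 'not-needed-locally',
  baseURL: 'http://localhost:1234/v1',
};
```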

Streaming Assembly for CLI

OpenAIProvider.chat() must honor IChatOptions.onTextDelta. When the callback is provided, chat() uses Chat Completions streaming internally while still returning one complete TUniversalMessage.

Streaming assembly responsibilities:

  • Accumulate delta.content into the final assistant content.
  • Call onTextDelta for every text delta.
  • Accumulate streamed tool_calls by index, preserving id, function name, and partial arguments.
  • Return final toolCalls only after streamed arguments have been assembled.
  • Pass AbortSignal through to the OpenAI SDK request where supported.

The non-streaming Chat Completions path remains supported for callers that do not provide onTextDelta.
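
The tool-call accumulation rule above can be sketched with a simplified delta shape (mirroring the tool_calls entries of Chat Completions stream chunks, where id and name arrive once and arguments arrive as string fragments):

```typescript
// Simplified streamed tool-call delta; illustrative only.
interface IToolCallDelta {
  index: number;
  id?: string;
  function?: { name?: string; arguments?: string };
}

interface IAssembledToolCall { id: string; name: string; arguments: string }

// Accumulate streamed tool_calls by index: set id/name when they appear,
// concatenate argument fragments, and emit in index order once complete.
function assembleToolCalls(deltas: IToolCallDelta[]): IAssembledToolCall[] {
  const byIndex = new Map<number, IAssembledToolCall>();
  for (const d of deltas) {
    const entry = byIndex.get(d.index) ?? { id: '', name: '', arguments: '' };
    if (d.id) entry.id = d.id;
    if (d.function?.name) entry.name = d.function.name;
    if (d.function?.arguments) entry.arguments += d.function.arguments;
    byIndex.set(d.index, entry);
  }
  return [...byIndex.entries()].sort(([a], [b]) => a - b).map(([, call]) => call);
}
```

Returning only after the loop finishes reflects the rule that final toolCalls appear only once all streamed arguments have been assembled.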

Error Taxonomy

This package does not define a custom error class hierarchy. It uses standard Error instances with descriptive messages. Error scenarios:

| Condition | Error message pattern | Source |
| --- | --- | --- |
| Missing client, apiKey, and executor | "Either OpenAI client, apiKey, or executor is required" | provider.ts constructor |
| Missing model in chat options | "Model is required in chat options..." | provider.ts chat/chatStream |
| Client unavailable (no executor) | "OpenAI client not available..." | provider.ts chat/chatStream |
| API call failure | "OpenAI chat failed: <message>" | provider.ts chat |
| Streaming failure | "OpenAI stream failed: <message>" | provider.ts chat/chatStream |
| Response parsing failure | "OpenAI response parsing failed: <message>" | parsers/response-parser.ts |
| Chunk parsing failure | "OpenAI chunk parsing failed: <message>" | parsers/response-parser.ts |
| Stream handler failure | "OpenAI streaming failed: <message>" | streaming/stream-handler.ts |
| Tool message missing toolCallId | "Tool message missing toolCallId: <json>" | adapter.ts |
| Unsupported message role | "Unsupported message role: <role>" | adapter.ts, provider.ts |

Payload loggers (FilePayloadLogger, ConsolePayloadLogger) catch and log their own errors internally without propagating them, ensuring logging failures do not break main functionality.
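
That non-propagating behavior amounts to the following pattern (a generic sketch, not the package's actual implementation):

```typescript
// Run a logging operation so that its failures are reported internally
// but never rethrown into the main chat/stream path.
async function logSafely(
  log: () => Promise<void>,
  onError: (err: unknown) => void
): Promise<void> {
  try {
    await log();
  } catch (err) {
    onError(err); // report (e.g., to an internal ILogger); never propagate
  }
}
```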

Class Contract Registry

Interface Implementations

| Interface | Implementor | Kind | Location |
| --- | --- | --- | --- |
| IPayloadLogger | ConsolePayloadLogger | production | src/loggers/console-payload-logger.ts |
| IPayloadLogger | FilePayloadLogger | production | src/loggers/file-payload-logger.ts |

Inheritance Chains

| Base (Owner) | Derived | Location | Notes |
| --- | --- | --- | --- |
| AbstractAIProvider (agents) | OpenAIProvider | src/provider.ts | Primary provider implementation |

Cross-Package Port Consumers

| Port (Owner) | Adapter | Location |
| --- | --- | --- |
| AbstractAIProvider (agents) | OpenAIProvider | src/provider.ts |
| IProviderDefinition (agent-core) | createOpenAIProviderDefinition | src/provider-definition.ts |
| OpenAI-compatible endpoint probe | createOpenAIProviderDefinition | src/provider-definition.ts |

Test Strategy

Current Test Files

| File | Type | Coverage |
| --- | --- | --- |
| adapter.test.ts | Unit | OpenAIConversationAdapter: all message types, tool call content handling, filtering, complete conversation flow |
| executor-integration.test.ts | Integration | OpenAIProvider with LocalExecutor: chat, streaming, error handling, mixed mode, initialization |
| provider.test.ts | Unit | Direct client path, baseURL construction, non-streaming chat, streaming text/tool-call assembly |

Test Gaps

  • No unit tests for FilePayloadLogger or ConsolePayloadLogger.

Released under the MIT License.