
AxAIOpenAIResponsesImpl

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L62

Type Parameters

TModel
TEmbedModel
TResponsesReq extends AxAIOpenAIResponsesRequest<TModel>

Implements

AxAIServiceImpl

Constructors

Constructor

new AxAIOpenAIResponsesImpl<TModel, TEmbedModel, TResponsesReq>(
   config: Readonly<AxAIOpenAIResponsesConfig<TModel, TEmbedModel>>,
   streamingUsage: boolean,
   responsesReqUpdater?: ResponsesReqUpdater<TModel, TResponsesReq>
): AxAIOpenAIResponsesImpl<TModel, TEmbedModel, TResponsesReq>;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L79

Parameters

config: Readonly<AxAIOpenAIResponsesConfig<TModel, TEmbedModel>>
streamingUsage: boolean
responsesReqUpdater?: ResponsesReqUpdater<TModel, TResponsesReq>

Returns

AxAIOpenAIResponsesImpl<TModel, TEmbedModel, TResponsesReq>
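
Example

A minimal sketch of constructing the impl directly. The import path, the model string literals, and the pre-built config object are assumptions; only the constructor parameters themselves come from the signature above.

```typescript
// Sketch only: assumes these symbols are exported from the package root.
import { AxAIOpenAIResponsesImpl } from '@ax-llm/ax';
import type {
  AxAIOpenAIResponsesConfig,
  AxAIOpenAIResponsesRequest,
} from '@ax-llm/ax';

type Model = 'gpt-4o';                      // hypothetical model id
type EmbedModel = 'text-embedding-3-small'; // hypothetical embed-model id

// Assume a fully populated Responses config exists; its fields are not listed on this page.
declare const config: Readonly<AxAIOpenAIResponsesConfig<Model, EmbedModel>>;

// streamingUsage = true presumably asks the impl to report token usage on streamed
// responses; the optional responsesReqUpdater can adjust the outgoing request.
const impl = new AxAIOpenAIResponsesImpl<
  Model,
  EmbedModel,
  AxAIOpenAIResponsesRequest<Model>
>(config, true);
```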

Methods

createChatReq()

createChatReq(req: Readonly<AxInternalChatRequest<TModel>>, config: Readonly<AxAIServiceOptions>): [Readonly<AxAPI>, Readonly<AxAIOpenAIResponsesRequest<TModel>>];

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L263

Parameters

req: Readonly<AxInternalChatRequest<TModel>>
config: Readonly<AxAIServiceOptions>

Returns

[Readonly<AxAPI>, Readonly<AxAIOpenAIResponsesRequest<TModel>>]

Implementation of

AxAIServiceImpl.createChatReq
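
Example

A hedged sketch of turning an internal chat request and service options into the Responses API call: the returned tuple pairs the endpoint descriptor (AxAPI) with the provider-specific request body. The impl instance and inputs are assumed to be supplied by the surrounding service.

```typescript
import type {
  AxAIOpenAIResponsesImpl,
  AxAIOpenAIResponsesRequest,
  AxAIServiceOptions,
  AxInternalChatRequest,
} from '@ax-llm/ax';

type Model = 'gpt-4o'; // hypothetical model id

// Assume these are provided by the calling AI service.
declare const impl: AxAIOpenAIResponsesImpl<Model, string, AxAIOpenAIResponsesRequest<Model>>;
declare const req: Readonly<AxInternalChatRequest<Model>>;
declare const options: Readonly<AxAIServiceOptions>;

// api describes the HTTP endpoint to call; body is the JSON payload for the Responses API.
const [api, body] = impl.createChatReq(req, options);
```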


createChatResp()

createChatResp(resp: Readonly<AxAIOpenAIResponsesResponse>): Readonly<AxChatResponse>;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L490

Parameters

resp: Readonly<AxAIOpenAIResponsesResponse>

Returns

Readonly<AxChatResponse>

Implementation of

AxAIServiceImpl.createChatResp
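
Example

A sketch of normalizing a complete (non-streaming) Responses API payload into the provider-agnostic AxChatResponse. Where the raw response comes from is assumed.

```typescript
import type {
  AxAIOpenAIResponsesImpl,
  AxAIOpenAIResponsesRequest,
  AxAIOpenAIResponsesResponse,
  AxChatResponse,
} from '@ax-llm/ax';

type Model = 'gpt-4o'; // hypothetical model id

declare const impl: AxAIOpenAIResponsesImpl<Model, string, AxAIOpenAIResponsesRequest<Model>>;
declare const raw: Readonly<AxAIOpenAIResponsesResponse>; // parsed HTTP response body

// Maps the OpenAI-specific payload onto the shared AxChatResponse shape.
const chatResp: Readonly<AxChatResponse> = impl.createChatResp(raw);
```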


createChatStreamResp()

createChatStreamResp(streamEvent: Readonly<AxAIOpenAIResponsesResponseDelta>): Readonly<AxChatResponse>;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L670

Parameters

streamEvent: Readonly<AxAIOpenAIResponsesResponseDelta>

Returns

Readonly<AxChatResponse>

Implementation of

AxAIServiceImpl.createChatStreamResp
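
Example

A sketch of consuming a stream: each delta event from the Responses API stream is mapped to an incremental AxChatResponse. The async event source is an assumption.

```typescript
import type {
  AxAIOpenAIResponsesImpl,
  AxAIOpenAIResponsesRequest,
  AxAIOpenAIResponsesResponseDelta,
  AxChatResponse,
} from '@ax-llm/ax';

type Model = 'gpt-4o'; // hypothetical model id

declare const impl: AxAIOpenAIResponsesImpl<Model, string, AxAIOpenAIResponsesRequest<Model>>;
declare const events: AsyncIterable<Readonly<AxAIOpenAIResponsesResponseDelta>>;

// Convert each streamed server event into an incremental chat response chunk.
async function collectChunks(): Promise<Readonly<AxChatResponse>[]> {
  const chunks: Readonly<AxChatResponse>[] = [];
  for await (const ev of events) {
    chunks.push(impl.createChatStreamResp(ev));
  }
  return chunks;
}
```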


createEmbedReq()

createEmbedReq(req: Readonly<AxInternalEmbedRequest<TEmbedModel>>): [AxAPI, AxAIOpenAIEmbedRequest<TEmbedModel>];

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L1097

Parameters

req: Readonly<AxInternalEmbedRequest<TEmbedModel>>

Returns

[AxAPI, AxAIOpenAIEmbedRequest<TEmbedModel>]

Implementation of

AxAIServiceImpl.createEmbedReq
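
Example

A sketch of building an embeddings call; as with createChatReq, the result is an endpoint descriptor plus the OpenAI embed request body. The inputs are assumed.

```typescript
import type {
  AxAIOpenAIResponsesImpl,
  AxAIOpenAIResponsesRequest,
  AxInternalEmbedRequest,
} from '@ax-llm/ax';

type Model = 'gpt-4o';                      // hypothetical model id
type EmbedModel = 'text-embedding-3-small'; // hypothetical embed-model id

declare const impl: AxAIOpenAIResponsesImpl<Model, EmbedModel, AxAIOpenAIResponsesRequest<Model>>;
declare const embedReq: Readonly<AxInternalEmbedRequest<EmbedModel>>;

// embedApi describes the embeddings endpoint; embedBody is the request payload.
const [embedApi, embedBody] = impl.createEmbedReq(embedReq);
```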


getModelConfig()

getModelConfig(): Readonly<AxModelConfig>;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L94

Returns

Readonly<AxModelConfig>

Implementation of

AxAIServiceImpl.getModelConfig
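
Example

A trivial sketch of reading the model configuration, which is presumably derived from the config passed to the constructor.

```typescript
import type {
  AxAIOpenAIResponsesImpl,
  AxAIOpenAIResponsesRequest,
  AxModelConfig,
} from '@ax-llm/ax';

declare const impl: AxAIOpenAIResponsesImpl<'gpt-4o', string, AxAIOpenAIResponsesRequest<'gpt-4o'>>;

// Read-only view of the generation settings for this impl.
const modelConfig: Readonly<AxModelConfig> = impl.getModelConfig();
```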


getTokenUsage()

getTokenUsage(): undefined | Readonly<AxTokenUsage>;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_api.ts#L90

Returns

undefined | Readonly<AxTokenUsage>

Implementation of

AxAIServiceImpl.getTokenUsage
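
Example

Because the return type is undefined | Readonly<AxTokenUsage>, callers should guard before using it; presumably usage is only available after a response has been processed.

```typescript
import type {
  AxAIOpenAIResponsesImpl,
  AxAIOpenAIResponsesRequest,
} from '@ax-llm/ax';

declare const impl: AxAIOpenAIResponsesImpl<'gpt-4o', string, AxAIOpenAIResponsesRequest<'gpt-4o'>>;

// undefined until usage has been captured (assumption); otherwise a Readonly<AxTokenUsage>.
const usage = impl.getTokenUsage();
if (usage) {
  console.log(usage);
}
```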