# AxAIServiceImpl

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L657
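The generic slots below are filled in by each provider adapter. As a rough orientation, a minimal implementation of the required (non-optional) methods might look like the following sketch; all wire-format shapes and the `MyServiceImpl` name are assumptions for illustration, not Ax's real types (those live in `src/ax/ai/types.ts`):

```typescript
// Hypothetical stand-ins for the Ax types referenced by this interface;
// the real definitions live in src/ax/ai/types.ts.
type AxAPI = { name: string };
type AxChatResponse = { results: { index: number; content: string }[] };
type AxModelConfig = { maxTokens?: number; temperature?: number };
type AxTokenUsage = { promptTokens: number; completionTokens: number };

// Assumed wire formats for an imaginary provider; these fill the
// TChatRequest / TChatResponse generic slots.
interface MyChatRequest { model: string; messages: { role: string; content: string }[] }
interface MyChatResponse { choices: { text: string }[] }

// A sketch of the required methods only.
class MyServiceImpl {
  createChatReq(
    req: Readonly<{ model: string; chatPrompt: { role: string; content: string }[] }>
  ): [AxAPI, MyChatRequest] {
    // Map the internal Ax request onto the provider's wire format.
    return [
      { name: '/v1/chat' },
      { model: req.model, messages: req.chatPrompt.map((m) => ({ ...m })) },
    ];
  }

  createChatResp(resp: Readonly<MyChatResponse>): AxChatResponse {
    // Map the provider's response back into Ax's normalized shape.
    return { results: resp.choices.map((c, index) => ({ index, content: c.text })) };
  }

  getModelConfig(): AxModelConfig {
    return { maxTokens: 1024, temperature: 0 };
  }

  getTokenUsage(): undefined | AxTokenUsage {
    // Return undefined when the provider reports no usage data.
    return undefined;
  }
}
```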

## Type Parameters

| Type Parameter |
| :------ |
| `TModel` |
| `TEmbedModel` |
| `TChatRequest` |
| `TEmbedRequest` |
| `TChatResponse` |
| `TChatResponseDelta` |
| `TEmbedResponse` |

## Methods

### buildCacheCreateOp()?

```ts
optional buildCacheCreateOp(
  req: Readonly<AxInternalChatRequest<TModel>>,
  options: Readonly<AxAIServiceOptions>
): undefined | AxContextCacheOperation;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L703

Optional: Build a context cache creation operation. Called when a new cache needs to be created from the request.

Parameters

ParameterType
reqReadonly<AxInternalChatRequest<TModel>>
optionsReadonly<AxAIServiceOptions>

Returns

| undefined | AxContextCacheOperation


### buildCacheDeleteOp()?

```ts
optional buildCacheDeleteOp(cacheName: string): AxContextCacheOperation;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L719

Optional: Build a context cache deletion operation.

Parameters

ParameterType
cacheNamestring

Returns

AxContextCacheOperation


### buildCacheUpdateTTLOp()?

```ts
optional buildCacheUpdateTTLOp(cacheName: string, ttlSeconds: number): AxContextCacheOperation;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L711

Optional: Build a context cache TTL update operation.

Parameters

ParameterType
cacheNamestring
ttlSecondsnumber

Returns

AxContextCacheOperation
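The three cache-op builders above all produce an `AxContextCacheOperation`. A minimal sketch of how a provider might implement the delete and TTL-update builders, assuming a hypothetical discriminated-union shape for the operation (the real type is defined alongside this interface in `src/ax/ai/types.ts`):

```typescript
// Assumed stand-in for AxContextCacheOperation; the real shape is
// defined in src/ax/ai/types.ts.
type AxContextCacheOperation =
  | { kind: 'create'; payload: unknown }
  | { kind: 'updateTTL'; cacheName: string; ttlSeconds: number }
  | { kind: 'delete'; cacheName: string };

// Sketch: describe a TTL extension for an existing cache resource.
function buildCacheUpdateTTLOp(cacheName: string, ttlSeconds: number): AxContextCacheOperation {
  return { kind: 'updateTTL', cacheName, ttlSeconds };
}

// Sketch: describe deletion of an existing cache resource.
function buildCacheDeleteOp(cacheName: string): AxContextCacheOperation {
  return { kind: 'delete', cacheName };
}
```

Returning a description of the operation, rather than performing it, lets the caller decide when (and whether) to execute it against the provider's API.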


### createChatReq()

```ts
createChatReq(
  req: Readonly<AxInternalChatRequest<TModel>>,
  config?: Readonly<AxAIServiceOptions>
): [AxAPI, TChatRequest] | Promise<[AxAPI, TChatRequest]>;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L666

Parameters

ParameterType
reqReadonly<AxInternalChatRequest<TModel>>
config?Readonly<AxAIServiceOptions>

Returns

| [AxAPI, TChatRequest] | Promise<[AxAPI, TChatRequest]>


### createChatResp()

```ts
createChatResp(resp: Readonly<TChatResponse>): AxChatResponse;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L671

Parameters

ParameterType
respReadonly<TChatResponse>

Returns

AxChatResponse


### createChatStreamResp()?

```ts
optional createChatStreamResp(resp: Readonly<TChatResponseDelta>, state: object): AxChatResponse;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L673

Parameters

ParameterType
respReadonly<TChatResponseDelta>
stateobject

Returns

AxChatResponse
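Because a streaming response arrives as a sequence of deltas, the `state` object gives an implementation somewhere to accumulate partial output across calls. A sketch, assuming a hypothetical delta shape and state layout (both are illustrations, not Ax's real types):

```typescript
type AxChatResponse = { results: { index: number; content: string }[] };

// Assumed provider delta shape and mutable per-stream state.
interface MyChatDelta { index: number; textDelta: string }
interface StreamState { buffers: Map<number, string> }

function createChatStreamResp(resp: Readonly<MyChatDelta>, state: StreamState): AxChatResponse {
  // Accumulate the full text per result index across chunks.
  const soFar = (state.buffers.get(resp.index) ?? '') + resp.textDelta;
  state.buffers.set(resp.index, soFar);
  // Surface only the new delta; callers assemble the final text.
  return { results: [{ index: resp.index, content: resp.textDelta }] };
}
```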


### createEmbedReq()?

```ts
optional createEmbedReq(
  req: Readonly<AxInternalEmbedRequest<TEmbedModel>>
): [AxAPI, TEmbedRequest] | Promise<[AxAPI, TEmbedRequest]>;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L678

Parameters

ParameterType
reqReadonly<AxInternalEmbedRequest<TEmbedModel>>

Returns

| [AxAPI, TEmbedRequest] | Promise<[AxAPI, TEmbedRequest]>


### createEmbedResp()?

```ts
optional createEmbedResp(resp: Readonly<TEmbedResponse>): AxEmbedResponse;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L682

Parameters

ParameterType
respReadonly<TEmbedResponse>

Returns

AxEmbedResponse
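The embed pair mirrors the chat pair: `createEmbedReq` maps the internal request onto the provider's wire format, and `createEmbedResp` normalizes the result. A sketch with assumed wire shapes (the simplified request fields and `AxEmbedResponse` layout here are illustrations, not Ax's real types):

```typescript
type AxAPI = { name: string };
type AxEmbedResponse = { embeddings: number[][] };

// Assumed provider wire formats for embeddings.
interface MyEmbedRequest { model: string; input: string[] }
interface MyEmbedResponse { data: { vector: number[] }[] }

function createEmbedReq(
  req: Readonly<{ embedModel: string; texts: readonly string[] }>
): [AxAPI, MyEmbedRequest] {
  // Map the internal embed request to the provider endpoint and payload.
  return [{ name: '/v1/embeddings' }, { model: req.embedModel, input: [...req.texts] }];
}

function createEmbedResp(resp: Readonly<MyEmbedResponse>): AxEmbedResponse {
  // Normalize the provider's vectors into a plain array-of-arrays.
  return { embeddings: resp.data.map((d) => d.vector) };
}
```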


### getModelConfig()

```ts
getModelConfig(): AxModelConfig;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L684

#### Returns

`AxModelConfig`


### getTokenUsage()

```ts
getTokenUsage(): undefined | AxTokenUsage;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L686

#### Returns

`undefined | AxTokenUsage`


### prepareCachedChatReq()?

```ts
optional prepareCachedChatReq(
  req: Readonly<AxInternalChatRequest<TModel>>,
  options: Readonly<AxAIServiceOptions>,
  existingCacheName?: string
): Promise<AxPreparedChatRequest<TChatRequest>>;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L693

Optional: Prepare a chat request with context cache support. Providers implement this to support explicit context caching. Returns cache operations to execute and the modified request.

Parameters

ParameterType
reqReadonly<AxInternalChatRequest<TModel>>
optionsReadonly<AxAIServiceOptions>
existingCacheName?string

Returns

Promise<AxPreparedChatRequest<TChatRequest>>
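A sketch of the flow, assuming hypothetical shapes for the prepared request and the cache operation (the real `AxPreparedChatRequest` lives next to this interface; the `options` parameter is omitted here for brevity): on the first call a create operation is returned alongside the request, and when `existingCacheName` is supplied the request simply references the existing cache.

```typescript
// Hypothetical stand-ins; real definitions are in src/ax/ai/types.ts.
type AxContextCacheOperation = { kind: 'create'; content: string };
interface MyChatRequest { model: string; cachedContent?: string; prompt: string }
interface PreparedChatRequest {
  request: MyChatRequest;
  cacheOps: AxContextCacheOperation[];
}

async function prepareCachedChatReq(
  req: Readonly<{ model: string; systemPrompt: string; prompt: string }>,
  existingCacheName?: string
): Promise<PreparedChatRequest> {
  if (existingCacheName) {
    // Reuse: reference the existing cache; no operations to execute.
    return {
      request: { model: req.model, cachedContent: existingCacheName, prompt: req.prompt },
      cacheOps: [],
    };
  }
  // First call: emit a create operation for the cacheable prefix.
  return {
    request: { model: req.model, prompt: req.prompt },
    cacheOps: [{ kind: 'create', content: req.systemPrompt }],
  };
}
```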


### supportsContextCache()?

```ts
optional supportsContextCache(model: TModel): boolean;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L725

Optional: Check if explicit context caching is supported (e.g., Gemini). Explicit caching creates a separate cache resource with an ID.

Parameters

ParameterType
modelTModel

Returns

boolean


### supportsImplicitCaching()?

```ts
optional supportsImplicitCaching(model: TModel): boolean;
```

Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L731

Optional: Check if implicit context caching is supported (e.g., Anthropic). Implicit caching marks content in the request; provider handles caching automatically.

Parameters

ParameterType
modelTModel

Returns

boolean
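Together, these two capability checks let callers pick a caching strategy per model: explicit (a separate cache resource with an ID) versus implicit (content marked in the request). A sketch, assuming a plain string-union model type and made-up model names:

```typescript
// Assumed model identifiers for illustration only.
type MyModel = 'gemini-1.5-pro' | 'claude-3-5-sonnet' | 'basic-model';

// Explicit caching: a separate cache resource with its own ID (Gemini-style).
function supportsContextCache(model: MyModel): boolean {
  return model.startsWith('gemini');
}

// Implicit caching: content is marked in the request itself and the
// provider handles caching automatically (Anthropic-style).
function supportsImplicitCaching(model: MyModel): boolean {
  return model.startsWith('claude');
}
```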