# AxAIServiceImpl
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L657
## Type Parameters

| Type Parameter |
|---|
| `TModel` |
| `TEmbedModel` |
| `TChatRequest` |
| `TEmbedRequest` |
| `TChatResponse` |
| `TChatResponseDelta` |
| `TEmbedResponse` |
## Methods
### buildCacheCreateOp()?

```ts
optional buildCacheCreateOp(
  req: Readonly<AxInternalChatRequest<TModel>>,
  options: Readonly<AxAIServiceOptions>
): undefined | AxContextCacheOperation;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L703
Optional: Build a context cache creation operation. Called when a new cache needs to be created from the request.
#### Parameters

| Parameter | Type |
|---|---|
| `req` | `Readonly<AxInternalChatRequest<TModel>>` |
| `options` | `Readonly<AxAIServiceOptions>` |

#### Returns

`undefined` \| `AxContextCacheOperation`
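To make the hook concrete, here is a minimal sketch of how a provider implementation might decide whether to emit a create operation. The `CacheOp` and `ChatRequestLike` shapes below are simplified stand-ins for illustration only, not the library's actual `AxContextCacheOperation` and `AxInternalChatRequest<TModel>` types.

```typescript
// Simplified stand-in types for illustration; the real shapes live in src/ax/ai/types.ts.
type CacheOp = { kind: "create"; ttlSeconds: number; contentKey: string };

interface ChatRequestLike {
  model: string;
  chatPrompt: { role: string; content: string }[];
}

// Sketch: only build a create op when the prompt is large enough to be worth caching;
// returning undefined means "no cache needs to be created for this request".
function buildCacheCreateOp(req: ChatRequestLike, minChars = 1024): CacheOp | undefined {
  const size = req.chatPrompt.reduce((n, m) => n + m.content.length, 0);
  if (size < minChars) return undefined; // small prompts: caching is not worthwhile
  return { kind: "create", ttlSeconds: 3600, contentKey: `${req.model}:${size}` };
}
```

The threshold and key scheme are arbitrary placeholders; a real provider would derive them from its own caching rules.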
### buildCacheDeleteOp()?

```ts
optional buildCacheDeleteOp(cacheName: string): AxContextCacheOperation;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L719
Optional: Build a context cache deletion operation.
#### Parameters

| Parameter | Type |
|---|---|
| `cacheName` | `string` |

#### Returns

`AxContextCacheOperation`
### buildCacheUpdateTTLOp()?

```ts
optional buildCacheUpdateTTLOp(cacheName: string, ttlSeconds: number): AxContextCacheOperation;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L711
Optional: Build a context cache TTL update operation.
Parameters
| Parameter | Type |
|---|---|
cacheName | string |
ttlSeconds | number |
Returns
### createChatReq()

```ts
createChatReq(
  req: Readonly<AxInternalChatRequest<TModel>>,
  config?: Readonly<AxAIServiceOptions>
): [AxAPI, TChatRequest] | Promise<[AxAPI, TChatRequest]>;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L666
#### Parameters

| Parameter | Type |
|---|---|
| `req` | `Readonly<AxInternalChatRequest<TModel>>` |
| `config?` | `Readonly<AxAIServiceOptions>` |

#### Returns

`[AxAPI, TChatRequest]` \| `Promise<[AxAPI, TChatRequest]>`
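As an illustration, a provider's `createChatReq` typically pairs an endpoint descriptor with the request body translated into the provider's wire format. The types, field names, and endpoint path below are simplified assumptions for the sketch, not the real `AxAPI` or `TChatRequest` shapes.

```typescript
// Simplified stand-in types; the real AxAPI/AxInternalChatRequest types are richer.
type Api = { name: string; headers: Record<string, string> };
type InternalChatRequest = { model: string; chatPrompt: { role: string; content: string }[] };
type WireChatRequest = { model: string; messages: { role: string; content: string }[] };

// Sketch: return the endpoint descriptor plus the provider-shaped body.
function createChatReq(req: InternalChatRequest): [Api, WireChatRequest] {
  const api: Api = { name: "/v1/chat", headers: { "content-type": "application/json" } };
  const body: WireChatRequest = {
    model: req.model,
    // Map the framework's internal prompt format onto the provider's message format.
    messages: req.chatPrompt.map((m) => ({ role: m.role, content: m.content })),
  };
  return [api, body];
}
```

The union return type in the interface also permits returning a `Promise`, which matters for providers that must do async work (e.g. token counting) before the request can be built.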
### createChatResp()

```ts
createChatResp(resp: Readonly<TChatResponse>): AxChatResponse;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L671
#### Parameters

| Parameter | Type |
|---|---|
| `resp` | `Readonly<TChatResponse>` |

#### Returns

`AxChatResponse`
### createChatStreamResp()?

```ts
optional createChatStreamResp(resp: Readonly<TChatResponseDelta>, state: object): AxChatResponse;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L673
#### Parameters

| Parameter | Type |
|---|---|
| `resp` | `Readonly<TChatResponseDelta>` |
| `state` | `object` |

#### Returns

`AxChatResponse`
### createEmbedReq()?

```ts
optional createEmbedReq(req: Readonly<AxInternalEmbedRequest<TEmbedModel>>):
  | [AxAPI, TEmbedRequest]
  | Promise<[AxAPI, TEmbedRequest]>;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L678
#### Parameters

| Parameter | Type |
|---|---|
| `req` | `Readonly<AxInternalEmbedRequest<TEmbedModel>>` |

#### Returns

`[AxAPI, TEmbedRequest]` \| `Promise<[AxAPI, TEmbedRequest]>`
### createEmbedResp()?

```ts
optional createEmbedResp(resp: Readonly<TEmbedResponse>): AxEmbedResponse;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L682
#### Parameters

| Parameter | Type |
|---|---|
| `resp` | `Readonly<TEmbedResponse>` |

#### Returns

`AxEmbedResponse`
### getModelConfig()

```ts
getModelConfig(): AxModelConfig;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L684
#### Returns

`AxModelConfig`
### getTokenUsage()

```ts
getTokenUsage(): undefined | AxTokenUsage;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L686
#### Returns

`undefined` \| `AxTokenUsage`
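A sketch of the usual pattern behind `getTokenUsage`: the implementation records usage while parsing the most recent response and returns `undefined` before any response has been seen. The `TokenUsage` field names and the raw-response shape here are assumptions standing in for `AxTokenUsage` and a provider payload.

```typescript
// Stand-in for AxTokenUsage; field names are assumptions for illustration.
type TokenUsage = { promptTokens: number; completionTokens: number; totalTokens: number };

// Sketch: a stateful impl records usage as it parses each response,
// then surfaces the latest figures through getTokenUsage().
class UsageTracker {
  private last?: TokenUsage;

  recordFromResponse(raw: { prompt_tokens: number; completion_tokens: number }): void {
    this.last = {
      promptTokens: raw.prompt_tokens,
      completionTokens: raw.completion_tokens,
      totalTokens: raw.prompt_tokens + raw.completion_tokens,
    };
  }

  // Mirrors getTokenUsage(): undefined until a response has been parsed.
  getTokenUsage(): TokenUsage | undefined {
    return this.last;
  }
}
```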
### prepareCachedChatReq()?

```ts
optional prepareCachedChatReq(
  req: Readonly<AxInternalChatRequest<TModel>>,
  options: Readonly<AxAIServiceOptions>,
  existingCacheName?: string
): Promise<AxPreparedChatRequest<TChatRequest>>;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L693
Optional: Prepare a chat request with context cache support. Providers implement this to support explicit context caching. Returns cache operations to execute and the modified request.
#### Parameters

| Parameter | Type |
|---|---|
| `req` | `Readonly<AxInternalChatRequest<TModel>>` |
| `options` | `Readonly<AxAIServiceOptions>` |
| `existingCacheName?` | `string` |

#### Returns

`Promise<AxPreparedChatRequest<TChatRequest>>`
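A rough sketch of the reuse-or-create flow this method implies: when `existingCacheName` is supplied, the request reuses that cache and no cache operations need to run; otherwise a create operation is scheduled alongside the modified request. The `PreparedChatRequest` shape and cache-naming scheme below are placeholders, not the real `AxPreparedChatRequest<TChatRequest>`.

```typescript
// Placeholder shape: cache operations to execute plus the (possibly modified) request.
type PreparedChatRequest = { cacheName: string; ops: string[]; request: { prompt: string } };

// Sketch: reuse an existing cache when its name is supplied, else schedule a create op.
async function prepareCachedChatReq(
  prompt: string,
  existingCacheName?: string
): Promise<PreparedChatRequest> {
  if (existingCacheName) {
    // Cache already exists: no operations needed, just reference it from the request.
    return { cacheName: existingCacheName, ops: [], request: { prompt } };
  }
  const cacheName = `cache-${prompt.length}`; // placeholder naming scheme
  return { cacheName, ops: ["create"], request: { prompt } };
}
```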
### supportsContextCache()?

```ts
optional supportsContextCache(model: TModel): boolean;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L725
Optional: Check if explicit context caching is supported (e.g., Gemini). Explicit caching creates a separate cache resource with an ID.
#### Parameters

| Parameter | Type |
|---|---|
| `model` | `TModel` |

#### Returns

`boolean`
### supportsImplicitCaching()?

```ts
optional supportsImplicitCaching(model: TModel): boolean;
```
Defined in: https://github.com/ax-llm/ax/blob/a8847bd2906efff202fde10d776fddd20fd2ff57/src/ax/ai/types.ts#L731
Optional: Check if implicit context caching is supported (e.g., Anthropic). Implicit caching marks content in the request; the provider handles caching automatically.
#### Parameters

| Parameter | Type |
|---|---|
| `model` | `TModel` |

#### Returns

`boolean`
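Both capability checks are typically simple model allowlists, as in this sketch (the model IDs are examples only, not an actual support matrix for any provider):

```typescript
// Assumption: example model IDs, chosen only to illustrate the allowlist pattern.
const EXPLICIT_CACHE_MODELS = new Set(["gemini-1.5-pro"]);
const IMPLICIT_CACHE_MODELS = new Set(["claude-3-5-sonnet"]);

// Explicit caching: a separate cache resource with an ID (Gemini-style).
function supportsContextCache(model: string): boolean {
  return EXPLICIT_CACHE_MODELS.has(model);
}

// Implicit caching: content marked in the request, cached by the provider (Anthropic-style).
function supportsImplicitCaching(model: string): boolean {
  return IMPLICIT_CACHE_MODELS.has(model);
}
```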