AxAIWebLLMChatResponse

type AxAIWebLLMChatResponse = object;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L86

WebLLM: Chat response structure
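
For orientation, the sketch below consolidates the shape implied by the property listing that follows. It is not the library's source: the alias names and the string placeholder for AxAIWebLLMModel are ours, and the element types of tool_calls and logprobs.content are not expanded on this page, so they are left as object[].

// Consolidated sketch of the shape documented below (not the library's source).
// AxAIWebLLMModel is documented separately; a string placeholder stands in here.
type AxAIWebLLMModelPlaceholder = string;

type AxAIWebLLMChatResponseSketch = {
  id: string;
  object: "chat.completion";
  created: number;
  model: AxAIWebLLMModelPlaceholder;
  choices: {
    index: number;
    message: {
      role: "assistant";
      content?: string;
      tool_calls?: object[];
    };
    logprobs?: {
      content: object[];
    };
    finish_reason: "stop" | "length" | "tool_calls" | "content_filter";
  }[];
  usage: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
};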

Properties

choices

choices: object[];

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L91

finish_reason

finish_reason: "stop" | "length" | "tool_calls" | "content_filter";

index

index: number;

logprobs?

logprobs?: {
  content: object[];
};

message

message: {
  content?: string;
  role: "assistant";
  tool_calls?: object[];
};
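
As a usage sketch (not taken from the library's own examples), a consumer might branch on finish_reason and read the message like this. The helper name is ours, and the import path assumes the type is re-exported from the package root; adjust it if that is not the case.

import type { AxAIWebLLMChatResponse } from "@ax-llm/ax";

// Hypothetical helper: summarize the first choice of a response.
function readFirstChoice(response: AxAIWebLLMChatResponse): string {
  const choice = response.choices[0];
  if (!choice) {
    throw new Error("response contained no choices");
  }
  switch (choice.finish_reason) {
    case "stop":
    case "length":
      // Plain completion; content may be absent when only tool calls were produced.
      return choice.message.content ?? "";
    case "tool_calls":
      // The model requested tool invocations; inspect message.tool_calls.
      return `requested ${choice.message.tool_calls?.length ?? 0} tool call(s)`;
    case "content_filter":
      return "response was filtered";
    default:
      return "";
  }
}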

created

created: number;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L89


id

id: string;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L87


model

model: AxAIWebLLMModel;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L90


object

object: "chat.completion";

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L88


usage

usage: object;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L119

completion_tokens

completion_tokens: number;

prompt_tokens

prompt_tokens: number;

total_tokens

total_tokens: number;
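
The usage object follows the familiar prompt/completion/total token accounting. A small helper for logging it could look like the sketch below; the helper name is illustrative and not part of the library, while the field names come directly from the usage object documented above.

// Hypothetical logging helper for the usage block.
function logTokenUsage(usage: {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
}): void {
  console.log(
    `prompt=${usage.prompt_tokens} completion=${usage.completion_tokens} total=${usage.total_tokens}`
  );
}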