
AxAIWebLLMChatResponseDelta

type AxAIWebLLMChatResponseDelta = object;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L129

WebLLM: streaming chat response chunk (delta) structure returned during a streamed completion
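For orientation, the sketch below consumes a stream of these deltas and accumulates the generated text. The ChatResponseDelta alias is a local structural mirror of the fields documented on this page (with model widened to string), and the AsyncIterable parameter is a stand-in for however the stream is actually obtained; both are illustrative assumptions, not part of the Ax API.

// Local structural mirror of AxAIWebLLMChatResponseDelta, built from the
// fields documented on this page (the real type lives in
// src/ax/ai/webllm/types.ts; `model` is widened to string here).
type ChatResponseDelta = {
  id: string;
  object: "chat.completion.chunk";
  created: number;
  model: string;
  choices: {
    index: number;
    delta: { content?: string; role?: "assistant"; tool_calls?: object[] };
    finish_reason?: "stop" | "length" | "tool_calls" | "content_filter";
    logprobs?: { content: object[] };
  }[];
  usage?: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
};

// Accumulate the streamed text of the first choice. `chunks` is any async
// iterable of deltas; how it is produced is up to the caller.
async function collectText(
  chunks: AsyncIterable<ChatResponseDelta>
): Promise<string> {
  let text = "";
  for await (const chunk of chunks) {
    const choice = chunk.choices[0];
    if (choice?.delta.content) {
      text += choice.delta.content;
    }
    if (choice?.finish_reason === "stop") {
      break;
    }
  }
  return text;
}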

Properties

choices

choices: object[];

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L134

delta

delta: {
  content?: string;
  role?: "assistant";
  tool_calls?: object[];
};

finish_reason?

optional finish_reason: "stop" | "length" | "tool_calls" | "content_filter";

index

index: number;

logprobs?

optional logprobs: {
  content: object[];
};
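The tool_calls entries inside delta are typed only as object[] on this page. The sketch below assumes the common OpenAI-compatible fragment shape (an index, an optional id, and partial function name/arguments) and shows one way to merge fragments streamed across several deltas into complete calls; the ToolCallFragment type and mergeToolCalls helper are hypothetical, not part of the library.

// ASSUMED fragment shape for the untyped `tool_calls` entries; adjust to the
// real WebLLM payload if it differs.
type ToolCallFragment = {
  index: number;
  id?: string;
  function?: { name?: string; arguments?: string };
};

// Merge streamed tool-call fragments (one array per delta) into complete calls,
// keyed by each fragment's `index` and concatenating partial arguments.
function mergeToolCalls(fragments: ToolCallFragment[][]): ToolCallFragment[] {
  const merged: ToolCallFragment[] = [];
  for (const batch of fragments) {
    for (const frag of batch) {
      const existing = merged[frag.index] ?? { index: frag.index, function: {} };
      merged[frag.index] = {
        index: frag.index,
        id: frag.id ?? existing.id,
        function: {
          name: frag.function?.name ?? existing.function?.name,
          arguments:
            (existing.function?.arguments ?? "") +
            (frag.function?.arguments ?? ""),
        },
      };
    }
  }
  return merged.filter(Boolean);
}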

created

created: number;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L132


id

id: string;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L130


model

model: AxAIWebLLMModel;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L133


object

object: "chat.completion.chunk";

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L131


usage?

optional usage: object;

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/webllm/types.ts#L163

completion_tokens

completion_tokens: number;

prompt_tokens

prompt_tokens: number;

total_tokens

total_tokens: number;
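
usage is optional and, depending on the backend and request options, may only appear on the final chunk of a stream (an assumption; this page documents only the field names). A small sketch that keeps the last usage block seen while draining a stream:

// Token counts as documented above.
type Usage = {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
};

// Return the last usage block reported across the collected chunks, if any;
// intermediate chunks typically omit it (assumption).
function lastUsage(chunks: { usage?: Usage }[]): Usage | undefined {
  let result: Usage | undefined;
  for (const chunk of chunks) {
    if (chunk.usage) result = chunk.usage;
  }
  return result;
}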