Documentation

Build LLM-powered agents
with production-ready TypeScript

DSPy for TypeScript. Working with LLMs is hard: they don't always do what you want. DSPy makes it easier to build reliable systems on top of them. Define your inputs and outputs (a signature), and an efficient prompt is generated and used automatically. Compose signatures to build complex LLM-powered systems and workflows.

15+ LLM Providers
End-to-end Streaming
Auto Prompt Tuning

ai

function ai<T>(options: T): AxAI<InferTModelKey<T>>;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/ai/wrap.ts#L192

Factory function for creating AI service instances with full type safety.

This is the recommended way to create AI instances. It automatically selects the appropriate provider implementation based on the name field and provides type-safe access to provider-specific models.
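To illustrate how this kind of type-safe factory can work, here is a minimal, self-contained sketch. It is not the actual ax implementation; `makeAI`, `FakeAI`, and `InferModelKey` are illustrative names that mirror the shape of `ai()` and `InferTModelKey`: the `key` literals in the `models` array are inferred so that later calls accept only those aliases.

```typescript
// Illustrative sketch (NOT the actual ax implementation) of inferring
// model-alias keys from the options object passed to a factory.

type ModelEntry = { key: string; model: string };

// Pull the literal `key` types out of a `models` array, if present;
// fall back to plain string when no aliases are declared.
type InferModelKey<T> = T extends { models: readonly (infer M)[] }
  ? M extends { key: infer K extends string } ? K : never
  : string;

interface FakeAI<TKey extends string> {
  name: string;
  resolveModel(key: TKey): string;
}

// Hypothetical factory mirroring ai()'s shape, for illustration only.
// The `const` type parameter (TS 5.0+) preserves the literal key types.
function makeAI<const T extends { name: string; models?: readonly ModelEntry[] }>(
  options: T
): FakeAI<InferModelKey<T>> {
  const byKey = new Map(
    (options.models ?? []).map((m) => [m.key, m.model] as [string, string])
  );
  return {
    name: options.name,
    // Resolve an alias to its concrete model id; pass through unknown keys.
    resolveModel: (key) => byKey.get(key) ?? key,
  };
}

const demo = makeAI({
  name: 'google-gemini',
  models: [
    { key: 'fast', model: 'gemini-2.0-flash' },
    { key: 'smart', model: 'gemini-1.5-pro' },
  ],
});

console.log(demo.resolveModel('fast')); // 'gemini-2.0-flash'
```

With this pattern, `demo.resolveModel('slow')` is a compile-time error, because `'slow'` is not one of the inferred keys; the real `ai()` applies the same idea to its `model` request field.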

Supported Providers:

openai, anthropic, google-gemini, ollama, and others (15+ providers in total).

Type Parameters

Type Parameter
T extends AxAIArgs<any>

Parameters

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| options | T | Provider-specific configuration. Must include name to identify the provider. |

Returns

AxAI<InferTModelKey<T>>

A configured AI service instance ready for chat completions and embeddings

Examples

// Basic OpenAI setup
const openaiLLM = ai({
  name: 'openai',
  apiKey: process.env.OPENAI_API_KEY!
});

// Anthropic with explicit model configuration
const anthropicLLM = ai({
  name: 'anthropic',
  apiKey: process.env.ANTHROPIC_API_KEY!,
  config: {
    model: 'claude-sonnet-4-20250514',
    maxTokens: 4096,
    temperature: 0.7
  }
});

// Google Gemini with model aliases
const geminiLLM = ai({
  name: 'google-gemini',
  apiKey: process.env.GOOGLE_API_KEY!,
  models: [
    { key: 'fast', model: 'gemini-2.0-flash' },
    { key: 'smart', model: 'gemini-1.5-pro' }
  ]
});
// Requests can now pass model: 'fast' or model: 'smart'

// Local Ollama instance (no API key required)
const ollamaLLM = ai({
  name: 'ollama',
  config: { model: 'llama3.2' }
});