Documentation

Build LLM-powered agents
with production-ready TypeScript

DSPy for TypeScript. Working with LLMs directly is unpredictable: they don't always do what you want. DSPy makes it easier to build reliable applications with LLMs. Define your inputs and outputs (a signature), and an efficient prompt is generated and used automatically. Connect signatures together to build complex LLM-powered systems and workflows.

15+ LLM Providers
End-to-end Streaming
Auto Prompt Tuning

Quick Start Guide

This guide will get you from zero to your first AI application in 5 minutes.

Prerequisites

You'll need Node.js installed and an API key for a supported LLM provider (this guide uses OpenAI).

Installation

npm install @ax-llm/ax

Step 1: Set Up Your API Key

Create a .env file in your project root:

OPENAI_APIKEY=your-api-key-here

Or export it in your terminal:

export OPENAI_APIKEY=your-api-key-here
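Note that Node.js does not read .env files by default, so if you use one you'll need to load it yourself (for example with the dotenv package, or Node 20.6+'s --env-file flag). Either way, it's worth failing fast when the key is missing rather than passing an undefined value to the provider; a minimal sketch (requireEnv is an illustrative helper, not part of the library):

```typescript
// Illustrative helper: fail fast with a clear message instead of
// passing an undefined API key to the provider.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

You could then write `apiKey: requireEnv("OPENAI_APIKEY")` when initializing the provider.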

Step 2: Your First AI Program

Create a file called hello-ai.ts:

import { ai, ax } from "@ax-llm/ax";

// Initialize your AI provider
const llm = ai({ 
  name: "openai", 
  apiKey: process.env.OPENAI_APIKEY! 
});

// Create a simple classifier
const sentimentAnalyzer = ax(
  'reviewText:string -> sentiment:class "positive, negative, neutral"'
);

// Use it!
async function analyze() {
  const result = await sentimentAnalyzer.forward(llm, {
    reviewText: "This product exceeded all my expectations!"
  });
  
  console.log(`Sentiment: ${result.sentiment}`);
}

analyze();

Step 3: Run Your Program

npx tsx hello-ai.ts

You should see output like:

Sentiment: positive

What Just Happened?

  1. No prompt engineering - You didn’t write any prompts, just described what you wanted
  2. Type safety - TypeScript knows that result.sentiment is one of your three classes
  3. Automatic optimization - The framework generated an optimal prompt for you
  4. Provider agnostic - This same code works with Claude, Gemini, or any other LLM
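Point 2 is worth seeing in action: because a class field narrows to a string union, TypeScript can check that you handle every case. A minimal sketch (the Sentiment alias is illustrative; in practice the framework infers the type from the signature for you):

```typescript
// Illustrative union mirroring the three classes in the signature above.
type Sentiment = "positive" | "negative" | "neutral";

// TypeScript verifies this switch is exhaustive: add a fourth class to
// the union and the compiler will flag the missing case.
function describeSentiment(sentiment: Sentiment): string {
  switch (sentiment) {
    case "positive":
      return "Happy customer";
    case "negative":
      return "Needs follow-up";
    case "neutral":
      return "No action needed";
  }
}
```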

Next: Add Streaming

Want to see results as they're generated? Pass one extra option:

const result = await sentimentAnalyzer.forward(
  llm, 
  { reviewText: "Great product!" },
  { stream: true }  // ← Enable streaming
);

Next: Multi-Modal (Images)

Work with images just as easily:

import fs from "fs";

const imageAnalyzer = ax(
  'photo:image, question:string -> answer:string'
);

const imageData = fs.readFileSync("photo.jpg").toString("base64");

const result = await imageAnalyzer.forward(llm, {
  photo: { mimeType: "image/jpeg", data: imageData },
  question: "What's in this image?"
});
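Building the { mimeType, data } object by hand gets repetitive; a small helper you could write yourself (toImageField is illustrative, not part of the library):

```typescript
// Illustrative helper: wrap raw image bytes in the
// { mimeType, data } shape that image fields expect.
function toImageField(mimeType: string, bytes: Buffer) {
  return { mimeType, data: bytes.toString("base64") };
}
```

With it, the photo input above becomes `photo: toImageField("image/jpeg", fs.readFileSync("photo.jpg"))`.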

Next: Complex Workflows

Build multi-step processes:

const documentProcessor = ax(`
  documentText:string -> 
  summary:string "2-3 sentences",
  keyPoints:string[] "main points",
  sentiment:class "positive, negative, neutral"
`);

const result = await documentProcessor.forward(llm, {
  documentText: "Your long document here..."
});

console.log(`Summary: ${result.summary}`);
console.log(`Key Points: ${result.keyPoints.join(", ")}`);
console.log(`Sentiment: ${result.sentiment}`);

Using Different Providers

OpenAI

const llm = ai({ 
  name: "openai", 
  apiKey: process.env.OPENAI_APIKEY!,
  config: { model: "gpt-4o" }  // Optional: specify model
});

Anthropic Claude

const llm = ai({ 
  name: "anthropic", 
  apiKey: process.env.ANTHROPIC_APIKEY!,
  config: { model: "claude-3-5-sonnet-20241022" }
});

Google Gemini

const llm = ai({ 
  name: "google-gemini", 
  apiKey: process.env.GOOGLE_APIKEY!,
  config: { model: "gemini-1.5-pro" }
});

Local Ollama

const llm = ai({ 
  name: "ollama",
  config: { model: "llama3.2" }
});

Field Types Reference

Type       Example                     Description
string     name:string                 Text input/output
number     score:number                Numeric values
boolean    isValid:boolean             True/false
class      category:class "a,b,c"      Enumeration
string[]   tags:string[]               Array of strings
json       data:json                   Any JSON object
image      photo:image                 Image input
audio      recording:audio             Audio input
date       dueDate:date                Date value
?          notes?:string               Optional field
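These types compose within a single signature. A hypothetical support-ticket triage, for example (the field names here are illustrative):

```typescript
// Hypothetical signature string combining several field types:
// a required string input, an optional image input, a class output,
// a string-array output, and an optional date output.
const ticketTriageSignature =
  'message:string, screenshot?:image -> ' +
  'priority:class "low, medium, high", tags:string[], followUpDate?:date';
```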

Common Patterns

Classification

const classifier = ax(
  'text:string -> category:class "option1, option2, option3"'
);

Extraction

const extractor = ax(
  'document:string -> names:string[], dates:date[], amounts:number[]'
);

Question Answering

const qa = ax(
  'context:string, question:string -> answer:string'
);

Translation

const translator = ax(
  'text:string, targetLanguage:string -> translation:string'
);

Error Handling

try {
  const result = await gen.forward(llm, input);
  console.log(result);
} catch (error) {
  // Provider, network, and validation errors all surface here.
  console.error("Generation failed:", error);
}
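Transient provider errors (rate limits, timeouts) are common in production, so you may want to retry before giving up. A minimal sketch you could wrap around forward (withRetries is an illustrative helper, and the backoff numbers are arbitrary):

```typescript
// Illustrative helper: retry a failing async call with exponential
// backoff, rethrowing the last error once attempts are exhausted.
async function withRetries<T>(
  fn: () => Promise<T>,
  maxAttempts = 3,
  baseDelayMs = 500,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (error) {
      lastError = error;
      if (attempt < maxAttempts - 1) {
        // Exponential backoff: 500 ms, 1000 ms, ...
        await new Promise((resolve) =>
          setTimeout(resolve, baseDelayMs * 2 ** attempt)
        );
      }
    }
  }
  throw lastError;
}
```

Usage would look like `const result = await withRetries(() => gen.forward(llm, input));`.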

Debug Mode

See what’s happening under the hood:

const llm = ai({ 
  name: "openai", 
  apiKey: process.env.OPENAI_APIKEY!,
  options: { debug: true }  // Enable debug logging
});

What’s Next?

Now that you have the basics:

  1. Explore Examples - Check out the examples directory for real-world patterns
  2. Learn DSPy Concepts - Understand signatures and the ideas behind DSPy
  3. Build Workflows - Create complex systems with AxFlow
  4. Optimize Performance - Make your programs smarter with optimization
  5. Add Observability - Monitor production apps with telemetry

Remember: You’re not writing prompts, you’re declaring capabilities. Let the framework handle the complexity while you focus on building.