AxOptimizer

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L962

Interface for optimizing AI programs through automated prompt tuning.

Optimizers improve program performance by finding optimal demonstrations (few-shot examples) that guide the AI toward better outputs. The optimization process:

  1. Runs the program on training examples
  2. Evaluates outputs using the metric function
  3. Selects high-scoring input/output pairs as demonstrations
  4. Iterates to find the best demonstration set

Example

const optimizer = new AxBootstrapFewShot({ maxBootstrappedDemos: 4 });

const result = await optimizer.compile(
  program,
  trainingExamples,
  ({ prediction, example }) => prediction.answer === example.expectedAnswer ? 1 : 0
);

// Apply optimized demos to program
program.setDemos(result.demos);

Methods

cancel()?

optional cancel(): Promise<void>;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1083

Cancel ongoing optimization gracefully

Returns

Promise<void>

Promise that resolves when cancellation is complete
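
A minimal sketch of graceful cancellation, assuming compile() is already running on the same optimizer instance; program, trainingExamples, and metricFn are the illustrative values from the examples elsewhere in this page, and the timeout is arbitrary:

const compilePromise = optimizer.compile(program, trainingExamples, metricFn);

// Cancel if optimization runs longer than a deadline (illustrative: 60 seconds)
const timer = setTimeout(() => {
  void optimizer.cancel?.(); // resolves once the optimizer has stopped gracefully
}, 60_000);

const result = await compilePromise;
clearTimeout(timer);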


compile()

compile<IN, OUT>(
  program: Readonly<AxGen<IN, OUT>>,
  examples: readonly AxTypedExample<IN>[],
  metricFn: AxMetricFn,
  options?: AxCompileOptions
): Promise<AxOptimizerResult<OUT>>;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1036

Optimize a program using the provided training examples and metric function.

The optimizer runs the program on the examples, scores the outputs, and builds a set of demonstrations that improve program performance. The process is automatic: you provide the data and the metric, and the optimizer does the rest.

Metric Function: The metric function evaluates how well the program’s output matches expectations. It receives the prediction and original example, and should return a score between 0 (worst) and 1 (best).

Type Parameters

| Type Parameter |
| -------------- |
| IN |
| OUT extends AxGenOut |

Parameters

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| program | Readonly<AxGen<IN, OUT>> | The AxGen program to optimize. The optimizer will call .forward() on this program during training. |
| examples | readonly AxTypedExample<IN>[] | Training examples with input values. Examples are automatically split into training and validation sets. Each example should have the input fields required by the program's signature. |
| metricFn | AxMetricFn | Function that scores program outputs. Called with prediction (the program's output for this example) and example (the original input example, useful if it contains expected outputs). Returns a number between 0 (bad) and 1 (perfect). |
| options? | AxCompileOptions | Optional configuration: ai (AI service to use, required if not set on the program), valSet (custom validation set, otherwise auto-split from examples), trainSplit (fraction of examples used for training, default 0.8). |

Returns

Promise<AxOptimizerResult<OUT>>

Promise resolving to the optimization result, including the selected demonstrations (demos) that can be applied to the program with setDemos().

Examples

// Example 1: exact-match metric for question answering
const result = await optimizer.compile(
  qa,
  examples,
  ({ prediction, example }) => {
    return prediction.answer.toLowerCase() === example.expectedAnswer.toLowerCase() ? 1 : 0;
  }
);

// Example 2: semantic-similarity metric for summarization
const result = await optimizer.compile(
  summarizer,
  examples,
  async ({ prediction, example }) => {
    const similarity = await computeCosineSimilarity(
      await embed(prediction.summary),
      await embed(example.referenceSummary)
    );
    return similarity; // 0-1 based on embedding similarity
  }
);

// Example 3: partial-credit metric for keyword extraction
const result = await optimizer.compile(
  extractor,
  examples,
  ({ prediction, example }) => {
    const expectedKeywords = example.keywords;
    const foundKeywords = prediction.keywords;
    const matches = expectedKeywords.filter(k => foundKeywords.includes(k));
    return matches.length / expectedKeywords.length; // Fraction of expected keywords found
  }
);
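
A hedged sketch of passing compile options: the option names ai and trainSplit follow the parameter description above, llm stands in for whatever AI service you have configured, and the exact AxCompileOptions fields may vary by version.

const result = await optimizer.compile(
  qa,
  examples,
  ({ prediction, example }) => (prediction.answer === example.expectedAnswer ? 1 : 0),
  {
    ai: llm,         // AI service to use, required if not set on the program
    trainSplit: 0.8, // fraction of examples used for training; the rest validate
  }
);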

compilePareto()?

optional compilePareto<IN, OUT>(
  program: Readonly<AxGen<IN, OUT>>,
  examples: readonly AxTypedExample<IN>[],
  metricFn: AxMultiMetricFn,
  options?: AxCompileOptions
): Promise<AxParetoResult<OUT>>;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1066

Multi-objective optimization using Pareto frontier

Type Parameters

| Type Parameter |
| -------------- |
| IN |
| OUT extends AxGenOut |

Parameters

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| program | Readonly<AxGen<IN, OUT>> | The program to optimize |
| examples | readonly AxTypedExample<IN>[] | Training examples |
| metricFn | AxMultiMetricFn | Multi-objective metric function |
| options? | AxCompileOptions | Optional configuration options |

Returns

Promise<AxParetoResult<OUT>>

Pareto optimization result
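
A sketch of a multi-objective run, assuming AxMultiMetricFn returns a record of named 0-1 scores (the exact shape may differ); computeCosineSimilarity and embed are the same helpers used in the compile() examples above:

const paretoResult = await optimizer.compilePareto?.(
  summarizer,
  examples,
  async ({ prediction, example }) => ({
    // Two competing objectives: stay faithful to the reference, stay short
    relevance: await computeCosineSimilarity(
      await embed(prediction.summary),
      await embed(example.referenceSummary)
    ),
    brevity: Math.max(0, 1 - prediction.summary.length / 1000),
  })
);

console.log(paretoResult); // Inspect the Pareto frontier of trade-offs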


compileStream()?

optional compileStream<IN, OUT>(
  program: Readonly<AxGen<IN, OUT>>,
  examples: readonly AxTypedExample<IN>[],
  metricFn: AxMetricFn,
  options?: AxCompileOptions
): AsyncIterableIterator<AxOptimizationProgress>;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1051

Optimize a program with real-time streaming updates

Type Parameters

| Type Parameter |
| -------------- |
| IN |
| OUT extends AxGenOut |

Parameters

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| program | Readonly<AxGen<IN, OUT>> | The program to optimize |
| examples | readonly AxTypedExample<IN>[] | Training examples |
| metricFn | AxMetricFn | Evaluation metric function |
| options? | AxCompileOptions | Optional configuration options |

Returns

AsyncIterableIterator<AxOptimizationProgress>

Async iterator yielding optimization progress
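
A sketch of consuming streaming progress; the fields on AxOptimizationProgress depend on the optimizer, so the progress object is simply logged here, and qa, examples, and metricFn are the same illustrative values used with compile() above:

if (optimizer.compileStream) {
  for await (const progress of optimizer.compileStream(qa, examples, metricFn)) {
    // Each yielded value reports the current state of the optimization run
    console.log('optimization progress:', progress);
  }
  // Statistics accumulate as the run proceeds
  console.log(optimizer.getStats());
}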


getConfiguration()?

optional getConfiguration(): Record<string, unknown>;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1094

Get optimizer-specific configuration

Returns

Record<string, unknown>

Current optimizer configuration


getStats()

getStats(): AxOptimizationStats;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1077

Get current optimization statistics

Returns

AxOptimizationStats

Current optimization statistics


reset()?

optional reset(): void;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1088

Reset optimizer state for reuse with different programs

Returns

void
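
A sketch of reusing one optimizer for two different programs; the program, example, and metric names are placeholders, and the optional-call guard covers optimizers that do not implement reset():

const qaResult = await optimizer.compile(qa, qaExamples, qaMetric);

// Clear accumulated state before optimizing a different program
optimizer.reset?.();

const summaryResult = await optimizer.compile(summarizer, summaryExamples, summaryMetric);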


updateConfiguration()?

optional updateConfiguration(config: Readonly<Record<string, unknown>>): void;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1100

Update optimizer configuration

Parameters

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| config | Readonly<Record<string, unknown>> | New configuration to merge with existing |

Returns

void
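
A sketch of reading and merging optimizer settings; configuration keys are optimizer-specific, so maxRounds below is only a hypothetical example:

// Read the current optimizer-specific settings (keys vary by optimizer)
const config = optimizer.getConfiguration?.() ?? {};
console.log(config);

// Merge new values into the existing configuration
optimizer.updateConfiguration?.({ maxRounds: 5 }); // 'maxRounds' is a hypothetical key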


validateProgram()?

optional validateProgram<IN, OUT>(program: Readonly<AxGen<IN, OUT>>): object;

Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/dsp/optimizer.ts#L1107

Validate that the optimizer can handle the given program

Type Parameters

| Type Parameter |
| -------------- |
| IN |
| OUT extends AxGenOut |

Parameters

| Parameter | Type | Description |
| --------- | ---- | ----------- |
| program | Readonly<AxGen<IN, OUT>> | Program to validate |

Returns

object

Validation result with any issues found

| Name | Type |
| ---- | ---- |
| issues | string[] |
| isValid | boolean |
| suggestions | string[] |
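
A sketch of validating before compiling, using the issues / isValid / suggestions shape documented above; qa, examples, and metricFn are the same illustrative values used earlier:

const check = optimizer.validateProgram?.(qa);

if (check && !check.isValid) {
  // Surface what the optimizer cannot handle and any recommended fixes
  console.warn('Cannot optimize this program:', check.issues);
  console.warn('Suggestions:', check.suggestions);
} else {
  const result = await optimizer.compile(qa, examples, metricFn);
  qa.setDemos(result.demos);
}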