ai
```typescript
function ai<T>(options: T): AxAI<InferTModelKey<T>>;
```
Defined in: https://github.com/ax-llm/ax/blob/05ff5bd88d050f7ba85a3fcc6eb0ed2975ad7d51/src/ax/ai/wrap.ts#L192
Factory function for creating AI service instances with full type safety.
This is the recommended way to create AI instances. It automatically selects
the appropriate provider implementation based on the `name` field and provides
type-safe access to provider-specific models.
Supported Providers:
- `'openai'` - OpenAI (GPT-4, GPT-4o, o1, o3, etc.)
- `'openai-responses'` - OpenAI Responses API (for web search, file search)
- `'anthropic'` - Anthropic (Claude 3.5 Sonnet, Claude 3 Opus, etc.)
- `'google-gemini'` - Google (Gemini 1.5 Pro, Gemini 2.0 Flash, etc.)
- `'azure-openai'` - Azure OpenAI Service
- `'groq'` - Groq (Llama, Mixtral with fast inference)
- `'cohere'` - Cohere (Command R+, embeddings)
- `'mistral'` - Mistral AI (Mistral Large, Codestral)
- `'deepseek'` - DeepSeek (DeepSeek-V3, DeepSeek-R1)
- `'together'` - Together AI (various open models)
- `'openrouter'` - OpenRouter (unified API for many providers)
- `'ollama'` - Ollama (local models)
- `'huggingface'` - Hugging Face Inference API
- `'reka'` - Reka AI
- `'grok'` - xAI Grok
- `'webllm'` - WebLLM (browser-based inference)
See
- AxModelConfig for model configuration options
- AxAIServiceOptions for runtime options like streaming and function calling
Type Parameters
| Type Parameter |
|---|
| `T` extends `AxAIArgs<any>` |
Parameters
| Parameter | Type | Description |
|---|---|---|
| `options` | `T` | Provider-specific configuration. Must include `name` to identify the provider. |
Returns
`AxAI<InferTModelKey<T>>`

A configured AI service instance, ready for chat completions and embeddings.
Examples
```typescript
// Minimal setup: the provider is selected by the `name` field.
const llm = ai({
  name: 'openai',
  apiKey: process.env.OPENAI_API_KEY
});
```
```typescript
// Anthropic with an explicit model configuration.
const llm = ai({
  name: 'anthropic',
  apiKey: process.env.ANTHROPIC_API_KEY,
  config: {
    model: 'claude-sonnet-4-20250514',
    maxTokens: 4096,
    temperature: 0.7
  }
});
```
```typescript
// Register model aliases for type-safe selection.
const llm = ai({
  name: 'google-gemini',
  apiKey: process.env.GOOGLE_API_KEY,
  models: [
    { key: 'fast', model: 'gemini-2.0-flash' },
    { key: 'smart', model: 'gemini-1.5-pro' }
  ]
});
// Now use llm with model: 'fast' or model: 'smart'
```
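The per-alias type safety in the example above comes from `InferTModelKey`, which narrows the allowed `model` values to the union of `key` literals declared in `models`. A minimal, self-contained sketch of how such an inference can work (the `InferKey` type and `pickModel` helper below are illustrative stand-ins, not part of the ax API):

```typescript
// Illustrative sketch only: these types mimic how a factory can infer
// model keys from a `models` array; they are not the real ax internals.
type ModelEntry = { key: string; model: string };

// Extract the union of `key` literals from a readonly `models` array.
type InferKey<T> = T extends { models: readonly { key: infer K }[] }
  ? K
  : string;

function pickModel<T extends { models: readonly ModelEntry[] }>(
  options: T,
  key: InferKey<T>
): string {
  // Resolve the alias to the concrete model name at runtime.
  const entry = options.models.find((m) => m.key === key);
  if (!entry) throw new Error(`Unknown model key: ${String(key)}`);
  return entry.model;
}

const options = {
  models: [
    { key: 'fast', model: 'gemini-2.0-flash' },
    { key: 'smart', model: 'gemini-1.5-pro' }
  ]
} as const;

// `key` is narrowed to 'fast' | 'smart'; any other string is a compile error.
const model = pickModel(options, 'fast');
console.log(model); // 'gemini-2.0-flash'
```

The `as const` assertion is what preserves the literal `key` types; without it, the keys widen to `string` and the narrowing is lost.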
```typescript
// Local inference with Ollama; no API key needed.
const llm = ai({
  name: 'ollama',
  config: { model: 'llama3.2' }
});
```