
AxAIOpenAIResponsesRequest

Defined in: https://github.com/ax-llm/ax/blob/9a5a7060a48f9eef46efc680b0cdf6b42bff5df2/src/ax/ai/openai/responses_types.ts#L163

Type Parameters

| Type Parameter | Default type |
| ------ | ------ |
| `TModel` | `AxAIOpenAIResponsesModel` |

Properties

| Property | Modifier | Type |
| ------ | ------ | ------ |
| `background?` | `readonly` | `null \| boolean` |
| `include?` | `readonly` | `null \| readonly ("file_search_call.results" \| "message.input_image.image_url" \| "computer_call_output.output.image_url" \| "reasoning.encrypted_content" \| "code_interpreter_call.outputs")[]` |
| `input` | `readonly` | `string \| readonly AxAIOpenAIResponsesInputItem[]` |
| `instructions?` | `readonly` | `null \| string` |
| `max_output_tokens?` | `readonly` | `null \| number` |
| `metadata?` | `readonly` | `null \| Readonly<Record<string, string>>` |
| `model` | `readonly` | `TModel` |
| `parallel_tool_calls?` | `readonly` | `null \| boolean` |
| `previous_response_id?` | `readonly` | `null \| string` |
| `reasoning?` | `readonly` | `null \| { effort?: null \| "high" \| "low" \| "medium"; summary?: null \| "auto" \| "concise" \| "detailed"; }` |
| `seed?` | `readonly` | `null \| number` |
| `service_tier?` | `readonly` | `null \| "auto" \| "default" \| "flex"` |
| `store?` | `readonly` | `null \| boolean` |
| `stream?` | `readonly` | `null \| boolean` |
| `temperature?` | `readonly` | `null \| number` |
| `text?` | `readonly` | `null \| { format?: null \| { type: "text"; } \| { type: "json_object"; } \| { json_schema?: object; type: "json_schema"; }; }` |
| `tool_choice?` | `readonly` | `null \| AxAIOpenAIResponsesToolChoice` |
| `tools?` | `readonly` | `null \| readonly AxAIOpenAIResponsesDefineFunctionTool[]` |
| `top_p?` | `readonly` | `null \| number` |
| `truncation?` | `readonly` | `null \| "auto" \| "disabled"` |
| `user?` | `readonly` | `null \| string` |
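As a sketch of how these properties combine, the snippet below builds a request-shaped object. The interface here is a local stand-in that mirrors a subset of the documented fields (in a real project you would use the type exported by the ax package, and `AxAIOpenAIResponsesModel` is simplified to `string`); only `model` and `input` are required, and every optional field also admits `null`.

```typescript
// Local stand-in mirroring a subset of the documented
// AxAIOpenAIResponsesRequest properties (assumption: the real type is
// imported from the ax package; the model union is simplified to string).
type AxAIOpenAIResponsesModel = string;

interface AxAIOpenAIResponsesRequest<TModel = AxAIOpenAIResponsesModel> {
  readonly model: TModel;
  readonly input: string | readonly unknown[];
  readonly instructions?: null | string;
  readonly max_output_tokens?: null | number;
  readonly temperature?: null | number;
  readonly stream?: null | boolean;
  readonly reasoning?: null | {
    effort?: null | "high" | "low" | "medium";
    summary?: null | "auto" | "concise" | "detailed";
  };
}

// Only `model` and `input` are required; the rest are optional and nullable.
const request: AxAIOpenAIResponsesRequest = {
  model: "gpt-4o-mini",
  input: "Summarize the latest build logs.",
  instructions: "Answer in one short paragraph.",
  max_output_tokens: 256,
  temperature: 0.2,
  stream: true,
  reasoning: { effort: "low", summary: "concise" },
};

console.log(request.model);
```

Because all properties are `readonly`, the object is treated as immutable once constructed, which matches how the request is handed off to the provider in one piece.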