Generation arguments:

prompt: string
    Prompt file name.
Optional variables?: Record<string, string | number | boolean>
    Variables to interpolate.
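As a rough illustration of how `variables` might be substituted into a prompt file, here is a minimal interpolation sketch. The `{{name}}` placeholder syntax and the `interpolate` helper are assumptions for illustration, not the library's documented format.

```typescript
// Hypothetical sketch: substitute `variables` into a prompt template.
// The {{name}} placeholder syntax is an assumption, not the library's
// documented format.
function interpolate(
  template: string,
  variables: Record<string, string | number | boolean>
): string {
  // Replace each {{key}} with String(variables[key]); leave unknown
  // placeholders untouched.
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in variables ? String(variables[key]) : match
  );
}
```

For example, `interpolate("Hello {{name}}, you have {{count}} messages.", { name: "Ada", count: 3 })` produces `"Hello Ada, you have 3 messages."`.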
Optional activeTools?: (keyof Tools)[]
    Limit which tools are active without changing types.
Optional maxSteps?: number
    Maximum number of automatic tool-execution rounds (multi-step).
Optional onStepFinish?: GenerateTextOnStepFinishCallback<Tools>
    Callback invoked after each step completes.
Optional output?: Output
    Structured output specification (e.g., Output.object({ schema })).
Optional prepareStep?: PrepareStepFunction<Tools>
    Customize each step before execution.
Optional stopWhen?: StopCondition<Tools> | StopCondition<Tools>[]
    Custom stop conditions for multi-step execution.
Optional toolChoice?: ToolChoice<Tools>
    Tool choice strategy: 'auto', 'none', 'required', or a specific tool.
Optional tools?: Tools
    Tools the model can call.
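To make the `stopWhen` shape concrete, here is a simplified sketch of how a single condition or an array of conditions could be evaluated after each step. The `Step` shape, the `stepCountIs` helper, and `shouldStop` are illustrative assumptions; the AI SDK's actual `StopCondition` type is richer.

```typescript
// Simplified stand-ins for illustration only; the AI SDK's real types
// carry much more per-step information.
type Step = { toolCalls: number };
type StopCondition = (steps: Step[]) => boolean;

// Assumed helper mirroring a step-count condition: stop once `n` steps
// have completed.
const stepCountIs = (n: number): StopCondition => (steps) => steps.length >= n;

// Accept either one condition or an array; stop when ANY condition matches.
function shouldStop(
  conditions: StopCondition | StopCondition[],
  steps: Step[]
): boolean {
  const list = Array.isArray(conditions) ? conditions : [conditions];
  return list.some((condition) => condition(steps));
}
```

Passing `stopWhen: stepCountIs(3)` under this model would end the loop after three steps, while an array combines several conditions with or-semantics.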
Returns: AI SDK response with text and metadata.
Use an LLM to generate text.

This function is a wrapper over the AI SDK's generateText. The prompt file sets model, messages, temperature, maxTokens, and providerOptions. Additional AI SDK options (tools, maxRetries, etc.) can be passed through.
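A possible argument object for this wrapper, matching the parameter list above. The wrapper's export name and the prompt file name are not shown in this excerpt, so no actual call is made here; every concrete value is a hypothetical placeholder.

```typescript
// Hypothetical argument object; property names follow the parameter
// list documented above. "summarize.prompt" and the variable names are
// placeholders, not values from the source.
const args = {
  prompt: "summarize.prompt",            // prompt file name
  variables: { topic: "type systems" },  // interpolated into the prompt file
  maxSteps: 3,                           // up to three automatic tool rounds
  toolChoice: "auto" as const,           // let the model decide when to call tools
};
```

The prompt file supplies model, messages, temperature, maxTokens, and providerOptions, so only call-specific options appear here; anything else is passed through to the AI SDK's generateText.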