output-sdk

    Function generateText

• Use an LLM to generate text.

      This function is a wrapper over the AI SDK's generateText. The prompt file sets model, messages, temperature, maxTokens, and providerOptions. Additional AI SDK options (tools, maxRetries, etc.) can be passed through.
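A minimal sketch of the basic call shape. The prompt file name "summarize" and its {{topic}}/{{maxWords}} placeholders are hypothetical, and generateText is stubbed locally so the snippet runs on its own; in real code it is imported from the SDK, which reads model, messages, and settings from the prompt file.

```typescript
// Hedged sketch: generateText is a local stand-in, not the SDK export.

type Vars = Record<string, string | number | boolean>;

async function generateText(args: {
  prompt: string; // prompt file name
  variables?: Vars; // values interpolated into the prompt template
}): Promise<{ text: string }> {
  // Stand-in behavior: fill {{name}} placeholders the way the prompt
  // file's template would be resolved before the model is called.
  const template = "Summarize {{topic}} in {{maxWords}} words."; // hypothetical prompt body
  const text = template.replace(/\{\{(\w+)\}\}/g, (_m: string, k: string) =>
    String(args.variables?.[k] ?? ""),
  );
  return { text };
}

generateText({
  prompt: "summarize",
  variables: { topic: "unit tests", maxWords: 50 },
}).then((r) => console.log(r.text)); // → Summarize unit tests in 50 words.
```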

      Type Parameters

      • Tools extends ToolSet = ToolSet
      • Output extends AIOutput<unknown, unknown> = AIOutput<unknown, unknown>

      Parameters

      • args: { prompt: string; variables?: Record<string, string | number | boolean> } & Partial<
            Omit<CallSettings, "maxOutputTokens">
        > & {
            activeTools?: (keyof Tools)[];
            maxSteps?: number;
            onStepFinish?: GenerateTextOnStepFinishCallback<Tools>;
            output?: Output;
            prepareStep?: PrepareStepFunction<Tools>;
            stopWhen?: StopCondition<Tools> | StopCondition<Tools>[];
            toolChoice?: ToolChoice<Tools>;
            tools?: Tools;
        }

        Generation arguments.

        • prompt: string

          Prompt file name.

        • Optional variables?: Record<string, string | number | boolean>

          Variables to interpolate.

        • Optional activeTools?: (keyof Tools)[]

          Limit which tools are active without changing types.

        • Optional maxSteps?: number

          Maximum number of automatic tool-execution rounds (multi-step).

        • Optional onStepFinish?: GenerateTextOnStepFinishCallback<Tools>

          Callback invoked after each step completes.

        • Optional output?: Output

          Structured output specification (e.g., Output.object({ schema })).

        • Optional prepareStep?: PrepareStepFunction<Tools>

          Customize each step before execution.

        • Optional stopWhen?: StopCondition<Tools> | StopCondition<Tools>[]

          Custom stop conditions for multi-step execution.

        • Optional toolChoice?: ToolChoice<Tools>

          Tool choice strategy: 'auto', 'none', 'required', or a specific tool.

        • Optional tools?: Tools

          Tools the model can call.

      Returns Promise<GenerateTextResult<Tools, Output>>

      AI SDK response with text and metadata.
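The multi-step options (tools, maxSteps, stopWhen) drive an internal loop: on each round the model either requests a tool call, whose result feeds the next round, or emits final text, which ends the run early. A self-contained sketch of that behavior, using local stubs rather than the SDK's real types:

```typescript
// Illustrative sketch of a multi-step tool loop -- not the SDK's actual
// implementation. All names here (Tool, fakeModel, runSteps) are local stubs.

type Tool = { execute: (args: Record<string, unknown>) => Promise<string> };

interface StepResult {
  toolCall?: { name: string; args: Record<string, unknown> };
  text?: string;
}

// Fake model: requests the weather tool once, then answers with text.
function fakeModel(step: number): StepResult {
  if (step === 0) {
    return { toolCall: { name: "getWeather", args: { city: "Oslo" } } };
  }
  return { text: "It is sunny in Oslo." };
}

async function runSteps(
  tools: Record<string, Tool>,
  maxSteps: number, // upper bound on automatic tool-execution rounds
): Promise<{ text: string; steps: number }> {
  let steps = 0;
  let text = "";
  while (steps < maxSteps) {
    const result = fakeModel(steps);
    steps++;
    if (result.toolCall) {
      // Execute the requested tool; its result would feed the next round.
      const tool = tools[result.toolCall.name];
      if (tool) await tool.execute(result.toolCall.args);
      continue;
    }
    text = result.text ?? "";
    break; // model produced final text: stop before maxSteps is reached
  }
  return { text, steps };
}

const tools: Record<string, Tool> = {
  getWeather: { execute: async () => "sunny" },
};

runSteps(tools, 5).then((r) => console.log(`${r.steps} steps: ${r.text}`));
// → 2 steps: It is sunny in Oslo.
```

stopWhen generalizes this loop's exit test: instead of only counting steps against maxSteps, each supplied condition is checked after a step, and any match ends the run.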