output-sdk

    Function generateEnum

    • Use an LLM to generate a result constrained to one of the values in an enum (an array of string values).

      Type Parameters

      • const TEnum extends readonly string[]

      Parameters

      • args: {
            enum: TEnum;
            prompt: string;
            variables?: Record<string, string | number | boolean>;
        } & Partial<Omit<CallSettings, "maxOutputTokens">>

        Generation arguments.

        • enum: TEnum

          Allowed values for the generation.

        • prompt: string

          Prompt file name.

        • Optional variables?: Record<string, string | number | boolean>

          Variables to interpolate.

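      The `variables` option implies simple placeholder substitution into the named prompt file. A minimal standalone sketch of such interpolation, assuming a `{{name}}` placeholder syntax (the syntax and the `interpolate` helper are illustrative assumptions, not documented by this SDK):

      ```typescript
      // Hypothetical sketch: substitute `variables` into a prompt template.
      // The {{name}} placeholder syntax is an assumption, not part of this SDK.
      function interpolate(
        template: string,
        variables: Record<string, string | number | boolean> = {},
      ): string {
        // Replace each {{key}} with its stringified value; leave unknown keys intact.
        return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
          key in variables ? String(variables[key]) : match,
        );
      }

      console.log(interpolate('Is {{city}} sunny today?', { city: 'Lisbon' }));
      // → Is Lisbon sunny today?
      ```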

      Returns Promise<GenerateObjectResult<TEnum[number]>>

      AI SDK response with enum value and metadata.
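      Because `TEnum` is declared as a `const` type parameter, a literal array argument keeps its literal element types, so the resolved `TEnum[number]` is a union of the allowed strings rather than plain `string`. A self-contained sketch of that narrowing (the `pickFirst` helper is illustrative, not part of this SDK):

      ```typescript
      // Sketch of why the result narrows to TEnum[number]: the `const` type
      // parameter preserves the literal types of the array elements.
      // `pickFirst` is an illustrative stand-in, not an SDK function.
      function pickFirst<const TEnum extends readonly string[]>(values: TEnum): TEnum[number] {
        return values[0];
      }

      // `answer` is typed "yes" | "no" | "maybe", not string.
      const answer = pickFirst(['yes', 'no', 'maybe']);
      console.log(answer);
      // → yes
      ```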

      Deprecated since v0.3.0. Use generateText() with Output.choice({ options }) instead. Will be removed in v1.0.0.

      // Before (deprecated):
      const { object } = await generateEnum({ prompt: 'my_prompt', enum: ['yes', 'no', 'maybe'] });

      // After (recommended):
      const { output } = await generateText({ prompt: 'my_prompt', output: Output.choice({ options: ['yes', 'no', 'maybe'] }) });

      See generateText for the recommended replacement.