OpenAIAudioToTextExecutionSettings Class

Definition

Execution settings for an OpenAI audio-to-text request.

C#
public sealed class OpenAIAudioToTextExecutionSettings : Microsoft.SemanticKernel.PromptExecutionSettings

F#
type OpenAIAudioToTextExecutionSettings = class
    inherit PromptExecutionSettings

VB
Public NotInheritable Class OpenAIAudioToTextExecutionSettings
Inherits PromptExecutionSettings
Inheritance
Object → PromptExecutionSettings → OpenAIAudioToTextExecutionSettings
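
A minimal usage sketch: the settings are constructed and passed to an OpenAI audio-to-text service when requesting a transcription. The OpenAIAudioToTextService, AudioContent, and GetTextContentAsync names below are assumptions based on the Microsoft.SemanticKernel.Connectors.OpenAI package and may differ between Semantic Kernel versions.

using System;
using System.IO;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.AudioToText;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Service and content types below are assumptions; verify against your package version.
IAudioToTextService service = new OpenAIAudioToTextService("whisper-1", "<your-api-key>");

// Configure the audio-to-text request.
var settings = new OpenAIAudioToTextExecutionSettings("meeting.mp3")
{
    Language = "en",
    Temperature = 0
};

// Load the audio and request a transcription with these settings.
byte[] audioBytes = await File.ReadAllBytesAsync("meeting.mp3");
var audio = new AudioContent(audioBytes, "audio/mpeg");

TextContent transcript = await service.GetTextContentAsync(audio, settings);
Console.WriteLine(transcript.Text);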

Constructors

OpenAIAudioToTextExecutionSettings()

Creates an instance of the OpenAIAudioToTextExecutionSettings class with the default filename "file.mp3".

OpenAIAudioToTextExecutionSettings(String)

Creates an instance of the OpenAIAudioToTextExecutionSettings class.
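
The two constructors differ only in how Filename is initialized; a short sketch (assuming the String parameter supplies the filename, as the default constructor's behavior suggests):

// Parameterless constructor: Filename defaults to "file.mp3".
var defaultSettings = new OpenAIAudioToTextExecutionSettings();

// String constructor: associate the settings with a specific audio file.
var namedSettings = new OpenAIAudioToTextExecutionSettings("interview.wav");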

Properties

ExtensionData

Extra properties that may be included in the serialized execution settings.

(Inherited from PromptExecutionSettings)
Filename

Filename or identifier associated with the audio data. Should be in the format {filename}.{extension}.

FunctionChoiceBehavior

Gets or sets the behavior defining the way functions are chosen by the LLM and how they are invoked by AI connectors.

(Inherited from PromptExecutionSettings)
IsFrozen

Gets a value that indicates whether the PromptExecutionSettings are currently modifiable.

(Inherited from PromptExecutionSettings)
Language

An optional language of the audio data as two-letter ISO-639-1 language code (e.g. 'en' or 'es').

ModelId

Model identifier. This identifies the AI model these settings are configured for (e.g., gpt-4, gpt-3.5-turbo).

(Inherited from PromptExecutionSettings)
Prompt

Optional text to guide the model's style or to continue a previous audio segment. The prompt should match the audio language.

ResponseFormat

The format of the transcript output, in one of these options: json, srt, verbose_json, or vtt. Default is 'json'.

ServiceId

Service identifier. This identifies the service these settings are configured for (e.g., azure_openai_eastus, openai, ollama, huggingface).

(Inherited from PromptExecutionSettings)
Temperature

The sampling temperature, between 0 and 1. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic. If set to 0, the model will use log probability to automatically increase the temperature until certain thresholds are hit. Default is 0.
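
Taken together, the properties above describe one transcription request. A configuration sketch (property types assumed from the current connector; the ResponseFormat assignment is omitted because its exact type varies between package versions):

var settings = new OpenAIAudioToTextExecutionSettings("podcast.mp3")
{
    Language = "en",                                   // ISO-639-1 code of the spoken language
    Prompt = "Transcript of a .NET podcast episode.",  // guides style / continues a prior segment
    Temperature = 0                                    // most deterministic output
    // ResponseFormat chooses json, srt, verbose_json, or vtt (default: json).
};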

Methods

Clone()

Creates a new PromptExecutionSettings object that is a copy of the current instance.

Freeze()

Makes the current PromptExecutionSettings unmodifiable and sets its IsFrozen property to true.

(Inherited from PromptExecutionSettings)
FromExecutionSettings(PromptExecutionSettings)

Converts a PromptExecutionSettings instance to the derived OpenAIAudioToTextExecutionSettings type.

ThrowIfFrozen()

Throws an InvalidOperationException if the PromptExecutionSettings are frozen.

(Inherited from PromptExecutionSettings)
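
A sketch of the method surface. FromExecutionSettings is assumed to be static here (as in the Semantic Kernel OpenAI connector); the Freeze/Clone interplay follows the inherited PromptExecutionSettings contract described above.

// Convert generic settings (e.g., loaded from configuration) to the audio-to-text type.
PromptExecutionSettings generic = new() { ModelId = "whisper-1" };
var audioSettings = OpenAIAudioToTextExecutionSettings.FromExecutionSettings(generic);

// Freeze to prevent further changes; ThrowIfFrozen guards any later property setter.
audioSettings.Freeze();

// Clone produces an unfrozen copy that can still be modified.
var editable = (OpenAIAudioToTextExecutionSettings)audioSettings.Clone();
editable.Language = "es";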
