GeminiPromptExecutionSettings Class

Definition

Represents the settings for executing a prompt with the Gemini model.

C#
[System.Text.Json.Serialization.JsonNumberHandling(System.Text.Json.Serialization.JsonNumberHandling.AllowReadingFromString)]
public sealed class GeminiPromptExecutionSettings : Microsoft.SemanticKernel.PromptExecutionSettings

F#
[<System.Text.Json.Serialization.JsonNumberHandling(System.Text.Json.Serialization.JsonNumberHandling.AllowReadingFromString)>]
type GeminiPromptExecutionSettings = class
    inherit PromptExecutionSettings

Visual Basic
Public NotInheritable Class GeminiPromptExecutionSettings
Inherits PromptExecutionSettings
Inheritance
PromptExecutionSettings → GeminiPromptExecutionSettings
Attributes
JsonNumberHandlingAttribute

Constructors

GeminiPromptExecutionSettings()

Properties

CandidateCount

The count of candidates. Possible values range from 1 to 8.

DefaultTextMaxTokens

Default max tokens for a text generation.

ExtensionData

Extra properties that may be included in the serialized execution settings.

(Inherited from PromptExecutionSettings)
FunctionChoiceBehavior

Gets or sets the behavior defining the way functions are chosen by LLM and how they are invoked by AI connectors.

(Inherited from PromptExecutionSettings)
IsFrozen

Gets a value that indicates whether the PromptExecutionSettings are currently modifiable.

(Inherited from PromptExecutionSettings)
MaxTokens

The maximum number of tokens to generate in the completion.

ModelId

Model identifier. This identifies the AI model these settings are configured for, e.g., gpt-4, gpt-3.5-turbo.

(Inherited from PromptExecutionSettings)
SafetySettings

Represents a list of safety settings.

ServiceId

Service identifier. This identifies the service these settings are configured for, e.g., azure_openai_eastus, openai, ollama, huggingface, etc.

(Inherited from PromptExecutionSettings)
StopSequences

Sequences where the completion will stop generating further tokens. Maximum number of stop sequences is 5.

Temperature

Temperature controls the randomness of the completion. The higher the temperature, the more random the completion. Range is 0.0 to 1.0.

ToolCallBehavior

Gets or sets the behavior for how tool calls are handled.

TopK

Gets or sets the TopK value. TopK limits token selection during generation: at each step, the model samples only from the K most probable tokens.

TopP

TopP controls the diversity of the completion. The higher the TopP, the more diverse the completion.
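To show how the properties above fit together, here is a hedged C# sketch that constructs the settings and configures the generation-related members listed on this page. Property names come from this reference; the specific values are illustrative only, within the ranges each description notes:

```csharp
using Microsoft.SemanticKernel.Connectors.Google;

// Sketch: configure Gemini-specific execution settings.
// Values are illustrative; see each property's description for valid ranges.
var settings = new GeminiPromptExecutionSettings
{
    Temperature = 0.7,                // 0.0–1.0: higher means more random output
    TopP = 0.9,                       // nucleus-sampling diversity cutoff
    TopK = 40,                        // sample from the 40 most probable tokens
    MaxTokens = 256,                  // cap on tokens generated in the completion
    CandidateCount = 1,               // number of candidates, 1–8
    StopSequences = new[] { "\n\n" }  // at most 5 stop sequences
};
```

These settings can then be passed wherever a `PromptExecutionSettings` instance is accepted, since the class derives from it.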

Methods

Clone()

Creates a new PromptExecutionSettings object that is a copy of the current instance.

Freeze()

Makes the current PromptExecutionSettings unmodifiable and sets its IsFrozen property to true.

FromExecutionSettings(PromptExecutionSettings)

Converts a PromptExecutionSettings object to a GeminiPromptExecutionSettings object.

ThrowIfFrozen()

Throws an InvalidOperationException if the PromptExecutionSettings are frozen.

(Inherited from PromptExecutionSettings)
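A brief sketch of the lifecycle methods above, assuming the behavior described on this page: Freeze() makes the instance read-only, Clone() returns a modifiable copy, and FromExecutionSettings converts generic settings. The model identifier shown is a hypothetical example value:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.Google;

// Convert generic settings into Gemini-specific settings.
PromptExecutionSettings generic = new() { ModelId = "gemini-1.5-pro" }; // example id
GeminiPromptExecutionSettings gemini =
    GeminiPromptExecutionSettings.FromExecutionSettings(generic);

gemini.Freeze();                      // IsFrozen is now true
// gemini.Temperature = 0.5;          // would throw InvalidOperationException

// Clone() returns PromptExecutionSettings, so cast to the derived type.
var copy = (GeminiPromptExecutionSettings)gemini.Clone();
copy.Temperature = 0.5;               // allowed: the clone is modifiable
```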

Applies to