MistralAIPromptExecutionSettings.MaxTokens Property

Definition

The maximum number of tokens to generate in the completion. Default: null.

[System.Text.Json.Serialization.JsonPropertyName("max_tokens")]
public int? MaxTokens { get; set; }

[<System.Text.Json.Serialization.JsonPropertyName("max_tokens")>]
member this.MaxTokens : Nullable<int> with get, set

Public Property MaxTokens As Nullable(Of Integer)

Property Value

Nullable<Int32>

Attributes

JsonPropertyNameAttribute

Remarks

The token count of your prompt plus max_tokens cannot exceed the model's context length.
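Example

A minimal C# sketch of setting this property. The construction of the settings object directly and the token limit value of 500 are illustrative assumptions, not part of this reference.

using Microsoft.SemanticKernel.Connectors.MistralAI;

// Cap the completion at 500 tokens (an illustrative value); the setting
// is serialized to the request body as "max_tokens" per the attribute above.
var settings = new MistralAIPromptExecutionSettings
{
    MaxTokens = 500
};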
