EmbeddingsClient Class

Definition

The Embeddings service client.

C#
public class EmbeddingsClient

F#
type EmbeddingsClient = class

VB
Public Class EmbeddingsClient
Inheritance
Object → EmbeddingsClient

Constructors

EmbeddingsClient()

Initializes a new instance of EmbeddingsClient for mocking.

EmbeddingsClient(Uri, AzureKeyCredential, AzureAIInferenceClientOptions)

Initializes a new instance of EmbeddingsClient.

EmbeddingsClient(Uri, AzureKeyCredential)

Initializes a new instance of EmbeddingsClient.
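
For example, the AzureKeyCredential overloads above construct a client from an endpoint and an API key, with or without explicit client options. The endpoint URL and key below are placeholders, so this is a minimal sketch rather than a complete sample.

using System;
using Azure;
using Azure.AI.Inference;

// Placeholder endpoint and key; substitute the values for your own deployment.
var endpoint = new Uri("https://your-resource.services.ai.azure.com/models");
var credential = new AzureKeyCredential("your-api-key");

// Endpoint + API key.
var client = new EmbeddingsClient(endpoint, credential);

// The same, passing explicit client options.
var clientWithOptions = new EmbeddingsClient(
    endpoint,
    credential,
    new AzureAIInferenceClientOptions());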

EmbeddingsClient(Uri, TokenCredential, AzureAIInferenceClientOptions)

Initializes a new instance of EmbeddingsClient.

EmbeddingsClient(Uri, TokenCredential)

Initializes a new instance of EmbeddingsClient.
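
The TokenCredential overloads above accept a Microsoft Entra ID credential instead of an API key. A minimal sketch, assuming the Azure.Identity package for DefaultAzureCredential and a placeholder endpoint:

using System;
using Azure.AI.Inference;
using Azure.Identity;

var endpoint = new Uri("https://your-resource.services.ai.azure.com/models"); // placeholder

// Endpoint + Entra ID credential.
var client = new EmbeddingsClient(endpoint, new DefaultAzureCredential());

// The same, passing explicit client options.
var clientWithOptions = new EmbeddingsClient(
    endpoint,
    new DefaultAzureCredential(),
    new AzureAIInferenceClientOptions());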

Properties

Pipeline

The HTTP pipeline for sending and receiving REST requests and responses.

Methods

Embed(EmbeddingsOptions, CancellationToken)

Returns the embedding vectors for the given text prompts. The method makes a REST API call to the /embeddings route on the given endpoint.
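
A minimal sketch of this convenience overload: build an EmbeddingsOptions from the input strings and read the vectors from the returned EmbeddingsResult. The endpoint and key are placeholders, and reading each EmbeddingItem.Embedding back as a list of floats assumes the default floating-point output format.

using System;
using System.Collections.Generic;
using Azure;
using Azure.AI.Inference;

var client = new EmbeddingsClient(
    new Uri("https://your-resource.services.ai.azure.com/models"), // placeholder
    new AzureKeyCredential("your-api-key"));                       // placeholder

// One embedding vector is returned per input string.
var options = new EmbeddingsOptions(new List<string> { "first phrase", "second phrase" });

Response<EmbeddingsResult> response = client.Embed(options);

foreach (EmbeddingItem item in response.Value.Data)
{
    // Each embedding is returned as JSON; read it back as a list of floats.
    List<float> vector = item.Embedding.ToObjectFromJson<List<float>>();
    Console.WriteLine($"Index {item.Index}: {vector.Count} dimensions");
}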

Embed(RequestContent, String, RequestContext)

[Protocol Method] Returns the embedding vectors for the given text prompts. The method makes a REST API call to the /embeddings route on the given endpoint.
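
A sketch of this protocol overload under the same placeholder assumptions: the request body is passed as raw RequestContent shaped like the REST /embeddings payload, the raw Response is parsed with System.Text.Json, and the remaining string and RequestContext arguments are simply passed as null here.

using System;
using System.Text.Json;
using Azure;
using Azure.AI.Inference;
using Azure.Core;

var client = new EmbeddingsClient(
    new Uri("https://your-resource.services.ai.azure.com/models"), // placeholder
    new AzureKeyCredential("your-api-key"));                       // placeholder

// Request body shaped like the REST /embeddings payload.
RequestContent content = RequestContent.Create(new
{
    input = new[] { "first phrase", "second phrase" },
});

Response response = client.Embed(content, null, null);

using JsonDocument result = JsonDocument.Parse(response.ContentStream);
foreach (JsonElement item in result.RootElement.GetProperty("data").EnumerateArray())
{
    Console.WriteLine($"index {item.GetProperty("index").GetInt32()}");
}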

EmbedAsync(EmbeddingsOptions, CancellationToken)

Returns the embedding vectors for the given text prompts. The method makes a REST API call to the /embeddings route on the given endpoint.
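
The asynchronous counterpart mirrors the synchronous overload; a brief sketch under the same placeholder assumptions:

using System;
using System.Collections.Generic;
using Azure;
using Azure.AI.Inference;

var client = new EmbeddingsClient(
    new Uri("https://your-resource.services.ai.azure.com/models"), // placeholder
    new AzureKeyCredential("your-api-key"));                       // placeholder

var options = new EmbeddingsOptions(new List<string> { "first phrase", "second phrase" });

Response<EmbeddingsResult> response = await client.EmbedAsync(options);
Console.WriteLine($"Received {response.Value.Data.Count} embedding vectors.");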

EmbedAsync(RequestContent, String, RequestContext)

[Protocol Method] Returns the embedding vectors for the given text prompts. The method makes a REST API call to the /embeddings route on the given endpoint.

GetModelInfo(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.
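
A minimal sketch, assuming the client targets a Serverless API or Managed Compute endpoint as noted above, and assuming ModelInfo exposes ModelName and ModelProviderName:

using System;
using Azure;
using Azure.AI.Inference;

// Placeholder endpoint and key for a Serverless API or Managed Compute deployment.
var client = new EmbeddingsClient(
    new Uri("https://your-resource.services.ai.azure.com/models"),
    new AzureKeyCredential("your-api-key"));

Response<ModelInfo> info = client.GetModelInfo();
Console.WriteLine($"Model name: {info.Value.ModelName}");
Console.WriteLine($"Model provider: {info.Value.ModelProviderName}");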

GetModelInfo(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

GetModelInfoAsync(CancellationToken)

Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.
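
The asynchronous form, under the same placeholder assumptions:

using System;
using Azure;
using Azure.AI.Inference;

var client = new EmbeddingsClient(
    new Uri("https://your-resource.services.ai.azure.com/models"), // placeholder
    new AzureKeyCredential("your-api-key"));                       // placeholder

Response<ModelInfo> info = await client.GetModelInfoAsync();
Console.WriteLine(info.Value.ModelName);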

GetModelInfoAsync(RequestContext)

[Protocol Method] Returns information about the AI model. The method makes a REST API call to the /info route on the given endpoint. This method works only with Serverless API or Managed Compute endpoints; it does not work with GitHub Models or Azure OpenAI endpoints.

Applies to