Get started with the Azure OpenAI security building block

This article shows you how to create and use the Azure OpenAI security building block sample. The purpose is to demonstrate Azure OpenAI account provisioning with role-based access control (RBAC) for keyless (Microsoft Entra ID) authentication to Azure OpenAI. This chat app sample also includes all the infrastructure and configuration needed to provision Azure OpenAI resources and deploy the app to Azure Container Apps using the Azure Developer CLI.

By following the instructions in this article, you will:

  • Deploy a secure Azure Container chat app.
  • Use managed identity for Azure OpenAI access.
  • Chat with an Azure OpenAI Large Language Model (LLM) using the OpenAI library.

Once you complete this article, you can start modifying the new project with your custom code and data.

Note

This article uses one or more AI app templates as the basis for the examples and guidance in the article. AI app templates provide you with well-maintained, easy-to-deploy reference implementations that help to ensure a high-quality starting point for your AI apps.

Architectural overview

The following diagram shows a simple architecture of the chat app:

Diagram showing architecture from client to backend app.

The chat app runs as an Azure Container App. The app uses managed identity via Microsoft Entra ID to authenticate with Azure OpenAI, instead of an API key. The chat app uses Azure OpenAI to generate responses to user messages.

The application architecture relies on the following services and components:

  • Azure OpenAI represents the AI provider that we send the user's queries to.
  • Azure Container Apps is the container environment where the application is hosted.
  • Managed Identity helps us ensure best-in-class security and eliminates the requirement for you as a developer to securely manage a secret.
  • Bicep files for provisioning Azure resources, including Azure OpenAI, Azure Container Apps, Azure Container Registry, Azure Log Analytics, and RBAC roles.
  • Microsoft AI Chat Protocol provides standardized API contracts across AI solutions and languages. The chat app conforms to the Microsoft AI Chat Protocol, which allows the evaluations app to run against any chat app that conforms to the protocol.
  • A Python Quart app that uses the openai package to generate responses to user messages.
  • A basic HTML/JavaScript frontend that streams responses from the backend using JSON Lines over a ReadableStream.
  • A Blazor web app that uses the Azure.AI.OpenAI NuGet package to generate responses to user messages.
  • A TypeScript web app that uses the openai npm package to generate responses to user messages.
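The JSON Lines format used by the frontend is straightforward to consume: each line of the response body is a standalone JSON document, so the client can act on every chunk as soon as it arrives. As a minimal illustration (hypothetical client code, not part of the sample):

```python
import json

def parse_json_lines(stream):
    """Parse an iterable of raw lines in JSON Lines format.

    Each non-empty line is a standalone JSON document, so each
    chunk can be handled as soon as it arrives.
    """
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# Simulated chunks as a backend might stream them, one JSON object per line.
raw = [
    b'{"delta": {"content": "Hello"}}\n',
    b'{"delta": {"content": " world"}}\n',
]
chunks = list(parse_json_lines(raw))
print("".join(c["delta"]["content"] for c in chunks))  # Hello world
```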

Cost

To keep costs as low as possible in this sample, most resources use a basic or consumption pricing tier. Alter your tier level as needed based on your intended usage. To stop incurring charges, delete the resources when you're done with the article.

Learn more about cost in the sample repo.

Prerequisites

A development container environment is available with all dependencies required to complete this article. You can run the development container in GitHub Codespaces (in a browser) or locally using Visual Studio Code.

To use this article, you need to fulfill the following prerequisites:

Open development environment

Use the following instructions to deploy a preconfigured development environment containing all required dependencies to complete this article.

GitHub Codespaces runs a development container managed by GitHub with Visual Studio Code for the Web as the user interface. For the most straightforward development environment, use GitHub Codespaces so that you have the correct developer tools and dependencies preinstalled to complete this article.

Important

All GitHub accounts can use Codespaces for up to 60 hours free each month with 2 core instances. For more information, see GitHub Codespaces monthly included storage and core hours.

Use the following steps to create a new GitHub Codespace on the main branch of the Azure-Samples/openai-chat-app-quickstart GitHub repository.

  1. Right-click on the following button, and select Open link in new window. This action allows you to have the development environment and the documentation available for review.

  2. On the Create codespace page, review and then select Create new codespace.

    Screenshot of the confirmation screen before creating a new codespace.

  3. Wait for the codespace to start. This startup process can take a few minutes.

  4. Sign in to Azure with the Azure Developer CLI in the terminal at the bottom of the screen.

    azd auth login
    
  5. Copy the code from the terminal and then paste it into a browser. Follow the instructions to authenticate with your Azure account.

The remaining tasks in this article take place in the context of this development container.

Use the following steps to create a new GitHub Codespace on the main branch of the Azure-Samples/openai-chat-app-quickstart-dotnet GitHub repository.

  1. Right-click on the following button, and select Open link in new window. This action allows you to have the development environment and the documentation available for review.

  2. On the Create codespace page, review and then select Create codespace.

    Screenshot of the confirmation screen before creating a new codespace.

  3. Wait for the codespace to start. This startup process can take a few minutes.

  4. Sign in to Azure with the Azure Developer CLI in the terminal at the bottom of the screen.

    azd auth login
    
  5. Copy the code from the terminal and then paste it into a browser. Follow the instructions to authenticate with your Azure account.

The remaining tasks in this article take place in the context of this development container.

Use the following steps to create a new GitHub Codespace on the main branch of the Azure-Samples/openai-chat-app-quickstart-javascript GitHub repository.

  1. Right-click on the following button, and select Open link in new window. This action allows you to have the development environment and the documentation available for review.

Open in GitHub Codespaces

  2. On the Create codespace page, review and then select Create new codespace.

    Screenshot of the confirmation screen before creating a new codespace.

  3. Wait for the codespace to start. This startup process can take a few minutes.

  4. Sign in to Azure with the Azure Developer CLI in the terminal at the bottom of the screen.

    azd auth login
    
  5. Copy the code from the terminal and then paste it into a browser. Follow the instructions to authenticate with your Azure account.

The remaining tasks in this article take place in the context of this development container.

Deploy and run

The sample repository contains all the code and configuration files for the chat app's Azure deployment. The following steps walk you through the process of deploying the sample chat app to Azure.

Deploy chat app to Azure

Important

Azure resources created in this section incur immediate costs. These resources may accrue costs even if you interrupt the command before it is fully executed.

  1. Run the following Azure Developer CLI command for Azure resource provisioning and source code deployment:

    azd up
    
  2. Use the following information to answer the prompts:

    • Environment name: Keep it short and lowercase. Add your name or alias. For example, secure-chat. It's used as part of the resource group name.
    • Subscription: Select the subscription to create the resources in.
    • Location (for hosting): Select a location near you from the list.
    • Location for the OpenAI model: Select a location near you from the list. If the same location as your first choice is available, select it.
  3. Wait until the app is deployed. Deployment usually takes between 5 and 10 minutes to complete.

Use chat app to ask questions to the Large Language Model

  1. The terminal displays a URL after successful application deployment.

  2. Select the URL labeled Deploying service web to open the chat application in a browser.

    Screenshot of chat app in browser showing several suggestions for chat input and the chat text box to enter a question.

  3. In the browser, enter a question such as "Why is managed identity better than keys?".

  4. The answer generated by Azure OpenAI is displayed.

Explore the sample code

While OpenAI and Azure OpenAI Service rely on a common Python client library, small code changes are needed when using Azure OpenAI endpoints. Let's see how this sample configures keyless authentication with Microsoft Entra ID and communicates with Azure OpenAI.

Configure authentication with managed identity

In this sample, the src\quartapp\chat.py file begins by configuring keyless authentication.

The file uses the azure.identity.aio module to create an asynchronous Microsoft Entra authentication flow.

The following code snippet uses the AZURE_CLIENT_ID azd environment variable to create a ManagedIdentityCredential instance capable of authenticating via a user-assigned managed identity.

user_assigned_managed_identity_credential = ManagedIdentityCredential(client_id=os.getenv("AZURE_CLIENT_ID")) 

Note

The azd resource environment variables are provisioned during azd app deployment.

The following code snippet uses the AZURE_TENANT_ID azd environment variable to create an AzureDeveloperCliCredential instance capable of authenticating with the current Microsoft Entra tenant.

azure_dev_cli_credential = AzureDeveloperCliCredential(tenant_id=os.getenv("AZURE_TENANT_ID"), process_timeout=60)  

The Azure Identity client library provides credentials—public classes that implement the Azure Core library's TokenCredential protocol. A credential represents a distinct authentication flow for acquiring an access token from Microsoft Entra ID. These credentials can be chained together to form an ordered sequence of authentication mechanisms to be attempted.

The following snippet creates a ChainedTokenCredential using a ManagedIdentityCredential and an AzureDeveloperCliCredential:

  • The ManagedIdentityCredential is used for Azure Functions and Azure App Service. A user-assigned managed identity is supported by passing the client_id to ManagedIdentityCredential.
  • The AzureDeveloperCliCredential is used for local development. It was set previously based on the Microsoft Entra tenant to use.
azure_credential = ChainedTokenCredential(
    user_assigned_managed_identity_credential,
    azure_dev_cli_credential
)

Tip

The order of the credentials is important, as the first valid Microsoft Entra access token is used. For more information, check out the ChainedTokenCredential Overview article.
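As a toy illustration of why that order matters, the following sketch mimics the fallback behavior (this is not the azure-identity implementation, which adds caching and richer error handling):

```python
class ToyChainedCredential:
    """Toy sketch of chained-credential fallback behavior."""

    def __init__(self, *credentials):
        self.credentials = credentials

    def get_token(self, scope):
        errors = []
        for credential in self.credentials:
            try:
                # The first credential that succeeds wins, which is why
                # the production flow (managed identity) goes first.
                return credential.get_token(scope)
            except Exception as exc:
                errors.append(str(exc))
        raise RuntimeError("All credentials failed: " + "; ".join(errors))

class UnavailableCredential:
    # Stand-in for a flow that doesn't apply in this environment,
    # such as managed identity on a local dev machine.
    def get_token(self, scope):
        raise RuntimeError("not available in this environment")

class StaticCredential:
    # Stand-in for a flow that succeeds, such as the azd credential.
    def __init__(self, token):
        self.token = token

    def get_token(self, scope):
        return self.token

# Managed identity fails locally, so the chain falls back to the
# developer credential.
chain = ToyChainedCredential(UnavailableCredential(), StaticCredential("dev-token"))
print(chain.get_token("https://cognitiveservices.azure.com/.default"))  # dev-token
```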

The following code snippet gets the Azure OpenAI token provider based on the selected Azure credential. This value is obtained by calling azure.identity.aio.get_bearer_token_provider with two arguments:

  • azure_credential: The ChainedTokenCredential instance created earlier to authenticate the request.

  • https://cognitiveservices.azure.com/.default: The required bearer token scope. In this case, the scope for the Azure Cognitive Services endpoint.

token_provider = get_bearer_token_provider(
    azure_credential, "https://cognitiveservices.azure.com/.default"
)

The following lines check for the required AZURE_OPENAI_ENDPOINT and AZURE_OPENAI_CHATGPT_DEPLOYMENT azd resource environment variables, which are provisioned during azd app deployment. An error is thrown if a value isn't present.

if not os.getenv("AZURE_OPENAI_ENDPOINT"):
    raise ValueError("AZURE_OPENAI_ENDPOINT is required for Azure OpenAI")
if not os.getenv("AZURE_OPENAI_CHATGPT_DEPLOYMENT"):
    raise ValueError("AZURE_OPENAI_CHATGPT_DEPLOYMENT is required for Azure OpenAI")

This snippet initializes the Azure OpenAI client, setting the api_version, azure_endpoint, and azure_ad_token_provider (client_args) parameters:

bp.openai_client = AsyncAzureOpenAI(
    api_version=os.getenv("AZURE_OPENAI_API_VERSION") or "2024-02-15-preview",
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    azure_ad_token_provider=token_provider,
)  

The following line sets the Azure OpenAI model deployment name for use in API calls:

bp.openai_model = os.getenv("AZURE_OPENAI_CHATGPT_DEPLOYMENT")

Note

OpenAI uses the model keyword argument to specify what model to use. Azure OpenAI has the concept of unique model deployments. When you use Azure OpenAI, model should refer to the underlying deployment name chosen during Azure OpenAI model deployment.

Once this function completes, the client is properly configured and ready to interact with Azure OpenAI services.

Response stream using the OpenAI Client and model

The response_stream function handles the chat completion call in the route. The following code snippet shows how openai_client and model are used.

async def response_stream():
    # This sends all messages, so the API request may exceed token limits
    all_messages = [
        {"role": "system", "content": "You are a helpful assistant."},
    ] + request_messages

    chat_coroutine = bp.openai_client.chat.completions.create(
        # Azure OpenAI takes the deployment name as the model name
        model=bp.openai_model,
        messages=all_messages,
        stream=True,
    )
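The chunks returned by the streaming call are then serialized to the client one JSON document per line. The following simplified sketch (not the sample's exact code) uses plain dictionaries in place of the SDK's chunk objects to show that serialization step:

```python
import asyncio
import json

async def stream_as_json_lines(chunks):
    # Serialize each streamed chunk as one JSON document per line,
    # mirroring how the route yields data to the frontend.
    async for chunk in chunks:
        yield json.dumps(chunk) + "\n"

async def fake_chunks():
    # Stand-ins for the chunk objects the OpenAI SDK streams back.
    yield {"choices": [{"delta": {"content": "Hi"}}]}
    yield {"choices": [{"delta": {"content": " there"}}]}

async def main():
    async for line in stream_as_json_lines(fake_chunks()):
        print(line, end="")

asyncio.run(main())
```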

Explore the sample code

.NET applications rely on the Azure.AI.OpenAI client library to communicate with Azure OpenAI services, which takes a dependency on the OpenAI library. The sample app configures keyless authentication using Microsoft Entra ID to communicate with Azure OpenAI.

Configure authentication and service registration

In this sample, keyless authentication is configured in the program.cs file. The following code snippet uses the AZURE_CLIENT_ID environment variable set by azd to create a ManagedIdentityCredential instance capable of authenticating via user-assigned managed identity.

var userAssignedIdentityCredential = 
    new ManagedIdentityCredential(builder.Configuration.GetValue<string>("AZURE_CLIENT_ID"));

Note

The azd resource environment variables are provisioned during azd app deployment.

The following code snippet uses the AZURE_TENANT_ID environment variable set by azd to create an AzureDeveloperCliCredential instance capable of authenticating locally using the account signed-in to azd.

var azureDevCliCredential = new AzureDeveloperCliCredential(
    new AzureDeveloperCliCredentialOptions()
    { 
        TenantId = builder.Configuration.GetValue<string>("AZURE_TENANT_ID") 
    });

The Azure Identity client library provides credential classes that implement the Azure Core library's TokenCredential protocol. A credential represents a distinct authentication flow for acquiring an access token from Microsoft Entra ID. These credentials can be chained together using ChainedTokenCredential to form an ordered sequence of authentication mechanisms to be attempted.

The following snippet registers the AzureOpenAIClient for dependency injection and creates a ChainedTokenCredential using a ManagedIdentityCredential and an AzureDeveloperCliCredential:

  • The ManagedIdentityCredential is used for Azure Functions and Azure App Service. A user-assigned managed identity is supported using the AZURE_CLIENT_ID that was provided to the ManagedIdentityCredential.
  • The AzureDeveloperCliCredential is used for local development. It was set previously based on the Microsoft Entra tenant to use.
builder.Services.AddAzureClients(
    clientBuilder => {
        clientBuilder.AddClient<AzureOpenAIClient, AzureOpenAIClientOptions>((options, _, _)
            => new AzureOpenAIClient(
                new Uri(endpoint),
                new ChainedTokenCredential(
                    userAssignedIdentityCredential, azureDevCliCredential), options));
    });

Tip

The order of the credentials is important, as the first valid Microsoft Entra access token is used. For more information, check out the ChainedTokenCredential Overview article.

Get chat completions using the Azure OpenAI client

The Blazor web app injects the registered AzureOpenAIClient at the top of the Home.razor component:

@inject AzureOpenAIClient azureOpenAIClient

When the user submits the form, the AzureOpenAIClient sends their prompt to the OpenAI model to generate a completion:

ChatClient chatClient = azureOpenAIClient.GetChatClient("gpt-4o-mini");

messages.Add(new UserChatMessage(model.UserMessage));

ChatCompletion completion = await chatClient.CompleteChatAsync(messages);
messages.Add(new SystemChatMessage(completion.Content[0].Text));

Explore the sample code

While OpenAI and Azure OpenAI Service rely on a common JavaScript client library (openai), small code changes are needed when using Azure OpenAI endpoints. Let's see how this sample configures keyless authentication with Microsoft Entra ID and communicates with Azure OpenAI.

Keyless authentication for each environment

The Azure Identity client library provides credential classes that implement the Azure Core library's TokenCredential protocol. A credential represents a distinct authentication flow for acquiring an access token from Microsoft Entra ID. These credentials can be chained together using ChainedTokenCredential to form an ordered sequence of authentication mechanisms to be attempted. This allows you to deploy the same code in both production and local development environments.

Diagram showing the two credentials in the flow where the managed identity is tried first then the default Azure credential is tried.

Configure authentication with managed identity

In this sample, the ./src/azure-authentication.ts file provides several functions for keyless authentication to Azure OpenAI.

The first function, getChainedCredential(), returns the first valid Azure credential found in the chain.

function getChainedCredential() {
    return new ChainedTokenCredential(
        new ManagedIdentityCredential(process.env.AZURE_CLIENT_ID!),
        new AzureDeveloperCliCredential({
            tenantId: process.env.AZURE_TENANT_ID ? process.env.AZURE_TENANT_ID : undefined
        })
    );
}

  • ManagedIdentityCredential is attempted first. It's set up with the AZURE_CLIENT_ID environment variable in the production runtime and is capable of authenticating via user-assigned managed identity.
  • AzureDeveloperCliCredential is attempted second. It's set up when a developer signs in with the Azure Developer CLI by running azd auth login.

Tip

The order of the credentials is important, as the first valid Microsoft Entra access token is used. For more information, check out the ChainedTokenCredential Overview article.

Get bearer token for OpenAI

The second function in ./src/azure-authentication.ts is getTokenProvider(), which returns a callback that provides a bearer token scoped to the Azure Cognitive Services endpoint.

function getTokenProvider(): () => Promise<string> {
    const credential = getChainedCredential();
    const scope = "https://cognitiveservices.azure.com/.default";
    return getBearerTokenProvider(credential, scope);
}

The preceding code snippet uses getBearerTokenProvider to take the credential and the scope, then returns a callback that provides a bearer token.
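Conceptually, a bearer token provider is just a zero-argument callable that binds a credential to a fixed scope. A toy sketch of that pattern (shown in Python for brevity; not the actual library implementation):

```python
def make_token_provider(credential, scope):
    """Bind a credential and a scope into a zero-argument callable.

    The client can invoke the callable whenever it needs a bearer
    token, so tokens can be refreshed transparently.
    """
    def provider():
        return credential.get_token(scope)
    return provider

class FakeCredential:
    # Stand-in for a real credential; returns a predictable token.
    def get_token(self, scope):
        return f"token-for-{scope}"

provider = make_token_provider(
    FakeCredential(), "https://cognitiveservices.azure.com/.default"
)
print(provider())  # token-for-https://cognitiveservices.azure.com/.default
```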

Create authenticated Azure OpenAI client

The third function in ./src/azure-authentication.ts is getOpenAiClient(), which returns the Azure OpenAI client.

export function getOpenAiClient(): AzureOpenAI | undefined {
    try {
        if (!process.env.AZURE_OPENAI_ENDPOINT) {
            throw new Error("AZURE_OPENAI_ENDPOINT is required for Azure OpenAI");
        }
        if (!process.env.AZURE_OPENAI_CHAT_DEPLOYMENT) {
            throw new Error("AZURE_OPENAI_CHAT_DEPLOYMENT is required for Azure OpenAI");
        }

        const options = { 
            azureADTokenProvider: getTokenProvider(), 
            deployment: process.env.AZURE_OPENAI_CHAT_DEPLOYMENT!, 
            apiVersion: process.env.AZURE_OPENAI_API_VERSION! || "2024-02-15-preview",
            endpoint: process.env.AZURE_OPENAI_ENDPOINT!
        }

        // Create the Asynchronous Azure OpenAI client
        return new AzureOpenAI(options);

    } catch (error) {
        console.error('Error getting Azure OpenAI client: ', error);
    }
}

This code takes the options, including the correctly scoped token provider, and creates the AzureOpenAI client.

Stream chat answer with Azure OpenAI

Use the following Fastify route handler in ./src/openai-chat-api.ts to send a message to Azure OpenAI and stream the response.

import { FastifyReply, FastifyRequest } from 'fastify';
import { AzureOpenAI } from "openai";
import { getOpenAiClient } from './azure-authentication.js';
import { ChatCompletionChunk, ChatCompletionMessageParam } from 'openai/resources/chat/completions';

interface ChatRequestBody {
    messages: ChatCompletionMessageParam [];
  }

export async function chatRoute (request: FastifyRequest<{ Body: ChatRequestBody }>, reply: FastifyReply) {

    const requestMessages: ChatCompletionMessageParam[] = request?.body?.messages;
    const openaiClient: AzureOpenAI | undefined = getOpenAiClient();

    if (!openaiClient) {
      throw new Error("Azure OpenAI client is not configured");
    }

    const allMessages = [
      { role: "system", content: "You are a helpful assistant."},
      ...requestMessages
    ] as ChatCompletionMessageParam [];

    const chatCompletionChunks = await openaiClient.chat.completions.create({
      // Azure OpenAI takes the deployment name as the model name
      model: process.env.AZURE_OPENAI_CHAT_DEPLOYMENT_MODEL || "gpt-4o-mini",
      messages: allMessages,
      stream: true

    })
    reply.raw.setHeader('Content-Type', 'text/html; charset=utf-8');
    reply.raw.setHeader('Cache-Control', 'no-cache');
    reply.raw.setHeader('Connection', 'keep-alive');
    reply.raw.flushHeaders();

    for await (const chunk of chatCompletionChunks as AsyncIterable<ChatCompletionChunk>) {
      for (const choice of chunk.choices) {
        reply.raw.write(JSON.stringify(choice) + "\n")
      }
    }

    reply.raw.end()

}

The function gets the chat conversation, including any previous messages, and sends it to Azure OpenAI. As the stream chunks are returned from Azure OpenAI, they're sent to the client.

Other security considerations

This article demonstrates how the sample uses ChainedTokenCredential for authenticating to the Azure OpenAI service.

The sample also has a GitHub Action that scans the infrastructure-as-code files and generates a report of any detected issues. To ensure continued best practices in your own repository, we recommend enabling GitHub's secret scanning setting for any solution built on these templates.

Consider other security measures, such as:

Clean up resources

Clean up Azure resources

The Azure resources created in this article are billed to your Azure subscription. If you don't expect to need these resources in the future, delete them to avoid incurring more charges.

To delete the Azure resources and remove the source code, run the following Azure Developer CLI command:

azd down --purge

Clean up GitHub Codespaces

Deleting the GitHub Codespaces environment ensures that you can maximize your account's free per-core-hours entitlement.

Important

For more information about your GitHub account's entitlements, see GitHub Codespaces monthly included storage and core hours.

  1. Sign into the GitHub Codespaces dashboard (https://github.com/codespaces).

  2. Locate your currently running Codespaces sourced from the Azure-Samples/openai-chat-app-quickstart GitHub repository.

  3. Open the context menu for the codespace and then select Delete.

  1. Sign into the GitHub Codespaces dashboard (https://github.com/codespaces).

  2. Locate your currently running Codespaces sourced from the Azure-Samples/openai-chat-app-quickstart-dotnet GitHub repository.

  3. Open the context menu for the codespace and then select Delete.

  1. Sign into the GitHub Codespaces dashboard (https://github.com/codespaces).

  2. Locate your currently running Codespaces sourced from the Azure-Samples/openai-chat-app-quickstart-javascript GitHub repository.

  3. Open the context menu for the codespace and then select Delete.

Get help

If your issue isn't addressed, log your issue to the repository's Issues.
