
Create and deploy an Azure OpenAI in Microsoft Foundry Models resource


Note

This document refers to the Microsoft Foundry (classic) portal.

View the Microsoft Foundry (new) documentation to learn about the new portal.


This article describes how to get started with Azure OpenAI and provides step-by-step instructions to create a resource and deploy a model. You can create resources in Azure in several different ways:

  • The Azure portal
  • The REST APIs, the Azure CLI, PowerShell, or client libraries
  • Azure Resource Manager (ARM) templates

In this article, you review examples for creating and deploying resources in the Azure portal, with the Azure CLI, and with PowerShell.

Prerequisites

Create a resource

The following steps show how to create an Azure OpenAI resource in the Azure portal.

Identify the resource

  1. Sign in with your Azure subscription in the Azure portal.

  2. Select Create a resource and search for Azure OpenAI. When you locate the service, select Create.

    Screenshot that shows how to create a new Azure OpenAI in Microsoft Foundry Models resource in the Azure portal.

  3. On the Create Azure OpenAI page, provide the following information for the fields on the Basics tab:

    • Subscription: The Azure subscription used in your Azure OpenAI onboarding application.
    • Resource group: The Azure resource group to contain your Azure OpenAI resource. You can create a new group or use a pre-existing group.
    • Region: The location of your instance. Different locations can introduce latency, but they don't affect the runtime availability of your resource.
    • Name: A descriptive name for your Azure OpenAI resource, such as MyOpenAIResource.
    • Pricing Tier: The pricing tier for the resource. Currently, only the Standard tier is available for Azure OpenAI. For more information on pricing, visit the Azure OpenAI pricing page.

    Screenshot that shows how to configure an Azure OpenAI resource in the Azure portal.

  4. Select Next.

Configure network security

The Network tab presents three options for the security type:

  • Option 1: All networks, including the internet, can access this resource.
  • Option 2: Selected networks. Configure network security for your Foundry Tools resource.
  • Option 3: Disabled. No networks can access this resource. You can configure private endpoint connections as the exclusive way to access this resource.

Screenshot that shows the network security options for an Azure OpenAI resource in the Azure portal.

Depending on the option you select, you might need to provide additional information.

Option 1: Allow all networks

The first option allows all networks, including the internet, to access your resource. This option is the default setting. No extra settings are required for this option.

Option 2: Allow specific networks only

The second option lets you identify specific networks that can access your resource. When you select this option, the page updates to include the following required fields:

  • Virtual network: Specify the virtual networks that are permitted access to your resource. You can edit the default virtual network name in the Azure portal.
  • Subnets: Specify the subnets that are permitted access to your resource. You can edit the default subnet name in the Azure portal.

Screenshot that shows how to configure network security for an Azure OpenAI resource to allow specific networks only.

The Firewall section provides an optional Address range field that you can use to configure firewall settings for the resource.

Option 3: Disable network access

The third option lets you disable network access to your resource. When you select this option, the page updates to include the Private endpoint table.

Screenshot that shows how to disable network security for an Azure OpenAI resource in the Azure portal.

As an option, you can add a private endpoint for access to your resource. Select Add private endpoint, and complete the endpoint configuration.

Confirm the configuration and create the resource

  1. Select Next and configure any Tags for your resource, as desired.

  2. Select Next to move to the final stage in the process: Review + submit.

  3. Confirm your configuration settings, and select Create.

  4. The Azure portal displays a notification when the new resource is available. Select Go to resource.

    Screenshot showing the Go to resource button in the Azure portal.

Deploy a model

Before you can generate text or run inference, you need to deploy a model. You can select from several available models in the Foundry portal.

To deploy a model, follow these steps:

  1. Sign in to Microsoft Foundry. Make sure the New Foundry toggle is off. These steps refer to Foundry (classic).

  2. In the Keep building with Foundry section, select View all resources.

  3. Find and select your resource.

    Important

    At this step, you might be prompted to upgrade your Azure OpenAI resource to Foundry. For a comparison of the two resource types and details on resource upgrade and rollback, see this page. Select Cancel to proceed without upgrading the resource type. Alternately, select Next.

    For more information about the Foundry resource, see this article.

  4. Select Deployments from the Shared resources section in the left pane. (If you upgraded to Foundry in the previous step, select Models + endpoints from the My assets section in the left pane.)

  5. Select + Deploy model > Deploy base model to open the deployment window.

  6. Select the desired model, and then select Confirm. For a list of available models per region, see Model summary table and region availability.

  7. In the next window, configure the following fields:

    • Deployment name: Choose a name carefully. The deployment name is used in your code to call the model by using the client libraries and the REST APIs.
    • Deployment type: Standard, Global-Batch, Global-Standard, or Provisioned-Managed. Learn more about deployment type options.
    • Deployment details (Optional): You can set optional advanced settings, as needed for your resource.
      - For the Content Filter, assign a content filter to your deployment.
      - For the Tokens per Minute Rate Limit, adjust the Tokens per Minute (TPM) value to set the effective rate limit for your deployment. You can modify this value at any time by using the Quotas menu. Dynamic Quota allows you to take advantage of more quota when extra capacity is available.

    Important

    When you access the model via the API, you need to refer to the deployment name rather than the underlying model name in API calls, which is one of the key differences between OpenAI and Azure OpenAI. OpenAI requires only the model name; Azure OpenAI always requires the deployment name, even when you use the model parameter. In our documentation, we often have examples where deployment names are identical to model names, to help indicate which model works with a particular API endpoint. Ultimately, your deployment names can follow whatever naming convention is best for your use case.

  8. Select Deploy.

  9. The deployment Details page shows all the information about your new deployment. When the deployment completes, your model's Provisioning state changes to Succeeded.
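The note in step 7 can be sketched in code. The example below composes a chat-completions request URL the way Azure OpenAI expects it, with the deployment name in the URL path. The resource endpoint, deployment name, and API version shown are hypothetical placeholders; substitute your own.

```shell
# Hypothetical values: substitute your own resource endpoint, deployment name,
# and a currently supported API version.
ENDPOINT="https://myopenairesource.openai.azure.com"
DEPLOYMENT="MyModel"        # the deployment name, not the underlying model name
API_VERSION="2024-10-21"

# Azure OpenAI embeds the deployment name in the URL path;
# OpenAI's own API takes the model name in the request body instead.
URL="${ENDPOINT}/openai/deployments/${DEPLOYMENT}/chat/completions?api-version=${API_VERSION}"
echo "$URL"
```

Renaming the deployment changes only this URL; your prompt payload stays the same.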

Prerequisites

Sign in to the Azure CLI

Sign in to the Azure CLI, or select Open Cloudshell in the following steps.

Create an Azure resource group

To create an Azure OpenAI resource, you need an Azure resource group. When you create a new resource through the Azure CLI, you can also create a new resource group or instruct Azure to use an existing group. The following example shows how to create a new resource group named OAIResourceGroup with the az group create command. The resource group is created in the East US location.

az group create \
  --name OAIResourceGroup \
  --location eastus

Create a resource

Use the az cognitiveservices account create command to create an Azure OpenAI resource in the resource group. In the following example, you create a resource named MyOpenAIResource in the OAIResourceGroup resource group. When you try the example, update the code to use your desired values for the resource group and resource name, along with your Azure subscription ID <subscriptionID>.

az cognitiveservices account create \
  --name MyOpenAIResource \
  --resource-group OAIResourceGroup \
  --location eastus \
  --kind OpenAI \
  --sku s0 \
  --subscription <subscriptionID> \
  --custom-domain MyOpenAIResource \
  --yes

Retrieve information about the resource

After you create the resource, you can use different commands to find useful information about your Azure OpenAI in Microsoft Foundry Models instance. The following examples demonstrate how to retrieve the REST API endpoint base URL and the access keys for the new resource.

Get the endpoint URL

Use the az cognitiveservices account show command to retrieve the REST API endpoint base URL for the resource. In this example, we direct the command output through the jq JSON processor to locate the .properties.endpoint value.

When you try the example, update the code to use your values for the resource group <myResourceGroupName> and resource <myResourceName>.

az cognitiveservices account show \
  --name <myResourceName> \
  --resource-group <myResourceGroupName> \
  | jq -r .properties.endpoint

Get the primary API key

To retrieve the access keys for the resource, use the az cognitiveservices account keys list command. In this example, we direct the command output through the jq JSON processor to locate the .key1 value.

When you try the example, update the code to use your values for the resource group and resource.

az cognitiveservices account keys list \
  --name <myResourceName> \
  --resource-group <myResourceGroupName> \
  | jq -r .key1
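As a sketch of how the two values retrieved above fit together, the following example substitutes placeholder values for the endpoint and key and composes a chat-completions request against a hypothetical deployment named MyModel. The curl command is echoed as a dry run; remove the echo to actually send the request.

```shell
# Placeholder values standing in for the endpoint and key1 retrieved above.
ENDPOINT="https://myopenairesource.openai.azure.com/"   # endpoint values typically end with a slash
API_KEY="<your-key1-value>"

# Compose a chat-completions request against a hypothetical deployment named MyModel.
# Echoed here as a dry run; remove 'echo' to send the request.
echo curl -s "${ENDPOINT}openai/deployments/MyModel/chat/completions?api-version=2024-10-21" \
  -H "Content-Type: application/json" \
  -H "api-key: ${API_KEY}" \
  -d '{"messages":[{"role":"user","content":"Hello"}]}'
```

Note that the key travels in the api-key header, while the deployment name lives in the URL path rather than the request body.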

Deploy a model

To deploy a model, use the az cognitiveservices account deployment create command. In the following example, you deploy an instance of the gpt-4o model and give it the name MyModel. When you try the example, update the code to use your values for the resource group and resource. You don't need to change the model-version, model-format, sku-capacity, or sku-name values.

az cognitiveservices account deployment create \
  --name <myResourceName> \
  --resource-group <myResourceGroupName> \
  --deployment-name MyModel \
  --model-name gpt-4o \
  --model-version "2024-11-20" \
  --model-format OpenAI \
  --sku-capacity "1" \
  --sku-name "Standard"

The --sku-name parameter accepts the following deployment types: Standard, GlobalBatch, GlobalStandard, and ProvisionedManaged. Learn more about deployment type options.

Important

When you access the model via the API, you need to refer to the deployment name rather than the underlying model name in API calls, which is one of the key differences between OpenAI and Azure OpenAI. OpenAI requires only the model name; Azure OpenAI always requires the deployment name, even when you use the model parameter. In our documentation, we often have examples where deployment names are identical to model names, to help indicate which model works with a particular API endpoint. Ultimately, your deployment names can follow whatever naming convention is best for your use case.

Delete a model from your resource

You can delete any model deployed from your resource with the az cognitiveservices account deployment delete command. In the following example, you delete a model named MyModel. When you try the example, update the code to use your values for the resource group, resource, and deployed model.

az cognitiveservices account deployment delete \
  --name <myResourceName> \
  --resource-group <myResourceGroupName> \
  --deployment-name MyModel

Delete a resource

If you want to clean up after these exercises, you can remove your Azure OpenAI resource by deleting the resource through the Azure CLI. You can also delete the resource group. If you choose to delete the resource group, all resources contained in the group are also deleted.

To remove the resource, use the az cognitiveservices account delete command.

If you're not going to continue to use the resources created in these exercises, run the following command to delete your resource. Be sure to update the example code to use your values for the resource group and resource.

az cognitiveservices account delete \
  --name <myResourceName> \
  --resource-group <myResourceGroupName>

Prerequisites

Sign in to Azure PowerShell

Sign in to Azure PowerShell, or select Open Cloudshell in the following steps.

Create an Azure resource group

To create an Azure OpenAI resource, you need an Azure resource group. When you create a new resource through Azure PowerShell, you can also create a new resource group or instruct Azure to use an existing group. The following example shows how to create a new resource group named OAIResourceGroup with the New-AzResourceGroup command. The resource group is created in the East US location.

New-AzResourceGroup -Name OAIResourceGroup -Location eastus

Create a resource

Use the New-AzCognitiveServicesAccount command to create an Azure OpenAI resource in the resource group. In the following example, you create a resource named MyOpenAIResource in the OAIResourceGroup resource group. When you try the example, update the code to use your desired values for the resource group and resource name.

New-AzCognitiveServicesAccount -ResourceGroupName OAIResourceGroup -Name MyOpenAIResource -Type OpenAI -SkuName S0 -Location eastus

Retrieve information about the resource

After you create the resource, you can use different commands to find useful information about your Azure OpenAI in Microsoft Foundry Models instance. The following examples demonstrate how to retrieve the REST API endpoint base URL and the access keys for the new resource.

Get the endpoint URL

Use the Get-AzCognitiveServicesAccount command to retrieve the REST API endpoint base URL for the resource. In this example, we direct the command output through the Select-Object cmdlet to locate the endpoint value.

When you try the example, update the code to use your values for the resource group and resource.

Get-AzCognitiveServicesAccount -ResourceGroupName OAIResourceGroup -Name MyOpenAIResource | Select-Object -Property endpoint

Get the primary API key

To retrieve the access keys for the resource, use the Get-AzCognitiveServicesAccountKey command. In this example, we direct the command output through the Select-Object cmdlet to locate the Key1 value.

When you try the example, update the code to use your values for the resource group and resource.

Get-AzCognitiveServicesAccountKey -Name MyOpenAIResource -ResourceGroupName OAIResourceGroup | Select-Object -Property Key1

Deploy a model

To deploy a model, use the New-AzCognitiveServicesAccountDeployment command. In the following example, you deploy an instance of the gpt-4o model and give it the name MyModel. When you try the example, update the code to use your values for the resource group and resource. You don't need to change the model version, model format, SKU capacity, or SKU name values.

$model = New-Object -TypeName 'Microsoft.Azure.Management.CognitiveServices.Models.DeploymentModel' -Property @{
    Name = 'gpt-4o'
    Version = '2024-11-20'
    Format = 'OpenAI'
}
$properties = New-Object -TypeName 'Microsoft.Azure.Management.CognitiveServices.Models.DeploymentProperties' -Property @{
    Model = $model
}
$sku = New-Object -TypeName "Microsoft.Azure.Management.CognitiveServices.Models.Sku" -Property @{
    Name = 'Standard'
    Capacity = '1'
}
New-AzCognitiveServicesAccountDeployment -ResourceGroupName OAIResourceGroup -AccountName MyOpenAIResource -Name MyModel -Properties $properties -Sku $sku

The Name property of the $sku variable accepts the following deployment types: Standard, GlobalBatch, GlobalStandard, and ProvisionedManaged. Learn more about deployment type options.

Important

When you access the model via the API, you need to refer to the deployment name rather than the underlying model name in API calls, which is one of the key differences between OpenAI and Azure OpenAI. OpenAI requires only the model name; Azure OpenAI always requires the deployment name, even when you use the model parameter. In our documentation, we often have examples where deployment names are identical to model names, to help indicate which model works with a particular API endpoint. Ultimately, your deployment names can follow whatever naming convention is best for your use case.

Delete a model from your resource

You can delete any model deployed from your resource with the Remove-AzCognitiveServicesAccountDeployment command. In the following example, you delete a model named MyModel. When you try the example, update the code to use your values for the resource group, resource, and deployed model.

Remove-AzCognitiveServicesAccountDeployment -ResourceGroupName OAIResourceGroup -AccountName MyOpenAIResource -Name MyModel

Delete a resource

If you want to clean up after these exercises, you can remove your Azure OpenAI resource by deleting the resource through Azure PowerShell. You can also delete the resource group. If you choose to delete the resource group, all resources contained in the group are also deleted.

To remove the resource, use the Remove-AzCognitiveServicesAccount command.

If you're not going to continue to use the resources created in these exercises, run the following command to delete your resource. Be sure to update the example code to use your values for the resource group and resource.

Remove-AzCognitiveServicesAccount -Name MyOpenAIResource -ResourceGroupName OAIResourceGroup
