Quickstart: Generate a video with Sora (preview)


In this quickstart, you generate video clips using the Azure OpenAI service. The example uses the Sora model, a video generation model that creates realistic and imaginative video scenes from text instructions, optionally combined with image or video inputs. This guide shows you how to create a video generation job, poll for its status, and retrieve the generated video.

For more information on video generation, see Video generation concepts.

Prerequisites

Microsoft Entra ID prerequisites

For the recommended keyless authentication with Microsoft Entra ID, you need to:

  • Install the Azure CLI, used for keyless authentication with Microsoft Entra ID.
  • Assign the Cognitive Services User role to your user account. You can assign roles in the Azure portal under Access control (IAM) > Add role assignment.

Set up

  1. Create a new folder video-generation-quickstart and go to the quickstart folder with the following command:

    mkdir video-generation-quickstart && cd video-generation-quickstart
  2. Create a virtual environment. If you already have Python 3.10 or higher installed, you can create a virtual environment using the following commands:

    py -3 -m venv .venv
    .venv\scripts\activate

    Activating the Python environment means that when you run python or pip from the command line, you use the Python interpreter contained in the .venv folder of your application. You can use the deactivate command to exit the Python virtual environment and later reactivate it when needed.

    Tip

    We recommend that you create and activate a new Python environment to install the packages you need for this tutorial. Don't install packages into your global Python installation. Always use a virtual or conda environment when installing Python packages; otherwise you can break your global installation of Python.

  3. For the recommended keyless authentication with Microsoft Entra ID, install the azure-identity package with:

    pip install azure-identity

Retrieve resource information

You need to retrieve the following information to authenticate your application with your Azure OpenAI resource:

  • AZURE_OPENAI_ENDPOINT: This value can be found in the Keys and Endpoint section when examining your resource from the Azure portal.
  • AZURE_OPENAI_DEPLOYMENT_NAME: This value corresponds to the custom name you chose for your deployment when you deployed a model. You can find it under Resource Management > Model Deployments in the Azure portal.

Learn more about keyless authentication and setting environment variables.
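Both values above are typically read from environment variables. As an illustration (not part of the official sample), a small helper can fail fast with a clear message when a variable is missing, which makes configuration mistakes easier to diagnose:

```python
import os

def get_required_env(name: str) -> str:
    """Return the named environment variable's value, or raise a clear error.

    Illustrative helper; variable names below match this quickstart's table.
    """
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# Example usage (uncomment once the variables are set):
# endpoint = get_required_env("AZURE_OPENAI_ENDPOINT")
# deployment = get_required_env("AZURE_OPENAI_DEPLOYMENT_NAME")
```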

Generate video with Sora

You can generate a video with the Sora model by creating a video generation job, polling for its status, and retrieving the generated video. The following code shows how to do this via the REST API using Python.
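The create/poll/retrieve flow described above can be factored into a generic polling helper. The sketch below is illustrative (timeout and interval values are assumptions, not service limits); it accepts any callable that returns the job's current status string, which keeps the polling logic separate from the HTTP calls:

```python
import time

# Terminal statuses for a video generation job, per this quickstart.
TERMINAL_STATUSES = {"succeeded", "failed", "cancelled"}

def poll_until_done(fetch_status, interval: float = 5, timeout: float = 600) -> str:
    """Call fetch_status() until it returns a terminal status or timeout expires.

    fetch_status: any zero-argument callable returning a status string.
    Illustrative sketch; the REST calls themselves appear later in this guide.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status()
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(interval)
    raise TimeoutError("Video generation job did not finish in time")

# Demo with a fake status source instead of a live HTTP call:
statuses = iter(["queued", "running", "succeeded"])
print(poll_until_done(lambda: next(statuses), interval=0))  # prints "succeeded"
```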

  1. Create the sora-quickstart.py file and add the following code to authenticate your resource:

    import requests
    import base64
    import os
    import time  # needed for time.sleep in the polling loop below
    from azure.identity import DefaultAzureCredential

    # Set environment variables or edit the corresponding values here.
    endpoint = os.environ['AZURE_OPENAI_ENDPOINT']

    # Keyless authentication
    credential = DefaultAzureCredential()
    token = credential.get_token("https://cognitiveservices.azure.com/.default")

    api_version = 'preview'
    headers = {
        "Authorization": f"Bearer {token.token}",
        "Content-Type": "application/json"
    }
  2. Create the video generation job. You can create it from a text prompt only, or from an input image and text prompt.

    # 1. Create a video generation job
    create_url = f"{endpoint}/openai/v1/video/generations/jobs?api-version={api_version}"
    body = {
        "prompt": "A cat playing piano in a jazz bar.",
        "width": 480,
        "height": 480,
        "n_seconds": 5,
        "model": "sora"
    }
    response = requests.post(create_url, headers=headers, json=body)
    response.raise_for_status()
    print("Full response JSON:", response.json())
    job_id = response.json()["id"]
    print(f"Job created: {job_id}")

    # 2. Poll for job status
    status_url = f"{endpoint}/openai/v1/video/generations/jobs/{job_id}?api-version={api_version}"
    status = None
    while status not in ("succeeded", "failed", "cancelled"):
        time.sleep(5)  # Wait before polling again
        status_response = requests.get(status_url, headers=headers).json()
        status = status_response.get("status")
        print(f"Job status: {status}")

    # 3. Retrieve generated video
    if status == "succeeded":
        generations = status_response.get("generations", [])
        if generations:
            print("✅ Video generation succeeded.")
            generation_id = generations[0].get("id")
            video_url = f"{endpoint}/openai/v1/video/generations/{generation_id}/content/video?api-version={api_version}"
            video_response = requests.get(video_url, headers=headers)
            if video_response.ok:
                output_filename = "output.mp4"
                with open(output_filename, "wb") as file:
                    file.write(video_response.content)
                print(f'Generated video saved as "{output_filename}"')
        else:
            raise Exception("No generations found in job result.")
    else:
        raise Exception(f"Job didn't succeed. Status: {status}")
  3. Run the Python file.

    python sora-quickstart.py

    Wait a few moments to get the response.
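If you experiment with different prompts or dimensions, it can help to assemble the request body in one place with basic validation. This is an illustrative sketch only: the field names match the body shown in the steps above, but the allowed values depend on your deployment and the service's documented limits.

```python
def build_video_request(prompt: str, width: int = 480, height: int = 480,
                        n_seconds: int = 5, model: str = "sora") -> dict:
    """Assemble the JSON body for the job-creation request.

    Illustrative helper; field names follow the quickstart's sample body,
    but check the service documentation for the values your deployment accepts.
    """
    if not prompt.strip():
        raise ValueError("prompt must be non-empty")
    if n_seconds <= 0:
        raise ValueError("n_seconds must be positive")
    return {
        "prompt": prompt,
        "width": width,
        "height": height,
        "n_seconds": n_seconds,
        "model": model,
    }

body = build_video_request("A cat playing piano in a jazz bar.")
```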

Output

The output will show the full response JSON from the video generation job creation request, including the job ID and status.

{
    "object": "video.generation.job",
    "id": "task_01jwcet0eje35tc5jy54yjax5q",
    "status": "queued",
    "created_at": 1748469875,
    "finished_at": null,
    "expires_at": null,
    "generations": [],
    "prompt": "A cat playing piano in a jazz bar.",
    "model": "sora",
    "n_variants": 1,
    "n_seconds": 5,
    "height": 480,
    "width": 480,
    "failure_reason": null
}

The generated video is saved as output.mp4 in the current directory.

Job created: task_01jwcet0eje35tc5jy54yjax5q
Job status: preprocessing
Job status: running
Job status: processing
Job status: succeeded
✅ Video generation succeeded.
Generated video saved as "output.mp4"
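The log above is built from fields of the job resource. For quick inspection, the response dict can be condensed into a one-line summary; this helper is illustrative (field names taken from the sample JSON above, not an official API):

```python
def summarize_job(job: dict) -> str:
    """One-line summary of a video generation job resource.

    Field names follow the sample response shown above; this is an
    illustration, not part of the official quickstart code.
    """
    return (f"{job['id']}: {job['status']} "
            f"({job['n_seconds']}s, {job['width']}x{job['height']})")

sample = {"id": "task_01jwcet0eje35tc5jy54yjax5q", "status": "queued",
          "n_seconds": 5, "width": 480, "height": 480}
print(summarize_job(sample))  # task_01jwcet0eje35tc5jy54yjax5q: queued (5s, 480x480)
```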


Go to Microsoft Foundry portal

Browse to the Foundry portal and sign in with the credentials associated with your Azure OpenAI resource. During or after the sign-in workflow, select the appropriate directory, Azure subscription, and Azure OpenAI resource.

From the Foundry landing page, create a new project or select an existing one. Go to the Models + endpoints page in the left navigation. Select Deploy model, and then choose the Sora video generation model from the list. Complete the deployment process.

On the model's page, select Open in playground.

Try out video generation

Start exploring Sora video generation with a no-code approach through the Video playground. Enter your prompt into the text box and select Generate. When the AI-generated video is ready, it appears on the page.

Note

The content generation APIs come with a content moderation filter. If Azure OpenAI recognizes your prompt as harmful content, it doesn't return a generated video. For more information, see Content filtering.

In the Video playground, you can also view Python and cURL code samples, which are prefilled according to your settings. Select the code button at the top of the video playback pane. You can use this code to write an application that completes the same task.

Clean up resources

If you want to clean up and remove an Azure OpenAI resource, you can delete the resource. Before deleting the resource, you must first delete any deployed models.
