Configure access to a source: Microsoft Azure Storage

Before transferring data from an Azure Storage container, you must configure access to that container so that Storage Transfer Service can retrieve its objects.

Storage Transfer Service supports the following Azure authentication methods:

  • Shared access signature (SAS) tokens. SAS tokens can be specified directly when creating a transfer job, or can be stored in Secret Manager.

  • Azure Shared Keys can be stored in Secret Manager and the secret passed when creating a transfer job.

  • Federated credentials are passed in a federatedIdentityConfig object during transfer job creation.

This document also includes information on adding Storage Transfer Service worker IP addresses to your Azure Storage firewall to allow access. See IP restrictions for details.

Supported regions

Storage Transfer Service can transfer data from the following Microsoft Azure Storage regions:
  • Americas: East US, East US 2, West US, West US 2, West US 3, Central US, North Central US, South Central US, West Central US, Canada Central, Canada East, Brazil South
  • Asia-Pacific: Australia Central, Australia East, Australia Southeast, Central India, South India, West India, Southeast Asia, East Asia, Japan East, Japan West, Korea South, Korea Central
  • Europe, Middle East, Africa (EMEA): France Central, Germany West Central, Norway East, Sweden Central, Switzerland North, North Europe, West Europe, UK South, UK West, Qatar Central, UAE North, South Africa North

Option 1: Authenticate using a SAS token

Follow these steps to configure access to a Microsoft Azure Storage container using a SAS token. You can alternatively save your SAS token in Secret Manager; to do so, follow the instructions in Authenticate using an Azure Shared Key or SAS token in Secret Manager.

  1. Create or use an existing Microsoft Azure Storage user to access the storage account for your Microsoft Azure Storage Blob container.

  2. Create a SAS token at the container level. See Grant limited access to Azure Storage resources using shared access signatures for instructions.

    1. The Allowed services must include Blob.

    2. For Allowed resource types, select both Container and Object.

    3. The Allowed permissions must include Read and List. If the transfer is configured to delete objects from the source, you must also include the Delete permission.

    4. The default expiration time for SAS tokens is 8 hours. Set a reasonable expiration time that enables you to successfully complete your transfer.

    5. Do not specify any IP addresses in the Allowed IP addresses field. Storage Transfer Service uses various IP addresses and doesn't support IP address restriction.

    6. The Allowed protocols should be HTTPS only.

  3. Once the token is created, note the SAS token value that is returned. You need this value when configuring your transfer with Storage Transfer Service.

Caution: Basic SAS tokens can't be revoked; the only way to invalidate a basic SAS token is to remove the storage access key of your account. We strongly recommend that you create SAS tokens from stored access policies, so that you can revoke a policy to invalidate a SAS token. For more information, see Best practices when using SAS.
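If you create transfer jobs programmatically through the REST API, the SAS token is carried inside the transferSpec.azureBlobStorageDataSource object. The following is a minimal sketch of building that request body, assuming the azureCredentials.sasToken field name from the transferSpec schema; all uppercase values are placeholders, not real credentials.

```python
import json

# Sketch: a transferJobs request body that passes a SAS token directly.
# The azureCredentials.sasToken field name is an assumption based on the
# transferSpec schema; uppercase values are placeholders.
job = {
    "description": "Transfer with SAS token",
    "status": "ENABLED",
    "projectId": "PROJECT_ID",
    "transferSpec": {
        "azureBlobStorageDataSource": {
            "storageAccount": "AZURE_STORAGE_ACCOUNT_NAME",
            "container": "AZURE_CONTAINER_NAME",
            "azureCredentials": {"sasToken": "SAS_TOKEN_VALUE"},
        },
        "gcsDataSink": {"bucketName": "CLOUD_STORAGE_BUCKET_NAME"},
    },
}

# The serialized body would be POSTed to
# https://storagetransfer.googleapis.com/v1/transferJobs with OAuth credentials.
body = json.dumps(job, indent=2)
print(body)
```

Because the token appears in plain text in the job definition, prefer the Secret Manager option below for anything beyond a one-off transfer.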

Option 2: Authenticate using an Azure Shared Key or SAS token in Secret Manager

Secret Manager is a secure service that stores and manages sensitive data such as passwords. It uses strong encryption, role-based access control, and audit logging to protect your secrets.

Storage Transfer Service supports Secret Manager resource names that reference your securely stored Azure credentials.

To use an Azure Shared Key, you must save the key in Secret Manager. SAS tokens can be saved in Secret Manager or passed directly.

When you specify a Shared Key, Storage Transfer Service uses that key to generate a service SAS that is restricted in scope to the Azure container specified in the transfer job.

Enable the API

Enable the Secret Manager API.

Roles required to enable APIs

To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.


Configure additional permissions

User permissions

The user creating the secret requires the following role:

  • Secret Manager Admin (roles/secretmanager.admin)

Learn how to grant a role.

Service agent permissions

The Storage Transfer Service service agent requires the following IAM role:

  • Secret Manager Secret Accessor (roles/secretmanager.secretAccessor)

To grant the role to your service agent:

Cloud console

  1. Follow the instructions to retrieve your service agent email.

  2. Go to the IAM page in the Google Cloud console.


  3. Click Grant access.

  4. In the New principals text box, enter the service agent email.

  5. In the Select a role drop-down, search for and select Secret Manager Secret Accessor.

  6. Click Save.

gcloud

Use the gcloud projects add-iam-policy-binding command to add the IAM role to your service agent.

  1. Follow the instructions to retrieve your service agent email.

  2. From the command line, enter the following command:

    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member='serviceAccount:SERVICE_AGENT_EMAIL' \
      --role='roles/secretmanager.secretAccessor'

Create a secret

Create a secret with Secret Manager:

Cloud console

  1. Go to the Secret Manager page in the Google Cloud console.


  2. Click Create secret.

  3. Enter a name.

  4. In the Secret value text box, enter your credentials in one of the following formats.

    {"sas_token": "SAS_TOKEN_VALUE"}

    Or:

    {"access_key": "ACCESS_KEY"}

  5. Click Create secret.

  6. Once the secret has been created, note the secret's full resource name:

    1. Select the Overview tab.

    2. Copy the value of Resource name. It uses the following format:

      projects/1234567890/secrets/SECRET_NAME

gcloud

To create a new secret using the gcloud command-line tool, pass the JSON-formatted credentials to the gcloud secrets create command:

printf '{"sas_token": "SAS_TOKEN_VALUE"}' | gcloud secrets create SECRET_NAME --data-file=-

Or:

printf '{"access_key": "ACCESS_KEY"}' | gcloud secrets create SECRET_NAME --data-file=-

Retrieve the secret's full resource name:

gcloud secrets describe SECRET_NAME

Note the value of name in the response. It uses the following format:

projects/1234567890/secrets/SECRET_NAME

For more details about creating and managing secrets, refer to the Secret Manager documentation.
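A malformed secret value only surfaces as an error at transfer time, so it can help to sanity-check the JSON payload before storing it. The following is a minimal sketch, assuming only the two formats shown above; the helper function is illustrative and not part of any Google library.

```python
import json


def validate_credentials(payload: str) -> str:
    """Return which credential key a Secret Manager payload carries.

    Expects exactly one of the two documented formats:
    {"sas_token": "..."} or {"access_key": "..."}.
    """
    data = json.loads(payload)
    keys = set(data) & {"sas_token", "access_key"}
    if len(keys) != 1:
        raise ValueError("expected exactly one of 'sas_token' or 'access_key'")
    return keys.pop()


print(validate_credentials('{"sas_token": "SAS_TOKEN_VALUE"}'))  # sas_token
```

Running a check like this before the printf | gcloud secrets create step catches JSON typos early, rather than after a transfer job fails to authenticate.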

Pass your secret to the job creation command

Using Secret Manager with Storage Transfer Service requires using the REST API to create a transfer job.

Pass the Secret Manager resource name as the value of the transferSpec.azureBlobStorageDataSource.credentialsSecret field:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "Transfer with Secret Manager",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "transferSpec": {
    "azureBlobStorageDataSource": {
      "storageAccount": "AZURE_STORAGE_ACCOUNT_NAME",
      "container": "AZURE_CONTAINER_NAME",
      "credentialsSecret": "SECRET_RESOURCE_ID"
    },
    "gcsDataSink": {
      "bucketName": "CLOUD_STORAGE_BUCKET_NAME"
    }
  }
}

See Create transfers for full details about creating a transfer.

Option 3: Authenticate using federated identity

Storage Transfer Service supports Azure workload identity federation with Google Cloud. Storage Transfer Service can issue requests to Azure Storage through registered Azure applications, eliminating the need to pass credentials to Storage Transfer Service directly.

To configure federated identity, follow these instructions.

Configure Google Cloud credentials

You must add the Service Account Token Creator (roles/iam.serviceAccountTokenCreator) role to the Storage Transfer Service service agent to allow it to create OpenID Connect (OIDC) ID tokens for the account.

  1. Retrieve the accountEmail and subjectId of the Google-managed service agent that is automatically created when you start using Storage Transfer Service. To retrieve these values:

    1. Go to the googleServiceAccounts.get reference page.

      An interactive panel opens, titled Try this method.

    2. In the panel, under Request parameters, enter your project ID. The project you specify here must be the project you're using to manage Storage Transfer Service.

    3. Click Execute. The accountEmail and subjectId are included in the response. Save these values.

  2. Grant the Service Account Token Creator (roles/iam.serviceAccountTokenCreator) role to the Storage Transfer Service service agent. Follow the instructions in Manage access to service accounts.

Configure Microsoft credentials

First, register an application and add a federated credential:

  1. Sign in to https://portal.azure.com.
  2. Go to the App registrations page.
  3. Click New registration.
  4. Enter a name. For example, azure-transfer-app.
  5. Select Accounts in this organizational directory only.
  6. Click Register. The application is created. Note the Application (client) ID and the Directory (tenant) ID. You can also retrieve these later from the application's Overview page.
  7. Click Certificates & secrets and select the Federated credentials tab.
  8. Click Add credential.
  9. Select Other issuer as the scenario and enter the following information:
    • Issuer: https://accounts.google.com
    • Subject identifier: The subjectId of your service agent, which you retrieved in Configure Google Cloud credentials.
    • A unique name for the federated credential.
    • Audience must remain as api://AzureADTokenExchange.
  10. Click Add.

Next, grant the application access to your Azure Storage container:

  1. Go to the Storage Accounts page in your Azure account.
  2. Select your storage account and select Containers from the Data storage section.
  3. Click the container to which to grant access.
  4. Click Access Control (IAM) from the left menu and select the Roles tab.
  5. Click the overflow (...) menu next to any role and select Clone.
  6. Enter a name for this custom role and select Start from scratch. Click Next.
  7. Click Add permissions and search for Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read.
  8. Click the Microsoft Storage card that appears.
  9. Select the Data actions radio button.
  10. Select Read : Read Blob.
  11. Click Add.
  12. If you will be deleting objects at the source after transfer, click Add permissions again and search for Microsoft.Storage/storageAccounts/blobServices/containers/blobs/delete.
  13. Click the Microsoft Storage card that appears, select Data actions, and select Delete : Delete blob.
  14. Click Add.
  15. Click Review + create, then Create. You are returned to the container's Access Control (IAM) page.
  16. Click Add and select Add role assignment.
  17. From the list of roles, select your custom role and click Next.
  18. Click Select members.
  19. In the Select field, enter the name of the application that you previously registered. For example, azure-transfer-app.
  20. Click the application tile and click Select.
  21. Click Review + assign.

Pass your application identifiers to the job creation command

Your application's identifiers are passed to the job creation command using a federatedIdentityConfig object. Copy the Application (client) ID and the Directory (tenant) ID that you saved during the Configure Microsoft credentials steps into the client_id and tenant_id fields.

"federatedIdentityConfig": {
  "client_id": "efghe9d8-4810-800b-8f964ed4057f",
  "tenant_id": "abcd1234-c8f0-4cb0-b0c5-ae4aded60078"
}

An example job creation request looks like the following:

POST https://storagetransfer.googleapis.com/v1/transferJobs
{
  "description": "Transfer with Azure Federated Identity",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "transferSpec": {
    "azureBlobStorageDataSource": {
      "storageAccount": "AZURE_STORAGE_ACCOUNT_NAME",
      "container": "AZURE_CONTAINER_NAME",
      "federatedIdentityConfig": {
        "client_id": "AZURE_CLIENT_ID",
        "tenant_id": "AZURE_TENANT_ID"
      }
    },
    "gcsDataSink": {
      "bucketName": "CLOUD_STORAGE_BUCKET_NAME"
    }
  }
}

See Create transfers for full details about creating a transfer.

IP restrictions

If you restrict access to your Azure resources using an Azure Storage firewall, you must add the IP ranges used by Storage Transfer Service workers to your list of allowed IPs.

Because these IP ranges can change, we publish the current values as a JSON file at a permanent address:

https://www.gstatic.com/storage-transfer-service/ipranges.json

When a new range is added to the file, we'll wait at least 7 days before using that range for requests from Storage Transfer Service.

We recommend that you pull data from this document at least weekly to keep your security configuration up to date. For a sample Python script that fetches IP ranges from a JSON file, see this article from the Virtual Private Cloud documentation.
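The fetch-and-parse routine described above can be sketched in a few lines of Python. The exact schema of ipranges.json is not reproduced in this document, so this sketch assumes a top-level "prefixes" list whose entries carry "ipv4Prefix" or "ipv6Prefix" keys (the layout used by Google's other published IP-range files); verify against the live file. It uses an inline sample with documentation-reserved CIDR ranges instead of fetching over the network.

```python
import json

# Inline stand-in for the published file; in practice, fetch
# https://www.gstatic.com/storage-transfer-service/ipranges.json
# (for example with urllib.request.urlopen) on a weekly schedule.
# The prefixes below are documentation-reserved example ranges.
SAMPLE = """
{
  "creationTime": "2024-01-01T00:00:00",
  "prefixes": [
    {"ipv4Prefix": "192.0.2.0/24"},
    {"ipv6Prefix": "2001:db8::/32"}
  ]
}
"""


def allowed_ranges(document: str) -> list[str]:
    """Collect every CIDR range listed in an ip-ranges JSON document.

    Assumes the "prefixes"/"ipv4Prefix"/"ipv6Prefix" layout used by
    Google's published IP-range files.
    """
    ranges = []
    for entry in json.loads(document).get("prefixes", []):
        for key in ("ipv4Prefix", "ipv6Prefix"):
            if key in entry:
                ranges.append(entry[key])
    return ranges


print(allowed_ranges(SAMPLE))  # ['192.0.2.0/24', '2001:db8::/32']
```

A scheduled job could diff the returned list against the ranges currently allowed in your Azure Storage firewall and alert on any change.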

To add these ranges as allowed IPs, follow the instructions in the Microsoft Azure article, Configure Azure Storage firewalls and virtual networks.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-15 UTC.