This article shows how to upload a blob using the Azure Storage client library for Python. You can upload data to a block blob from a file path, a stream, a binary object, or a text string. You can also upload blobs with index tags.
To learn about uploading blobs using asynchronous APIs, see Upload blobs asynchronously.
If you don't have an existing project, this section shows you how to set up a project to work with the Azure Blob Storage client library for Python. For more details, see Get started with Azure Blob Storage and Python.
To work with the code examples in this article, follow these steps to set up your project.
Install the following packages using `pip install`:

```
pip install azure-storage-blob azure-identity
```

Add the following `import` statements:
```python
import io
import os
import uuid

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, ContainerClient, BlobBlock, BlobClient, StandardBlobTier
```

The authorization mechanism must have the necessary permissions to upload a blob. For authorization with Microsoft Entra ID (recommended), you need the Azure RBAC built-in role Storage Blob Data Contributor or higher. To learn more, see the authorization guidance for Put Blob (REST API) and Put Block (REST API).
To connect an app to Blob Storage, create an instance of `BlobServiceClient`. The following example shows how to create a client object using `DefaultAzureCredential` for authorization:
```python
# TODO: Replace <storage-account-name> with your actual storage account name
account_url = "https://<storage-account-name>.blob.core.windows.net"
credential = DefaultAzureCredential()

# Create the BlobServiceClient object
blob_service_client = BlobServiceClient(account_url, credential=credential)
```

You can also create client objects for specific containers or blobs, either directly or from the `BlobServiceClient` object. To learn more about creating and managing client objects, see Create and manage client objects that interact with data resources.
To upload a blob using a stream or a binary object, use the following method:

- `BlobClient.upload_blob`
This method creates a new blob from a data source with automatic chunking, meaning that the data source may be split into smaller chunks and uploaded. To perform the upload, the client library may use either Put Blob or a series of Put Block calls followed by Put Block List. This behavior depends on the overall size of the object and how the data transfer options are set.
Note
The Azure Storage client libraries don't support concurrent writes to the same blob. If your app requires multiple processes writing to the same blob, you should implement a strategy for concurrency control to provide a predictable experience. To learn more about concurrency strategies, see Manage concurrency in Blob Storage.
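One common strategy is optimistic concurrency using ETags. The following is a minimal sketch, not one of this article's samples: it reads the blob's current ETag and makes the upload conditional on the blob being unmodified since that read. The `upload_blob_if_unchanged` function name is illustrative; the `etag` and `match_condition` keyword arguments are part of the client library.

```python
from azure.core import MatchConditions

# A minimal sketch of optimistic concurrency with ETags; the function name is illustrative
def upload_blob_if_unchanged(blob_client: BlobClient, data: bytes):
    # Read the blob's current ETag
    etag = blob_client.get_blob_properties().etag

    # The upload succeeds only if the blob is unchanged since the ETag was read;
    # otherwise the service responds with 412 (Precondition Failed)
    blob_client.upload_blob(
        data,
        overwrite=True,
        etag=etag,
        match_condition=MatchConditions.IfNotModified
    )
```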
The following example uploads a file to a block blob using a `ContainerClient` object:
```python
def upload_blob_file(self, blob_service_client: BlobServiceClient, container_name: str):
    container_client = blob_service_client.get_container_client(container=container_name)

    with open(file=os.path.join('filepath', 'filename'), mode="rb") as data:
        blob_client = container_client.upload_blob(name="sample-blob.txt", data=data, overwrite=True)
```

The following example creates random bytes of data and uploads a `BytesIO` object to a block blob using a `BlobClient` object:
```python
def upload_blob_stream(self, blob_service_client: BlobServiceClient, container_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob="sample-blob.txt")
    input_stream = io.BytesIO(os.urandom(15))
    blob_client.upload_blob(input_stream, blob_type="BlockBlob")
```

The following example uploads binary data to a block blob using a `BlobClient` object:
```python
def upload_blob_data(self, blob_service_client: BlobServiceClient, container_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob="sample-blob.txt")
    data = b"Sample data for blob"

    # Upload the blob data - default blob type is BlockBlob
    blob_client.upload_blob(data, blob_type="BlockBlob")
```
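As noted in the introduction, you can also upload a text string. The following is a minimal sketch, not one of this article's samples: `upload_blob` accepts a `str`, which the client library encodes as UTF-8 by default. The `upload_blob_text` function name is illustrative.

```python
# A minimal sketch (not one of this article's samples): upload_blob also accepts a str,
# which the client library encodes as UTF-8 by default
def upload_blob_text(self, blob_service_client: BlobServiceClient, container_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob="sample-blob.txt")
    blob_client.upload_blob("Sample text for blob", blob_type="BlockBlob", overwrite=True)
```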
The following example uploads a block blob with index tags:

```python
def upload_blob_tags(self, blob_service_client: BlobServiceClient, container_name: str):
    container_client = blob_service_client.get_container_client(container=container_name)
    sample_tags = {"Content": "image", "Date": "2022-01-01"}

    with open(file=os.path.join('filepath', 'filename'), mode="rb") as data:
        blob_client = container_client.upload_blob(name="sample-blob.txt", data=data, tags=sample_tags)
```

You can define client library configuration options when uploading a blob. These options can be tuned to improve performance, enhance reliability, and optimize costs. The following code examples show how to define configuration options for an upload both at the method level, and at the client level when instantiating `BlobClient`. These options can also be configured for a `ContainerClient` instance or a `BlobServiceClient` instance.
You can set configuration options when instantiating a client to optimize performance for data transfer operations. You can pass the following keyword arguments when constructing a client object in Python:
- `max_block_size` - The maximum chunk size for uploading a block blob in chunks. Defaults to 4 MiB.
- `max_single_put_size` - If the blob size is less than or equal to `max_single_put_size`, the blob is uploaded with a single Put Blob request. If the blob size is larger than `max_single_put_size` or unknown, the blob is uploaded in chunks using Put Block and committed using Put Block List. Defaults to 64 MiB.

For more information on transfer size limits for Blob Storage, see Scale targets for Blob storage.
For upload operations, you can also pass the `max_concurrency` argument when calling `upload_blob`. This argument defines the maximum number of parallel connections to use when the blob size exceeds 64 MiB.
The following code example shows how to specify data transfer options when creating a `BlobClient` object, and how to upload data using that client object. The values provided in this sample aren't intended to be a recommendation. To properly tune these values, you need to consider the specific needs of your app.
```python
def upload_blob_transfer_options(self, account_url: str, container_name: str, blob_name: str):
    # Create a BlobClient object with data transfer options for upload
    blob_client = BlobClient(
        account_url=account_url,
        container_name=container_name,
        blob_name=blob_name,
        credential=DefaultAzureCredential(),
        max_block_size=1024*1024*4,      # 4 MiB
        max_single_put_size=1024*1024*8  # 8 MiB
    )

    with open(file=os.path.join(r'file_path', blob_name), mode="rb") as data:
        blob_client = blob_client.upload_blob(data=data, overwrite=True, max_concurrency=2)
```

To learn more about tuning data transfer options, see Performance tuning for uploads and downloads with Python.
You can set a blob's access tier on upload by passing the `standard_blob_tier` keyword argument to `upload_blob`. Azure Storage offers different access tiers so that you can store your blob data in the most cost-effective manner based on how it's being used.
The following code example shows how to set the access tier when uploading a blob:
```python
def upload_blob_access_tier(self, blob_service_client: BlobServiceClient, container_name: str, blob_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob=blob_name)

    # Upload blob to the cool tier
    with open(file=os.path.join(r'file_path', blob_name), mode="rb") as data:
        blob_client = blob_client.upload_blob(data=data, overwrite=True, standard_blob_tier=StandardBlobTier.COOL)
```

Setting the access tier is only allowed for block blobs. You can set the access tier for a block blob to Hot, Cool, Cold, or Archive. To set the access tier to Cold, you must use a minimum client library version of 12.15.0.
To learn more about access tiers, seeAccess tiers overview.
You can have greater control over how to divide uploads into blocks by manually staging individual blocks of data. When all of the blocks that make up a blob are staged, you can commit them to Blob Storage.
Use the following method to create a new block to be committed as part of a blob:

- `BlobClient.stage_block`
Use the following method to write a blob by specifying the list of block IDs that make up the blob:

- `BlobClient.commit_block_list`
The following example reads data from a file and stages blocks to be committed as part of a blob:
```python
def upload_blocks(self, blob_container_client: ContainerClient, local_file_path: str, block_size: int):
    file_name = os.path.basename(local_file_path)
    blob_client = blob_container_client.get_blob_client(file_name)

    with open(file=local_file_path, mode="rb") as file_stream:
        block_id_list = []

        while True:
            buffer = file_stream.read(block_size)
            if not buffer:
                break

            block_id = uuid.uuid4().hex
            block_id_list.append(BlobBlock(block_id=block_id))

            blob_client.stage_block(block_id=block_id, data=buffer, length=len(buffer))

        blob_client.commit_block_list(block_id_list)
```

The Azure Blob Storage client library for Python supports uploading blobs asynchronously. To learn more about project setup requirements, see Asynchronous programming.
Follow these steps to upload a blob using asynchronous APIs:
Add the following import statements:
```python
import asyncio

from azure.identity.aio import DefaultAzureCredential
from azure.storage.blob.aio import BlobServiceClient, BlobClient, ContainerClient
```

Add code to run the program using `asyncio.run`. This function runs the passed coroutine, `main()` in our example, and manages the `asyncio` event loop. Coroutines are declared with the async/await syntax. In this example, the `main()` coroutine first creates the top-level `BlobServiceClient` using `async with`, then calls the method that uploads the blob. Note that only the top-level client needs to use `async with`, as other clients created from it share the same connection pool.
```python
async def main():
    sample = BlobSamples()

    # TODO: Replace <storage-account-name> with your actual storage account name
    account_url = "https://<storage-account-name>.blob.core.windows.net"
    credential = DefaultAzureCredential()

    async with BlobServiceClient(account_url, credential=credential) as blob_service_client:
        await sample.upload_blob_file(blob_service_client, "sample-container")

if __name__ == '__main__':
    asyncio.run(main())
```

Add code to upload the blob. The following example uploads a blob from a local file path using a `ContainerClient` object. The code is the same as the synchronous example, except that the method is declared with the `async` keyword and the `await` keyword is used when calling the `upload_blob` method.
```python
async def upload_blob_file(self, blob_service_client: BlobServiceClient, container_name: str):
    container_client = blob_service_client.get_container_client(container=container_name)

    with open(file=os.path.join('filepath', 'filename'), mode="rb") as data:
        blob_client = await container_client.upload_blob(name="sample-blob.txt", data=data, overwrite=True)
```

With this basic setup in place, you can implement other examples in this article as coroutines using async/await syntax.
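For example, the following is a minimal sketch (assuming the async imports and the `BlobSamples` class shown above) of the earlier stream upload example rewritten as a coroutine:

```python
# A sketch of the earlier stream upload example as a coroutine,
# assuming the async imports shown above
async def upload_blob_stream(self, blob_service_client: BlobServiceClient, container_name: str):
    blob_client = blob_service_client.get_blob_client(container=container_name, blob="sample-blob.txt")
    input_stream = io.BytesIO(os.urandom(15))
    await blob_client.upload_blob(input_stream, blob_type="BlockBlob")
```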
To learn more about uploading blobs using the Azure Blob Storage client library for Python, see the following resources.
The Azure SDK for Python contains libraries that build on top of the Azure REST API, allowing you to interact with REST API operations through familiar Python paradigms. The client library methods for uploading blobs use the following REST API operations:

- Put Blob (REST API)
- Put Block (REST API)
- Put Block List (REST API)