Export query results to Blob Storage
Important: The term "BigLake" on this page refers to an access delegation functionality for external tables in BigQuery. For information about BigLake, the stand-alone Google Cloud product that includes BigLake metastore, the Apache Iceberg REST catalog, and BigLake tables for Apache Iceberg, see BigLake overview.

This document describes how to export the result of a query that runs against a BigLake table to your Azure Blob Storage.
For information about how data flows between BigQuery and Azure Blob Storage, see Data flow when exporting data.
Limitations
For a full list of limitations that apply to BigLake tables based on Amazon S3 and Blob Storage, see Limitations.
Before you begin
Ensure that you have the following resources:
- A connection to access your Blob Storage. Within the connection, you must create a policy for the Blob Storage container path that you want to export to. Then, within that policy, create a role that has the Microsoft.Storage/storageAccounts/blobServices/containers/write permission.
- A Blob Storage BigLake table. (A quick check that the table is queryable is shown after this list.)
- If you are on the capacity-based pricing model, then ensure that you have enabled the BigQuery Reservation API for your project. For information about pricing, see BigQuery Omni pricing.
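Before you export, you can optionally confirm that the Blob Storage BigLake table is queryable through your connection. The following query is a minimal sketch; mydataset.my_blob_table is a hypothetical table name that you would replace with your own:

    -- Sanity check: read a few rows from the Blob Storage BigLake table.
    -- "mydataset.my_blob_table" is a hypothetical name; substitute your own.
    SELECT *
    FROM mydataset.my_blob_table
    LIMIT 10;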
Export query results
BigQuery Omni writes to the specified Blob Storage location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Blob Storage container.
1. In the Google Cloud console, go to the BigQuery page.
2. In the Query editor field, enter a GoogleSQL export query:
    EXPORT DATA WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
    OPTIONS(
      uri = "azure://AZURE_STORAGE_ACCOUNT_NAME.blob.core.windows.net/CONTAINER_NAME/FILE_PATH/*",
      format = "FORMAT"
    )
    AS QUERY
Replace the following:
- CONNECTION_REGION: the region where the connection was created.
- CONNECTION_NAME: the connection name that you created with the necessary permission to write to the container.
- AZURE_STORAGE_ACCOUNT_NAME: the name of the Blob Storage account to which you want to write the query result.
- CONTAINER_NAME: the name of the container to which you want to write the query result.
- FILE_PATH: the path where you want to write the exported file to. It must contain exactly one wildcard * anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces * with 0000..N depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then * in the first file's filename is replaced by 000000000000, and * in the second file's filename is replaced by 000000000001.
- FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
- QUERY: the query to analyze the data that is stored in a BigLake table.
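For example, the following statement fills in the syntax above with hypothetical values. The connection azure-eastus2.my-connection, the storage account mystorageaccount, the container my-container, and the table mydataset.my_blob_table are illustrative placeholders, not names from this page:

    -- Export query results as Parquet files to a hypothetical container path.
    -- BigQuery replaces * with a twelve-digit sequence number per exported file.
    EXPORT DATA WITH CONNECTION `azure-eastus2.my-connection`
    OPTIONS(
      uri = "azure://mystorageaccount.blob.core.windows.net/my-container/exports/result-*.parquet",
      format = "PARQUET"
    )
    AS SELECT id, name
    FROM mydataset.my_blob_table;

If BigQuery decides to export two files, this pattern produces result-000000000000.parquet and result-000000000001.parquet under the exports/ path.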
If you run the query with the bq command-line tool, specify the project by using the --project_id=PROJECT_ID flag. Replace PROJECT_ID with the ID of your Google Cloud project.

Troubleshooting
If you get an error related to quota failure, then check if you have reserved capacity for your queries. For more information about slot reservations, see Before you begin in this document.
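One way to check whether you have reserved capacity is to query the reservations view in INFORMATION_SCHEMA. The following query is a sketch; the region-us qualifier is an assumption, and you would replace it with the region where your reservations were created:

    -- List slot reservations visible to this project.
    -- `region-us` is an assumed placeholder; use your reservation's region.
    SELECT reservation_name, slot_capacity
    FROM `region-us`.INFORMATION_SCHEMA.RESERVATIONS;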
What's next
- Learn about BigQuery Omni.
- Learn how to export table data.
- Learn how to query data stored in Blob Storage.
- Learn how to set up VPC Service Controls for BigQuery Omni.