Export query results to Amazon S3
Important: The term "BigLake" on this page refers to an access delegation functionality for external tables in BigQuery. For information about BigLake, the stand-alone Google Cloud product that includes BigLake metastore, the Apache Iceberg REST catalog, and BigLake tables for Apache Iceberg, see BigLake overview.

This document describes how to export the result of a query that runs against a BigLake table to your Amazon Simple Storage Service (Amazon S3) bucket.
For information about how data flows between BigQuery and Amazon S3, see Data flow when exporting data.
Limitations
For a full list of limitations that apply to BigLake tables based on Amazon S3 and Blob Storage, see Limitations.
Before you begin
Ensure that you have the following resources:
- A connection to access your Amazon S3 bucket.
- An Amazon S3 BigLake table.
- The correct Amazon Web Services (AWS) Identity and Access Management (IAM) policy:
  - You must have the PutObject permission to write data into the Amazon S3 bucket. For more information, see Create an AWS IAM policy for BigQuery.
- If you are on the capacity-based pricing model, then ensure that you have enabled the BigQuery Reservation API for your project. For information about pricing, see BigQuery Omni pricing.
Export query results
BigQuery Omni writes to the specified Amazon S3 location regardless of any existing content. The export query can overwrite existing data or mix the query result with existing data. We recommend that you export the query result to an empty Amazon S3 bucket.
To run a query, select one of the following options:
SQL
In the Query editor field, enter a GoogleSQL export query. GoogleSQL is the default syntax in the Google Cloud console.

Note: To override the default project, use the --project_id=PROJECT_ID parameter. Replace PROJECT_ID with the ID of your Google Cloud project.

In the Google Cloud console, go to the BigQuery page. In the query editor, enter the following statement:

```sql
EXPORT DATA
  WITH CONNECTION `CONNECTION_REGION.CONNECTION_NAME`
  OPTIONS (
    uri = "s3://BUCKET_NAME/PATH",
    format = "FORMAT",
    ...
  )
AS QUERY
```

Replace the following:

- CONNECTION_REGION: the region where the connection was created.
- CONNECTION_NAME: the connection name that you created with the necessary permission to write to the Amazon S3 bucket.
- BUCKET_NAME: the Amazon S3 bucket where you want to write the data.
- PATH: the path where you want to write the exported file to. It must contain exactly one wildcard (*) anywhere in the leaf directory of the path string, for example, ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*. BigQuery replaces * with 0000..N depending on the number of files exported. BigQuery determines the file count and sizes. If BigQuery decides to export two files, then * in the first file's filename is replaced by 000000000000, and * in the second file's filename is replaced by 000000000001.
- FORMAT: supported formats are JSON, AVRO, CSV, and PARQUET.
- QUERY: the query to analyze the data that is stored in a BigLake table. The dataset that contains the BigLake table used in the query must be located in the same Amazon S3 region as the target Amazon S3 bucket.
Click Run.
For more information about how to run queries, see Run an interactive query.
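To make the wildcard replacement described above concrete, the following standalone sketch mimics how the single * in an export path expands into zero-padded file names when BigQuery writes multiple files. This is an illustrative snippet only; the expand helper and the bucket path are hypothetical and not part of the BigQuery API.

```java
// Illustrative sketch of the export-path wildcard expansion described above.
// The expand() helper and the example path are hypothetical; BigQuery performs
// this replacement internally when it writes the exported files.
public class WildcardExpansion {

  // Replace the single "*" in the path with a 12-digit, zero-padded file index.
  static String expand(String pathWithWildcard, int fileIndex) {
    return pathWithWildcard.replace("*", String.format("%012d", fileIndex));
  }

  public static void main(String[] args) {
    String path = "s3://my-bucket/exports/result-*.csv";
    // If BigQuery decides to export two files, the names come out as:
    System.out.println(expand(path, 0)); // s3://my-bucket/exports/result-000000000000.csv
    System.out.println(expand(path, 1)); // s3://my-bucket/exports/result-000000000001.csv
  }
}
```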
Java
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.QueryJobConfiguration;
import com.google.cloud.bigquery.TableResult;

// Sample to export query results to Amazon S3 bucket
public class ExportQueryResultsToS3 {

  public static void main(String[] args) throws InterruptedException {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    String externalTableName = "MY_EXTERNAL_TABLE_NAME";
    // connectionName should be in the format of connection_region.connection_name. e.g.
    // aws-us-east-1.s3-write-conn
    String connectionName = "MY_CONNECTION_REGION.MY_CONNECTION_NAME";
    // destinationUri must contain exactly one * anywhere in the leaf directory of the path string
    // e.g. ../aa/*, ../aa/b*c, ../aa/*bc, and ../aa/bc*
    // BigQuery replaces * with 0000..N depending on the number of files exported.
    // BigQuery determines the file count and sizes.
    String destinationUri = "s3://your-bucket-name/*";
    String format = "EXPORT_FORMAT";
    // Export result of query to find states starting with 'W'
    String query =
        String.format(
            "EXPORT DATA WITH CONNECTION `%s` OPTIONS(uri='%s', format='%s') "
                + "AS SELECT * FROM %s.%s.%s WHERE name LIKE 'W%%'",
            connectionName, destinationUri, format, projectId, datasetName, externalTableName);
    exportQueryResultsToS3(query);
  }

  public static void exportQueryResultsToS3(String query) throws InterruptedException {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
      TableResult results = bigquery.query(QueryJobConfiguration.of(query));
      results
          .iterateAll()
          .forEach(row -> row.forEach(val -> System.out.printf("%s,", val.toString())));
      System.out.println("Query results exported to Amazon S3 successfully.");
    } catch (BigQueryException e) {
      System.out.println("Query not performed \n" + e.toString());
    }
  }
}
```

Troubleshooting
If you get an error related to quota failure, then check if you have reserved capacity for your queries. For more information about slot reservations, see Before you begin in this document.
What's next
- Learn about BigQuery Omni.
- Learn how to export table data.
- Learn how to query data stored in Amazon S3.
- Learn how to set up VPC Service Controls for BigQuery Omni.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-12-15 UTC.