Export statements in GoogleSQL

EXPORT DATA statement

The EXPORT DATA statement exports the results of a query to an external storage location. You can export to the following services:

  • Cloud Storage
  • Amazon Simple Storage Service (Amazon S3)
  • Azure Blob Storage
  • Spanner
  • Bigtable
  • Pub/Sub

Syntax

EXPORT DATA
[WITH CONNECTION connection_name]
OPTIONS (export_option_list)
AS query_statement

Arguments

Export to Cloud Storage, Amazon S3, or Blob Storage

You can export BigQuery data to Cloud Storage, Amazon S3, or Blob Storage in Avro, CSV, JSON, and Parquet formats. For more information about exporting to Cloud Storage, see Export table data to Cloud Storage.

Use the format option to specify the format of the exported data. The following limitations apply:

  • You cannot export nested and repeated data in CSV format.
  • If you export data in JSON format, INT64 data types are encoded as JSON strings to preserve 64-bit precision.
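For example, one way to export nested or repeated data is to use JSON format instead of CSV. The following sketch assumes a hypothetical table mydataset.mytable with a repeated items column; it writes newline-delimited JSON files to Cloud Storage, and any INT64 values in the output are encoded as JSON strings:

EXPORT DATA
  OPTIONS (
    uri = 'gs://bucket/folder/*.json',
    format = 'JSON')
AS
  SELECT id, items
  FROM mydataset.mytable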

You are not billed for the export operation, but you are billed for running the query and for storing data in Cloud Storage, Amazon S3, or Blob Storage. For more information, see Cloud Storage pricing, Amazon S3 pricing, or Blob Storage pricing.

Cloud Storage, Amazon S3, and Blob Storage export option list

The option list specifies options for exporting to Cloud Storage, Amazon S3, or Blob Storage. Specify the option list in the following format: NAME=VALUE, ...

Options
compression

STRING

Specifies a compression format. If not specified, the exported files are uncompressed. Supported values include: GZIP, DEFLATE, SNAPPY.

field_delimiter

STRING

The delimiter used to separate fields. Default: ',' (comma).

Applies to: CSV.

format

STRING

Required. The format of the exported data. Supported values include: AVRO, CSV, JSON, PARQUET.

header

BOOL

If true, generates column headers for the first row of each data file. Default: false.

Applies to: CSV.

overwrite

BOOL

If true, overwrites any existing files with the same URI. Otherwise, if the destination storage bucket is not empty, the statement returns an error. Default: false.

Note: When overwrite is true, files are only overwritten; no files are ever deleted, even if they match the wildcard specified in the URI.

uri

STRING

Required. The destination URI for the export. The uri option must be a single-wildcard URI as described in Exporting data into one or more files.

Examples: "gs://bucket/path/file_*.csv" or "s3://bucket/path/file_*.csv"

use_avro_logical_types

BOOL

Whether to use appropriate Avro logical types when exporting TIMESTAMP, DATETIME, TIME, and DATE types.

Applies to: AVRO. For more information, see Avro export details.
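As a sketch, the following statement (the table and column names are hypothetical) exports Avro files with use_avro_logical_types enabled, so that, for example, a TIMESTAMP column is written with an Avro logical type rather than as a plain string:

EXPORT DATA
  OPTIONS (
    uri = 'gs://bucket/folder/*',
    format = 'AVRO',
    use_avro_logical_types = true)
AS
  SELECT event_id, event_timestamp
  FROM mydataset.events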

Examples

The following examples show common use cases for exporting to Cloud Storage, Amazon S3, or Blob Storage.

Export data to Cloud Storage in CSV format

The following example exports data to a CSV file. It includes options to overwrite the destination location, write header rows, and use ';' as a delimiter.

EXPORT DATA
  OPTIONS (
    uri = 'gs://bucket/folder/*.csv',
    format = 'CSV',
    overwrite = true,
    header = true,
    field_delimiter = ';')
AS
  SELECT field1, field2
  FROM mydataset.table1
  ORDER BY field1
  LIMIT 10

Export data to Cloud Storage in Avro format

The following example exports data to Avro format using Snappy compression.

EXPORT DATA
  OPTIONS (
    uri = 'gs://bucket/folder/*',
    format = 'AVRO',
    compression = 'SNAPPY')
AS
  SELECT field1, field2
  FROM mydataset.table1
  ORDER BY field1
  LIMIT 10

Export data to Cloud Storage in Parquet format

The following example exports data to Parquet format. It includes the option to overwrite the destination location.

EXPORT DATA
  OPTIONS (
    uri = 'gs://bucket/folder/*',
    format = 'PARQUET',
    overwrite = true)
AS
  SELECT field1, field2
  FROM mydataset.table1
  ORDER BY field1
  LIMIT 10

Export data to Amazon S3 in JSON format

The following example exports the results of a query that runs against a BigLake table based on Amazon S3 to your Amazon S3 bucket:

EXPORT DATA
  WITH CONNECTION myproject.us.myconnection
  OPTIONS (
    uri = 's3://bucket/folder/*',
    format = 'JSON',
    overwrite = true)
AS
  SELECT field1, field2
  FROM mydataset.table1
  ORDER BY field1
  LIMIT 10

Export to Bigtable

You can export BigQuery data to a Bigtable table by using the EXPORT DATA statement. For Bigtable export examples and configuration options, see Export data to Bigtable.

You are not billed for the export operation, but you are billed for running the query and for storing data in Bigtable. For more information, see Bigtable pricing.

Bigtable export option list

The option list specifies options for exporting to Bigtable. Specify the option list in the following format: NAME=VALUE, ...

Options
format

STRING

Required. When exporting to Bigtable, the value must always be CLOUD_BIGTABLE.

bigtable_options

STRING

A JSON string containing configurations for mapping exported fields to Bigtable column families and columns. For more information, see Configure exports with bigtable_options.

overwrite

BOOL

If true, allows the export to overwrite existing data in the destination Bigtable table. If false and the destination table is not empty, the statement returns an error. Default: false.

truncate

BOOL

If true, all existing data in the destination table is deleted before any new data is written. Otherwise, the export proceeds even if the destination table is not empty. Default: false.

Warning: Only use truncate when the destination table doesn't contain data that needs to be retained.
uri

STRING

Required. The destination URI for the export. We recommend specifying an app profile for traffic routing and for visibility in the monitoring dashboards provided by Bigtable. The uri option for a Bigtable export must be provided in the following format: https://bigtable.googleapis.com/projects/PROJECT_ID/instances/INSTANCE_ID/appProfiles/APP_PROFILE/tables/TABLE_NAME
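For example, a destination URI that routes the export through a hypothetical app profile named my-profile might look like the following (the project, instance, and table names are placeholders):

uri = "https://bigtable.googleapis.com/projects/my-project/instances/my-instance/appProfiles/my-profile/tables/my-table"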

auto_create_column_families

BOOL

If true, allows the export to create missing column families in the target table. If false and the destination table is missing a column family, the statement returns an error. Default: false.

Example

The following example exports data to a Bigtable table. Data in field1 becomes the row key in the destination Bigtable table. The fields field2, field3, and field4 are written as columns cbtField2, cbtField3, and cbtField4 in the column family column_family.

EXPORT DATA
  OPTIONS (
    uri = "https://bigtable.googleapis.com/projects/my-project/instances/my-instance/tables/my-table",
    format = "CLOUD_BIGTABLE",
    bigtable_options = """{
      "columnFamilies": [
        {
          "familyId": "column_family",
          "columns": [
            {"qualifierString": "cbtField2", "fieldName": "field2"},
            {"qualifierString": "cbtField3", "fieldName": "field3"},
            {"qualifierString": "cbtField4", "fieldName": "field4"}
          ]
        }
      ]
    }""")
AS
  SELECT
    CAST(field1 AS STRING) AS rowkey,
    STRUCT(field2, field3, field4) AS column_family
  FROM `bigquery_table`

Export to Pub/Sub

Preview

This feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA features are available "as is" and might have limited support. For more information, see the launch stage descriptions.

Note: To enroll in the continuous queries preview, fill out the request form. To give feedback or request support for this feature, contact bq-continuous-queries-feedback@google.com.

You can export BigQuery data to a Pub/Sub topic by using the EXPORT DATA statement in a continuous query. For more information about Pub/Sub configuration options, see Export data to Pub/Sub.

For information about the costs involved with exporting to Pub/Sub by using a continuous query, see Costs.

Pub/Sub export option list

The option list specifies options for exporting to Pub/Sub. Specify the option list in the following format: NAME=VALUE, ...

Options
format

STRING

Required. When exporting to Pub/Sub, the value must always be CLOUD_PUBSUB.

uri

STRING

Required. The destination URI for the export. The uri option for a Pub/Sub export must be provided in the following format: https://pubsub.googleapis.com/projects/PROJECT_ID/topics/TOPIC_ID

Example

The following example shows a continuous query that filters data from a BigQuery table that is receiving streaming taxi ride information, and publishes the data to a Pub/Sub topic in real time:

EXPORT DATA
  OPTIONS (
    format = 'CLOUD_PUBSUB',
    uri = 'https://pubsub.googleapis.com/projects/myproject/topics/taxi-real-time-rides')
AS (
  SELECT
    TO_JSON_STRING(STRUCT(ride_id, timestamp, latitude, longitude)) AS message
  FROM `myproject.real_time_taxi_streaming.taxi_rides`
  WHERE ride_status = 'enroute'
);

Export to Spanner

Preview

This feature is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA features are available "as is" and might have limited support. For more information, see the launch stage descriptions.

To provide feedback or request support for this feature, send email to bq-cloud-spanner-federation-preview@google.com.

You can export data from a BigQuery table to a Spanner table by using the EXPORT DATA statement.

Spanner export option list

The option list specifies options for the export operation. Specify the option list in the following format: NAME=VALUE, ...

Options
format

STRING

Required. To export data from BigQuery to Spanner, the value must always be CLOUD_SPANNER.

uri

STRING

Required. The destination URI for the export. For Spanner, the URI must be provided in the following format: https://spanner.googleapis.com/projects/PROJECT_ID/instances/INSTANCE_ID/databases/DATABASE_ID

spanner_options

STRING

Required. A JSON string containing configurations for mapping exported fields to Spanner columns. For more information, see Configure exports with the spanner_options option.

Examples

Export data to Spanner

The following example exports data to a Spanner table:

EXPORT DATA
  OPTIONS (
    uri = "https://spanner.googleapis.com/projects/my-project/instances/my-instance/databases/my-database",
    format = "CLOUD_SPANNER",
    spanner_options = """{ "table": "my_table" }""")
AS
  SELECT *
  FROM `bigquery_table`

For more Spanner export examples and configuration options, see Export data to Spanner.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-05 UTC.