Export a table to a JSON file

Exports a table to a newline-delimited JSON file in a Cloud Storage bucket.

Code sample

C#

Before trying this sample, follow the C# setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery C# API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

using Google.Cloud.BigQuery.V2;
using System;

public class BigQueryExtractTableJson
{
    public void ExtractTableJson(
        string projectId = "your-project-id",
        string bucketName = "your-bucket-name")
    {
        BigQueryClient client = BigQueryClient.Create(projectId);
        string destinationUri = $"gs://{bucketName}/shakespeare.json";
        var jobOptions = new CreateExtractJobOptions()
        {
            DestinationFormat = FileFormat.NewlineDelimitedJson
        };
        BigQueryJob job = client.CreateExtractJob(
            projectId: "bigquery-public-data",
            datasetId: "samples",
            tableId: "shakespeare",
            destinationUri: destinationUri,
            options: jobOptions);
        job = job.PollUntilCompleted().ThrowOnAnyError();  // Waits for the job to complete.
        Console.Write($"Exported table to {destinationUri}.");
    }
}

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// exportTableAsJSON demonstrates using an export job to
// write the contents of a table into Cloud Storage as newline delimited JSON.
func exportTableAsJSON(projectID, gcsURI string) error {
	// projectID := "my-project-id"
	// gcsURI := "gs://mybucket/shakespeare.json"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %w", err)
	}
	defer client.Close()

	srcProject := "bigquery-public-data"
	srcDataset := "samples"
	srcTable := "shakespeare"
	gcsRef := bigquery.NewGCSReference(gcsURI)
	gcsRef.DestinationFormat = bigquery.JSON

	extractor := client.DatasetInProject(srcProject, srcDataset).Table(srcTable).ExtractorTo(gcsRef)
	// You can choose to run the job in a specific location for more complex data locality scenarios.
	// Ex: In this example, source dataset and GCS bucket are in the US.
	extractor.Location = "US"

	job, err := extractor.Run(ctx)
	if err != nil {
		return err
	}
	status, err := job.Wait(ctx)
	if err != nil {
		return err
	}
	if err := status.Err(); err != nil {
		return err
	}
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.RetryOption;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.FormatOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.Table;
import com.google.cloud.bigquery.TableId;
import org.threeten.bp.Duration;

public class ExtractTableToJson {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "bigquery-public-data";
    String datasetName = "samples";
    String tableName = "shakespeare";
    String bucketName = "my-bucket";
    String destinationUri = "gs://" + bucketName + "/path/to/file";
    // For more information on export formats available see:
    // https://cloud.google.com/bigquery/docs/exporting-data#export_formats_and_compression_types
    // For more information on Job see:
    // https://googleapis.dev/java/google-cloud-clients/latest/index.html?com/google/cloud/bigquery/package-summary.html
    // Note that FormatOptions.json().toString() is not "JSON" but "NEWLINE_DELIMITED_JSON".
    // Using the FormatOptions enum for this will prevent problems with unexpected format names.
    String dataFormat = FormatOptions.json().getType();
    extractTableToJson(projectId, datasetName, tableName, destinationUri, dataFormat);
  }

  // Exports datasetName:tableName to destinationUri as a JSON file
  public static void extractTableToJson(
      String projectId,
      String datasetName,
      String tableName,
      String destinationUri,
      String dataFormat) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      TableId tableId = TableId.of(projectId, datasetName, tableName);
      Table table = bigquery.getTable(tableId);

      Job job = table.extract(dataFormat, destinationUri);

      // Blocks until this job completes its execution, either failing or succeeding.
      Job completedJob =
          job.waitFor(
              RetryOption.initialRetryDelay(Duration.ofSeconds(1)),
              RetryOption.totalTimeout(Duration.ofMinutes(3)));
      if (completedJob == null) {
        System.out.println("Job not executed since it no longer exists.");
        return;
      } else if (completedJob.getStatus().getError() != null) {
        System.out.println(
            "BigQuery was unable to extract due to an error: \n" + job.getStatus().getError());
        return;
      }
      System.out.println(
          "Table export successful. Check in GCS bucket for the " + dataFormat + " file.");
    } catch (BigQueryException | InterruptedException e) {
      System.out.println("Table extraction job was interrupted. \n" + e.toString());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

// Import the Google Cloud client libraries
const {BigQuery} = require('@google-cloud/bigquery');
const {Storage} = require('@google-cloud/storage');

const bigquery = new BigQuery();
const storage = new Storage();

async function extractTableJSON() {
  // Exports my_dataset:my_table to gcs://my-bucket/my-file as JSON.

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // const datasetId = "my_dataset";
  // const tableId = "my_table";
  // const bucketName = "my-bucket";
  // const filename = "file.json";

  // Location must match that of the source table.
  const options = {
    format: 'json',
    location: 'US',
  };

  // Export data from the table into a Google Cloud Storage file
  const [job] = await bigquery
    .dataset(datasetId)
    .table(tableId)
    .extract(storage.bucket(bucketName).file(filename), options);

  console.log(`Job ${job.id} created.`);

  // Check the job's status for errors
  const errors = job.status.errors;
  if (errors && errors.length > 0) {
    throw errors;
  }
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

# from google.cloud import bigquery
# client = bigquery.Client()
# bucket_name = 'my-bucket'

destination_uri = "gs://{}/{}".format(bucket_name, "shakespeare.json")
dataset_ref = bigquery.DatasetReference(project, dataset_id)
table_ref = dataset_ref.table("shakespeare")
job_config = bigquery.job.ExtractJobConfig()
job_config.destination_format = bigquery.DestinationFormat.NEWLINE_DELIMITED_JSON

extract_job = client.extract_table(
    table_ref,
    destination_uri,
    job_config=job_config,
    # Location must match that of the source table.
    location="US",
)  # API request
extract_job.result()  # Waits for job to complete.
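The samples above all produce the same output format: newline-delimited JSON, where each table row is serialized as one standalone JSON object per line (rather than a single top-level JSON array). As a quick local sketch of how that output can be consumed, the snippet below parses a small hand-written string in that format; the two rows shown are illustrative values, not actual contents of the shakespeare table.

```python
import json

# Illustrative (made-up) excerpt of a newline-delimited JSON export:
# one JSON object per line, no enclosing array.
exported = (
    '{"word": "brave", "word_count": 3}\n'
    '{"word": "tempest", "word_count": 1}\n'
)

# Each non-empty line parses independently, so large exports can be
# streamed line by line instead of loaded whole.
rows = [json.loads(line) for line in exported.splitlines() if line]

print(rows[0]["word"])  # -> brave
print(len(rows))        # -> 2
```

The same line-by-line loop works on a file object, which is why this format suits large exports better than a single JSON array would.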

What's next

To search and filter code samples for other Google Cloud products, see the Google Cloud sample browser.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.