Manage datasets
This document describes how to copy datasets, recreate datasets in another location, secure datasets, delete datasets, and restore tables from deleted datasets in BigQuery. For information about how to restore (or undelete) a deleted dataset, see Restore deleted datasets.
As a BigQuery administrator, you can organize and control access to tables and views that analysts use. For more information about datasets, see Introduction to datasets.
You cannot change the name of an existing dataset or relocate a dataset after it's created. As a workaround for changing the dataset name, you can copy a dataset and change the destination dataset's name. To relocate a dataset, you can follow one of the following methods:

- Copy the dataset, even across regions.
- Recreate the dataset in another location.
Required roles
This section describes the roles and permissions that you need to manage datasets. If your source or destination dataset is in the same project as the one you are using to copy, then you don't need extra permissions or roles on that dataset.
Copy a dataset
Grant these roles to copy a dataset. Copying datasets is in Beta.
To get the permissions that you need to copy datasets, ask your administrator to grant you the following IAM roles:
- BigQuery Admin (roles/bigquery.admin) on the destination project
- BigQuery Data Viewer (roles/bigquery.dataViewer) on the source dataset
- BigQuery Data Editor (roles/bigquery.dataEditor) on the destination dataset
For more information about granting roles, see Manage access to projects, folders, and organizations.
These predefined roles contain the permissions required to copy datasets. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to copy datasets:
- bigquery.transfers.update on the destination project
- bigquery.jobs.create on the destination project
- bigquery.datasets.get on the source and destination dataset
- bigquery.tables.list on the source and destination dataset
- bigquery.datasets.update on the destination dataset
- bigquery.tables.create on the destination dataset
You might also be able to get these permissions with custom roles or other predefined roles.
Delete a dataset
Grant these roles to delete a dataset.
To get the permissions that you need to delete datasets, ask your administrator to grant you the BigQuery Data Owner (roles/bigquery.dataOwner) IAM role on the project. For more information about granting roles, see Manage access to projects, folders, and organizations.
This predefined role contains the permissions required to delete datasets. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
The following permissions are required to delete datasets:
- bigquery.datasets.delete on the project
- bigquery.tables.delete on the project
You might also be able to get these permissions with custom roles or other predefined roles.
Copy datasets
Beta
This product is subject to the "Pre-GA Offerings Terms" in the General Service Terms section of the Service Specific Terms. Pre-GA products are available "as is" and might have limited support. For more information, see the launch stage descriptions.
You can copy a dataset, including partitioned data, within a region or across regions, without extracting, moving, or reloading data into BigQuery. BigQuery uses the BigQuery Data Transfer Service in the backend to copy datasets. For location considerations when you transfer data, see Data location and transfers.
For each dataset copy configuration, you can have one transfer run active at a time; additional transfer runs are queued. If you are using the Google Cloud console, you can schedule recurring copies and configure email or Pub/Sub notifications with the BigQuery Data Transfer Service.
Limitations
The following limitations apply when you copy datasets:
You can't copy the following resources from a source dataset:
- Views.
- Routines, including UDFs.
- External tables.
- Change data capture (CDC) tables, if the copy job is across regions. Copying CDC tables within the same region is supported.
Cross-region table copying is not supported for tables encrypted with customer-managed encryption keys (CMEK) when the destination dataset is not encrypted with CMEK and no CMEK is provided. Copying tables with default encryption across regions is supported.
You can copy all encrypted tables within the same region, includingtables encrypted with CMEK.
You can't use the following resources as destination datasets for copy jobs:
- Write-optimized storage.
- Datasets encrypted with CMEK, if the copy job is across regions and the source table is not encrypted with CMEK. However, tables encrypted with CMEK are allowed as destination tables when copying within the same region.
The minimum frequency between copy jobs is 12 hours.
Appending data to a partitioned or non-partitioned table in the destination dataset isn't supported. If there are no changes in the source table, the table is skipped. If the source table is updated, the destination table is completely truncated and replaced.
If a table exists in the source dataset and the destination dataset, and the source table has not changed since the last successful copy, it's skipped. The source table is skipped even if the Overwrite destination tables checkbox is selected.
When truncating tables in the destination dataset, the dataset copy job doesn't detect any changes made to resources in the destination dataset before it begins the copy job. The dataset copy job overwrites all of the data in the destination dataset, including both the tables and the schema.
The destination table might not reflect changes made to the source tablesafter a copy job starts.
Copying a dataset is not supported in BigQuery Omni regions.
To copy a dataset to a project in another VPC Service Controls service perimeter, you need to set the following egress rules:
In the destination project's VPC Service Controls service perimeter configuration, the IAM principal must have the following methods:

- bigquery.datasets.get
- bigquery.tables.list
- bigquery.tables.get
- bigquery.tables.getData
In the source project's VPC Service Controls service perimeter configuration, the IAM principal being used must have the method set to All Methods.
If you try to update a dataset copy transfer configuration you don't own, the update might fail with the following error message:

Cannot modify restricted parameters without taking ownership of the transfer configuration.

The owner of the dataset copy is the user associated with the dataset copy or the user who has access to the service account associated with the dataset copy. The associated user can be seen in the configuration details of the dataset copy. For information on how to update the dataset copy to take ownership, see Update credentials. To grant users access to a service account, you must have the Service Account User role.
The owner-restricted parameters for dataset copies are:
- Source project
- Source dataset
- Destination dataset
- Overwrite destination table setting
All cross-region table copy limitations apply.
Copy a dataset
Select one of the following options:
Console
Enable the BigQuery Data Transfer Service for your destination dataset.
Ensure that you have the required roles.
If you intend to set up transfer run notifications for Pub/Sub (Option 2 later in these steps), then you must have the pubsub.topics.setIamPolicy permission. If you only set up email notifications, then Pub/Sub permissions are not required. For more information, see the BigQuery Data Transfer Service run notifications.
Create a BigQuery dataset in the same region or a different region from your source dataset.
Option 1: Use the BigQuery copy function
To create a one-time transfer, use the BigQuery copy function:
Go to the BigQuery page.
In the left pane, click Explorer. If you don't see the left pane, click Expand left pane to open the pane.
In the Explorer pane, expand your project, click Datasets, and then select a dataset.
In the Dataset info section, click Copy, and then do the following:
In the Dataset field, either create a new dataset or select an existing dataset ID from the list.
Dataset names within a project must be unique. The project and dataset can be in different regions, but not all regions are supported for cross-region dataset copying.
In the Location field, the location of the source dataset is displayed.
Optional: To overwrite both the data and schema of the destination tables with the source tables, select the Overwrite destination tables checkbox. Both the source and destination tables must have the same partitioning schema.
To copy the dataset, click Copy.
Option 2: Use the BigQuery Data Transfer Service
To schedule recurring copies and configure email or Pub/Sub notifications, use the BigQuery Data Transfer Service in the Google Cloud console of the destination project:
Go to the Data transfers page.
Click Create a transfer.
In the Source list, select Dataset Copy.
In the Display name field, enter a name for your transfer run.
In the Schedule options section, do the following:
For Repeat frequency, choose an option for how often to run the transfer:
If you select Custom, enter a custom frequency, for example, every day 00:00. For more information, see Formatting the schedule. Note: The minimum frequency between copy jobs is 12 hours.

For Start date and run time, enter the date and time to start the transfer. If you choose Start now, this option is disabled.
In the Destination settings section, select a destination dataset to store your transfer data. You can also click CREATE NEW DATASET to create a new dataset before you select it for this transfer.
In the Data source details section, enter the following information:
- For Source dataset, enter the dataset ID that you want to copy.
- For Source project, enter the project ID of your source dataset.
To overwrite both the data and schema of the destination tables with the source tables, select the Overwrite destination tables checkbox. Both the source and destination tables must have the same partitioning schema.
In the Service Account menu, select a service account from the service accounts associated with your Google Cloud project. You can associate a service account with your transfer instead of using your user credentials. For more information about using service accounts with data transfers, see Use service accounts.

- If you signed in with a federated identity, then a service account is required to create a transfer. If you signed in with a Google Account, then a service account for the transfer is optional.
- The service account must have the required roles.
Optional: In the Notification options section, do the following:
- To enable email notifications, click the toggle. When you enable this option, the owner of the transfer configuration receives an email notification when a transfer run fails.
- To enable Pub/Sub notifications, click the toggle, and then either select a topic name from the list or click Create a topic. This option configures Pub/Sub run notifications for your transfer.
Click Save.
bq
Enable the BigQuery Data Transfer Service for your destination dataset.
Ensure that you have the required roles.
To create a BigQuery dataset, use the bq mk command with the dataset creation flag --dataset and the --location flag:

bq mk \
  --dataset \
  --location=LOCATION \
  PROJECT:DATASET
Replace the following:
- LOCATION: the location where you want to copy the dataset
- PROJECT: the project ID of your target dataset
- DATASET: the name of the target dataset
To copy a dataset, use the bq mk command with the transfer creation flag --transfer_config and the --data_source flag. You must set the --data_source flag to cross_region_copy. For a complete list of valid values for the --data_source flag, see the transfer-config flags in the bq command-line tool reference.

bq mk \
  --transfer_config \
  --project_id=PROJECT \
  --data_source=cross_region_copy \
  --target_dataset=DATASET \
  --display_name=NAME \
  --service_account_name=SERVICE_ACCOUNT \
  --params='PARAMETERS'
Replace the following:
- NAME: the display name for the copy job or the transfer configuration
- SERVICE_ACCOUNT: the service account name used to authenticate your transfer. The service account should be owned by the same project_id used to create the transfer and it should have all of the required permissions.
- PARAMETERS: the parameters for the transfer configuration in JSON format

Parameters for a dataset copy configuration include the following:
- source_dataset_id: the ID of the source dataset that you want to copy
- source_project_id: the ID of the project that your source dataset is in
- overwrite_destination_table: an optional flag that lets you truncate the tables of a previous copy and refresh all the data
Both the source and destination tables must have the same partitioning schema.
The following examples show the formatting of the parameters, based on your system's environment:
Linux: use single quotes to enclose the JSON string. For example:

'{"source_dataset_id":"mydataset","source_project_id":"mysourceproject","overwrite_destination_table":"true"}'

Windows command line: use double quotes to enclose the JSON string, and escape double quotes in the string with a backslash. For example:

"{\"source_dataset_id\":\"mydataset\",\"source_project_id\":\"mysourceproject\",\"overwrite_destination_table\":\"true\"}"

PowerShell: use single quotes to enclose the JSON string, and escape double quotes in the string with a backslash. For example:

'{\"source_dataset_id\":\"mydataset\",\"source_project_id\":\"mysourceproject\",\"overwrite_destination_table\":\"true\"}'
For example, the following command creates a dataset copy configuration that's named My Transfer with a target dataset that's named mydataset and a project with the ID of myproject:

bq mk \
  --transfer_config \
  --project_id=myproject \
  --data_source=cross_region_copy \
  --target_dataset=mydataset \
  --display_name='My Transfer' \
  --params='{"source_dataset_id":"123_demo_eu","source_project_id":"mysourceproject","overwrite_destination_table":"true"}'
API
Enable the BigQuery Data Transfer Service for your destination dataset.
Ensure that you have the required roles.
To create a BigQuery dataset, call the datasets.insert method with a defined dataset resource.

To copy a dataset, use the projects.locations.transferConfigs.create method and supply an instance of the TransferConfig resource.
Java
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
import com.google.api.gax.rpc.ApiException;
import com.google.cloud.bigquery.datatransfer.v1.CreateTransferConfigRequest;
import com.google.cloud.bigquery.datatransfer.v1.DataTransferServiceClient;
import com.google.cloud.bigquery.datatransfer.v1.ProjectName;
import com.google.cloud.bigquery.datatransfer.v1.TransferConfig;
import com.google.protobuf.Struct;
import com.google.protobuf.Value;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Sample to copy dataset from another gcp project
public class CopyDataset {

  public static void main(String[] args) throws IOException {
    // TODO(developer): Replace these variables before running the sample.
    final String destinationProjectId = "MY_DESTINATION_PROJECT_ID";
    final String destinationDatasetId = "MY_DESTINATION_DATASET_ID";
    final String sourceProjectId = "MY_SOURCE_PROJECT_ID";
    final String sourceDatasetId = "MY_SOURCE_DATASET_ID";
    Map<String, Value> params = new HashMap<>();
    params.put("source_project_id", Value.newBuilder().setStringValue(sourceProjectId).build());
    params.put("source_dataset_id", Value.newBuilder().setStringValue(sourceDatasetId).build());
    TransferConfig transferConfig =
        TransferConfig.newBuilder()
            .setDestinationDatasetId(destinationDatasetId)
            .setDisplayName("Your Dataset Copy Name")
            .setDataSourceId("cross_region_copy")
            .setParams(Struct.newBuilder().putAllFields(params).build())
            .setSchedule("every 24 hours")
            .build();
    copyDataset(destinationProjectId, transferConfig);
  }

  public static void copyDataset(String projectId, TransferConfig transferConfig)
      throws IOException {
    try (DataTransferServiceClient dataTransferServiceClient = DataTransferServiceClient.create()) {
      ProjectName parent = ProjectName.of(projectId);
      CreateTransferConfigRequest request =
          CreateTransferConfigRequest.newBuilder()
              .setParent(parent.toString())
              .setTransferConfig(transferConfig)
              .build();
      TransferConfig config = dataTransferServiceClient.createTransferConfig(request);
      System.out.println("Copy dataset created successfully :" + config.getName());
    } catch (ApiException ex) {
      System.out.print("Copy dataset was not created." + ex.toString());
    }
  }
}

Python
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

Install the Python client for the BigQuery Data Transfer API with pip install google-cloud-bigquery-datatransfer. Then create a transfer configuration to copy the dataset.

from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

destination_project_id = "my-destination-project"
destination_dataset_id = "my_destination_dataset"
source_project_id = "my-source-project"
source_dataset_id = "my_source_dataset"
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id=destination_dataset_id,
    display_name="Your Dataset Copy Name",
    data_source_id="cross_region_copy",
    params={
        "source_project_id": source_project_id,
        "source_dataset_id": source_dataset_id,
    },
    schedule="every 24 hours",
)

transfer_config = transfer_client.create_transfer_config(
    parent=transfer_client.common_project_path(destination_project_id),
    transfer_config=transfer_config,
)

print(f"Created transfer config: {transfer_config.name}")
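The console isn't the only place to configure notifications for a scheduled copy. As a variation on the preceding sample, the following sketch sets Pub/Sub and email notifications on the transfer configuration; the topic name, project, and dataset IDs are placeholders, and it assumes the notification_pubsub_topic and email_preferences fields of the bigquery_datatransfer client library:

from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_destination_dataset",
    display_name="Nightly dataset copy",
    data_source_id="cross_region_copy",
    params={
        "source_project_id": "my-source-project",
        "source_dataset_id": "my_source_dataset",
    },
    schedule="every 24 hours",
    # Publish a message to this topic on each run (placeholder topic name).
    notification_pubsub_topic="projects/my-destination-project/topics/my-transfer-topic",
    # Email the transfer owner when a run fails.
    email_preferences=bigquery_datatransfer.EmailPreferences(enable_failure_email=True),
)

transfer_config = transfer_client.create_transfer_config(
    parent=transfer_client.common_project_path("my-destination-project"),
    transfer_config=transfer_config,
)
print(f"Created transfer config: {transfer_config.name}")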
To avoid additional storage costs, consider deleting the prior dataset.
View dataset copy jobs
To see the status and view details of a dataset copy job in theGoogle Cloud console, do the following:
In the Google Cloud console, go to the Data transfers page.

Select a transfer for which you want to view the transfer details, and then do the following:

On the Transfer details page, select a transfer run.

To refresh, click Refresh.
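You can also inspect runs programmatically. As a rough sketch, the following example lists a configuration's runs with the Python client for the BigQuery Data Transfer Service; the configuration resource name is a placeholder:

from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholder: the full resource name of your dataset copy configuration,
# for example as printed when the transfer configuration was created.
config_name = "projects/0000000000000/locations/us/transferConfigs/my-config-id"

# List the runs recorded for this configuration and print their status.
for run in transfer_client.list_transfer_runs(parent=config_name):
    print(run.name, run.state.name, run.run_time)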
Recreate datasets in another location
To manually move a dataset from one location to another, follow these steps:
Export the data from your BigQuery tables to a Cloud Storage bucket.
There are no charges for exporting data from BigQuery, but you do incur charges for storing the exported data in Cloud Storage. BigQuery exports are subject to the limits on extract jobs.
Copy or move the data from your export Cloud Storage bucket to a new bucket you created in the destination location. For example, if you are moving your data from the US multi-region to the asia-northeast1 Tokyo region, you would transfer the data to a bucket that you created in Tokyo. For information about transferring Cloud Storage objects, see Copy, rename, and move objects in the Cloud Storage documentation. Transferring data between regions incurs network egress charges in Cloud Storage.
Create a new BigQuery dataset in the new location, and then load your data from the Cloud Storage bucket into the new dataset.
You are not charged for loading the data into BigQuery, but you will incur charges for storing the data in Cloud Storage until you delete the data or the bucket. You are also charged for storing the data in BigQuery after it is loaded. Loading data into BigQuery is subject to the load jobs limits.
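To sketch what the export and load steps look like in code, the following Python example uses the google-cloud-bigquery client; the bucket names, table IDs, and locations are placeholders, and the bucket-to-bucket transfer in the middle still happens in Cloud Storage:

from google.cloud import bigquery

client = bigquery.Client()

# Step 1: Export a table to a bucket in the same location as the source dataset.
extract_job = client.extract_table(
    "mysourceproject.mydataset.mytable",
    "gs://my-us-bucket/mytable-*.avro",
    job_config=bigquery.ExtractJobConfig(destination_format="AVRO"),
    location="US",  # Location of the source dataset.
)
extract_job.result()  # Wait for the export to finish.

# Step 3: After the files are transferred to a bucket in the destination
# location, load them into a dataset created in that location.
load_job = client.load_table_from_uri(
    "gs://my-tokyo-bucket/mytable-*.avro",
    "mysourceproject.mynewdataset.mytable",
    job_config=bigquery.LoadJobConfig(source_format="AVRO"),
    location="asia-northeast1",  # Location of the destination dataset.
)
load_job.result()  # Wait for the load to finish.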
You can also use Cloud Composer to move and copy large datasets programmatically.
For more information about using Cloud Storage to store and move large datasets, see Use Cloud Storage with big data.
Secure datasets
To control access to datasets in BigQuery, see Controlling access to datasets. For information about data encryption, see Encryption at rest.
Delete datasets
When you delete a dataset by using the Google Cloud console, tables and views in the dataset, including their data, are deleted. When you delete a dataset by using the bq command-line tool, you must use the -r flag to delete the tables and views.
Deleting a dataset creates one audit log entry for the dataset deletion. It doesn't create separate log entries for each deleted table within the dataset.
To delete a dataset, select one of the following options:
Console
Go to the BigQuery page.
In the left pane, click Explorer.
In the Explorer pane, expand your project, click Datasets, and then click the dataset.
In the details pane, click Delete.
In the Delete dataset dialog, type delete into the field, and then click Delete.
SQL
To delete a dataset, use the DROP SCHEMA DDL statement.
The following example deletes a dataset named mydataset:
In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
DROP SCHEMA IF EXISTS mydataset;
By default, this only works to delete an empty dataset. To delete a dataset and all of its contents, use the CASCADE keyword:

DROP SCHEMA IF EXISTS mydataset CASCADE;
Click Run.
For more information about how to run queries, see Run an interactive query.
bq
Use the bq rm command with the --dataset or -d flag, which is optional. If your dataset contains tables, you must use the -r flag to remove all tables in the dataset. If you use the -r flag, then you can omit the --dataset or -d flag.
After you run the command, the system asks for confirmation. You can use the -f flag to skip the confirmation.
If you are deleting a dataset in a project other than your default project, add the project ID to the dataset name in the following format: PROJECT_ID:DATASET.
bq rm -r -f -d PROJECT_ID:DATASET
Replace the following:
- PROJECT_ID: your project ID
- DATASET: the name of the dataset that you're deleting
Examples:
Enter the following command to remove a dataset that's named mydataset and all the tables in it from your default project. The command uses the -d flag.
bq rm -r -d mydataset
When prompted, type y and press Enter.
Enter the following command to remove mydataset and all the tables in it from myotherproject. The command does not use the optional -d flag. The -f flag is used to skip confirmation.
bq rm -r -f myotherproject:mydataset
You can use the bq ls command to confirm that the dataset was deleted.
API
Call the datasets.delete method to delete the dataset, and set the deleteContents parameter to true to delete the tables in it.
C#
The following code sample deletes an empty dataset.
Before trying this sample, follow the C# setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery C# API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
using Google.Cloud.BigQuery.V2;
using System;

public class BigQueryDeleteDataset
{
    public void DeleteDataset(
        string projectId = "your-project-id",
        string datasetId = "your_empty_dataset"
    )
    {
        BigQueryClient client = BigQueryClient.Create(projectId);
        // Delete a dataset that does not contain any tables
        client.DeleteDataset(datasetId: datasetId);
        Console.WriteLine($"Dataset {datasetId} deleted.");
    }
}

The following code sample deletes a dataset and all of its contents:
using Google.Cloud.BigQuery.V2;
using System;

public class BigQueryDeleteDatasetAndContents
{
    public void DeleteDatasetAndContents(
        string projectId = "your-project-id",
        string datasetId = "your_dataset_with_tables"
    )
    {
        BigQueryClient client = BigQueryClient.Create(projectId);
        // Use the DeleteDatasetOptions to delete a dataset and its contents
        client.DeleteDataset(
            datasetId: datasetId,
            options: new DeleteDatasetOptions() { DeleteContents = true }
        );
        Console.WriteLine($"Dataset {datasetId} and contents deleted.");
    }
}

Go
Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// deleteDataset demonstrates the deletion of an empty dataset.
func deleteDataset(projectID, datasetID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	// To recursively delete a dataset and contents, use DeleteWithContents.
	if err := client.Dataset(datasetID).Delete(ctx); err != nil {
		return fmt.Errorf("Delete: %v", err)
	}
	return nil
}

Java
The following code sample deletes an empty dataset.
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQuery.DatasetDeleteOption;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.DatasetId;

public class DeleteDataset {

  public static void runDeleteDataset() {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    deleteDataset(projectId, datasetName);
  }

  public static void deleteDataset(String projectId, String datasetName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
      DatasetId datasetId = DatasetId.of(projectId, datasetName);
      boolean success = bigquery.delete(datasetId, DatasetDeleteOption.deleteContents());
      if (success) {
        System.out.println("Dataset deleted successfully");
      } else {
        System.out.println("Dataset was not found");
      }
    } catch (BigQueryException e) {
      System.out.println("Dataset was not deleted. \n" + e.toString());
    }
  }
}

The following code sample deletes a dataset and all of its contents:
package com.example.bigquery;

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.DatasetId;

// Sample to delete dataset with contents.
public class DeleteDatasetAndContents {

  public static void main(String[] args) {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "MY_PROJECT_ID";
    String datasetName = "MY_DATASET_NAME";
    deleteDatasetAndContents(projectId, datasetName);
  }

  public static void deleteDatasetAndContents(String projectId, String datasetName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
      DatasetId datasetId = DatasetId.of(projectId, datasetName);
      // Use the force parameter to delete a dataset and its contents
      boolean success = bigquery.delete(datasetId, BigQuery.DatasetDeleteOption.deleteContents());
      if (success) {
        System.out.println("Dataset deleted with contents successfully");
      } else {
        System.out.println("Dataset was not found");
      }
    } catch (BigQueryException e) {
      System.out.println("Dataset was not deleted with contents. \n" + e.toString());
    }
  }
}

Node.js
Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function deleteDataset() {
  // Deletes a dataset named "my_dataset".

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // const datasetId = 'my_dataset';

  // Create a reference to the existing dataset
  const dataset = bigquery.dataset(datasetId);

  // Delete the dataset and its contents
  await dataset.delete({force: true});
  console.log(`Dataset ${dataset.id} deleted.`);
}

PHP
Before trying this sample, follow the PHP setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery PHP API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
use Google\Cloud\BigQuery\BigQueryClient;

/** Uncomment and populate these variables in your code */
// $projectId = 'The Google project ID';
// $datasetId = 'The BigQuery dataset ID';

$bigQuery = new BigQueryClient([
    'projectId' => $projectId,
]);
$dataset = $bigQuery->dataset($datasetId);
$dataset->delete();
printf('Deleted dataset %s' . PHP_EOL, $datasetId);

Python
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set dataset_id to the ID of the dataset to delete.
# dataset_id = 'your-project.your_dataset'

# Use the delete_contents parameter to delete a dataset and its contents.
# Use the not_found_ok parameter to not receive an error if the dataset has already been deleted.
client.delete_dataset(
    dataset_id, delete_contents=True, not_found_ok=True
)  # Make an API request.

print("Deleted dataset '{}'.".format(dataset_id))

Ruby
The following code sample deletes an empty dataset.
Before trying this sample, follow the Ruby setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Ruby API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
require "google/cloud/bigquery"

def delete_dataset dataset_id = "my_empty_dataset"
  bigquery = Google::Cloud::Bigquery.new

  # Delete a dataset that does not contain any tables
  dataset = bigquery.dataset dataset_id
  dataset.delete

  puts "Dataset #{dataset_id} deleted."
end

The following code sample deletes a dataset and all of its contents:
require "google/cloud/bigquery"

def delete_dataset_and_contents
  dataset_id = "my_dataset_with_tables"
  bigquery = Google::Cloud::Bigquery.new

  # Use the force parameter to delete a dataset and its contents
  dataset = bigquery.dataset dataset_id
  dataset.delete force: true

  puts "Dataset #{dataset_id} and contents deleted."
end

Restore tables from deleted datasets
You can restore tables from a deleted dataset that are within the dataset's time travel window. To restore the entire dataset, see Restore deleted datasets.
- Create a dataset with the same name and in the same location as the original.
- Choose a timestamp from before the original dataset was deleted by using a format of milliseconds since the epoch, for example, 1418864998000.
- Copy the originaldataset.table1 table at the time 1418864998000 into the new dataset:

bq cp originaldataset.table1@1418864998000 mydataset.mytable
To find the names of the nonempty tables that were in the deleted dataset, query the dataset's INFORMATION_SCHEMA.TABLE_STORAGE view within the time travel window.
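The @ decorator in the bq cp command takes milliseconds since the Unix epoch. If you have a wall-clock time instead, a short conversion such as the following Python sketch produces the value; the timestamp and table names are illustrative:

import datetime

# Illustrative restore point: a UTC time known to be before the dataset was deleted.
restore_point = datetime.datetime(2014, 12, 18, 1, 9, 58, tzinfo=datetime.timezone.utc)

# Convert to milliseconds since the Unix epoch, the format the @ decorator expects.
millis = int(restore_point.timestamp() * 1000)  # 1418864998000

# Build the bq cp command for the snapshot copy.
print(f"bq cp originaldataset.table1@{millis} mydataset.mytable")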
Restore deleted datasets
To learn how to restore (or undelete) a deleted dataset, see Restore deleted datasets.
Quotas
For information about copy quotas, see Copy jobs. Usage for copy jobs is available in INFORMATION_SCHEMA. To learn how to query the INFORMATION_SCHEMA.JOBS view, see JOBS view.
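As a rough way to review that usage, you can query the JOBS view with the Python client; the region qualifier and the seven-day window below are assumptions to adjust for your own location and needs:

from google.cloud import bigquery

# A minimal sketch: list recent copy jobs in the US region.
client = bigquery.Client()

query = """
    SELECT job_id, creation_time, state
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE job_type = 'COPY'
      AND creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
    ORDER BY creation_time DESC
"""
for row in client.query(query).result():
    print(row.job_id, row.creation_time, row.state)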
Pricing
For pricing information for copying datasets, see Data replication pricing.
BigQuery sends compressed data for copying across regions, so the data that is billed might be less than the actual size of your dataset. For more information, see BigQuery pricing.
What's next
- Learn how to create datasets.
- Learn how to update datasets.