Update dataset properties
This document describes how to update dataset properties in BigQuery. After you create a dataset, you can update the following dataset properties:
- Billing model
- Default expiration time for new tables
- Default partition expiration for new partitioned tables
- Default rounding mode for new tables
- Description
- Labels
- Time travel windows
Before you begin
Grant Identity and Access Management (IAM) roles that give users the necessary permissions to perform each task in this document.
Required permissions
To update dataset properties, you need the following IAM permissions:
- bigquery.datasets.update
- bigquery.datasets.setIamPolicy (only required when updating dataset access controls in the Google Cloud console)
The roles/bigquery.dataOwner predefined IAM role includes the permissions that you need to update dataset properties.

Additionally, if you have the bigquery.datasets.create permission, you can update properties of the datasets that you create.

For more information on IAM roles and permissions in BigQuery, see Predefined roles and permissions.
Update dataset descriptions
You can update a dataset's description in the following ways:
- Using the Google Cloud console.
- Using the bq command-line tool's
bq updatecommand. - Calling the
datasets.patchAPI method. - Using the client libraries.
To update a dataset's description:
Console
In the left pane, click Explorer:

If you don't see the left pane, click Expand left pane to open the pane.
In the Explorer pane, expand your project, click Datasets, and then click a dataset.

In the Details pane, click Edit details to edit the description text.

In the Edit detail dialog that appears, do the following:

- In the Description field, enter a description or edit the existing description.
- To save the new description text, click Save.
SQL
To update a dataset's description, use the ALTER SCHEMA SET OPTIONS statement to set the description option.

The following example sets the description on a dataset named mydataset:

In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
```sql
ALTER SCHEMA mydataset
SET OPTIONS (
    description = 'Description of mydataset');
```
Click Run.

For more information about how to run queries, see Run an interactive query.
bq
Issue the bq update command with the --description flag. If you are updating a dataset in a project other than your default project, add the project ID to the dataset name in the following format: project_id:dataset.
```shell
bq update \
    --description "string" \
    project_id:dataset
```
Replace the following:
- string: the text that describes the dataset, in quotes
- project_id: your project ID
- dataset: the name of the dataset that you're updating
Examples:
Enter the following command to change the description of mydataset to "Description of mydataset." mydataset is in your default project.

```shell
bq update --description "Description of mydataset" mydataset
```

Enter the following command to change the description of mydataset to "Description of mydataset." The dataset is in myotherproject, not your default project.

```shell
bq update \
    --description "Description of mydataset" \
    myotherproject:mydataset
```

API
Call datasets.patch and update the description property in the dataset resource. Because the datasets.update method replaces the entire dataset resource, the datasets.patch method is preferred.
Go
Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```go
import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// updateDatasetDescription demonstrates how the Description metadata of a dataset can
// be read and modified.
func updateDatasetDescription(projectID, datasetID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	ds := client.Dataset(datasetID)
	meta, err := ds.Metadata(ctx)
	if err != nil {
		return err
	}
	update := bigquery.DatasetMetadataToUpdate{
		Description: "Updated Description.",
	}
	if _, err = ds.Update(ctx, update, meta.ETag); err != nil {
		return err
	}
	return nil
}
```
Java
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;

public class UpdateDatasetDescription {

  public static void runUpdateDatasetDescription() {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    String newDescription = "this is the new dataset description";
    updateDatasetDescription(datasetName, newDescription);
  }

  public static void updateDatasetDescription(String datasetName, String newDescription) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      Dataset dataset = bigquery.getDataset(datasetName);
      bigquery.update(dataset.toBuilder().setDescription(newDescription).build());
      System.out.println("Dataset description updated successfully to " + newDescription);
    } catch (BigQueryException e) {
      System.out.println("Dataset description was not updated \n" + e.toString());
    }
  }
}
```
Node.js
Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```javascript
// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function updateDatasetDescription() {
  // Updates a dataset's description.

  // TODO(developer): Set datasetId before running the sample.
  // const datasetId = 'my_dataset';

  // Retrieve current dataset metadata
  const dataset = bigquery.dataset(datasetId);
  const [metadata] = await dataset.getMetadata();

  // Set new dataset description
  const description = 'New dataset description.';
  metadata.description = description;

  const [apiResponse] = await dataset.setMetadata(metadata);
  const newDescription = apiResponse.description;

  console.log(`${datasetId} description: ${newDescription}`);
}
```
Python
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```python
from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set dataset_id to the ID of the dataset to fetch.
# dataset_id = 'your-project.your_dataset'

dataset = client.get_dataset(dataset_id)  # Make an API request.
dataset.description = "Updated description."
dataset = client.update_dataset(dataset, ["description"])  # Make an API request.

full_dataset_id = "{}.{}".format(dataset.project, dataset.dataset_id)
print(
    "Updated dataset '{}' with description '{}'.".format(
        full_dataset_id, dataset.description
    )
)
```
Update default table expiration times
You can update a dataset's default table expiration time in the following ways:
- Using the Google Cloud console.
- Using the bq command-line tool's bq update command.
- Calling the datasets.patch API method.
- Using the client libraries.
You can set a default table expiration time at the dataset level, or you can set a table's expiration time when the table is created. If you set the expiration when the table is created, the dataset's default table expiration is ignored. If you don't set a default table expiration at the dataset level, and you don't set a table expiration when the table is created, the table never expires and you must delete the table manually. When a table expires, it's deleted along with all of the data it contains.
When you update a dataset's default table expiration setting:
- If you change the value from Never to a defined expiration time, any tables that already exist in the dataset won't expire unless the expiration time was set on the table when it was created.
- If you are changing the value for the default table expiration, any tables that already exist expire according to the original table expiration setting. Any new tables created in the dataset have the new table expiration setting applied unless you specify a different table expiration on the table when it is created.
The value for default table expiration is expressed differently depending on where the value is set. Use the method that gives you the appropriate level of granularity:
- In the Google Cloud console, expiration is expressed in days.
- In the bq command-line tool, expiration is expressed in seconds.
- In the API, expiration is expressed in milliseconds.
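When you move between interfaces, the same expiration value has to be converted between these units. The following Python sketch (an illustration only; the helper names are not part of any BigQuery tooling) shows the conversions:

```python
# The console takes days, the bq tool takes seconds, and the API takes
# milliseconds; these helpers convert one expiration value between units.

def days_to_seconds(days: float) -> int:
    """Console value (days) -> bq command-line value (seconds)."""
    return int(days * 24 * 60 * 60)

def seconds_to_ms(seconds: int) -> int:
    """bq value (seconds) -> API value (defaultTableExpirationMs)."""
    return seconds * 1000

# A two-hour default table expiration in each representation:
seconds = days_to_seconds(2 / 24)   # 7200, for --default_table_expiration
ms = seconds_to_ms(seconds)         # 7200000, for defaultTableExpirationMs
```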
To update the default expiration time for a dataset:
Console
In the left pane, click Explorer:

In the Explorer pane, expand your project, click Datasets, and then click a dataset.

In the Details tab, click Edit details to edit the expiration time.

In the Edit detail dialog, in the Default table expiration section, select Enable table expiration and enter a value for Default maximum table age.

Click Save.
SQL
To update the default table expiration time, use the ALTER SCHEMA SET OPTIONS statement to set the default_table_expiration_days option.

The following example updates the default table expiration for a dataset named mydataset.

In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
```sql
ALTER SCHEMA mydataset
SET OPTIONS (
    default_table_expiration_days = 3.75);
```
Click Run.

For more information about how to run queries, see Run an interactive query.
bq
To update the default expiration time for newly created tables in a dataset, enter the bq update command with the --default_table_expiration flag. If you are updating a dataset in a project other than your default project, add the project ID to the dataset name in the following format: project_id:dataset.
```shell
bq update \
    --default_table_expiration integer \
    project_id:dataset
```
Replace the following:
- integer: the default lifetime, in seconds, for newly created tables. The minimum value is 3600 seconds (one hour). The expiration time evaluates to the current UTC time plus the integer value. Specify 0 to remove the existing expiration time. Any table created in the dataset is deleted integer seconds after its creation time. This value is applied if you do not set a table expiration when the table is created.
- project_id: your project ID.
- dataset: the name of the dataset that you're updating.
Examples:
Enter the following command to set the default table expiration for new tables created in mydataset to two hours (7200 seconds) from the current time. The dataset is in your default project.

```shell
bq update --default_table_expiration 7200 mydataset
```

Enter the following command to set the default table expiration for new tables created in mydataset to two hours (7200 seconds) from the current time. The dataset is in myotherproject, not your default project.

```shell
bq update --default_table_expiration 7200 myotherproject:mydataset
```

API
Call datasets.patch and update the defaultTableExpirationMs property in the dataset resource. The expiration is expressed in milliseconds in the API. Because the datasets.update method replaces the entire dataset resource, the datasets.patch method is preferred.
Go
Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```go
import (
	"context"
	"fmt"
	"time"

	"cloud.google.com/go/bigquery"
)

// updateDatasetDefaultExpiration demonstrates setting the default expiration of a dataset
// to a specific retention period.
func updateDatasetDefaultExpiration(projectID, datasetID string) error {
	// projectID := "my-project-id"
	// datasetID := "mydataset"
	ctx := context.Background()
	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	ds := client.Dataset(datasetID)
	meta, err := ds.Metadata(ctx)
	if err != nil {
		return err
	}
	update := bigquery.DatasetMetadataToUpdate{
		DefaultTableExpiration: 24 * time.Hour,
	}
	if _, err := client.Dataset(datasetID).Update(ctx, update, meta.ETag); err != nil {
		return err
	}
	return nil
}
```
Java
Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

Configure the default expiration time with the Dataset.Builder.setDefaultTableLifetime() method.

```java
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Dataset;
import java.util.concurrent.TimeUnit;

public class UpdateDatasetExpiration {

  public static void runUpdateDatasetExpiration() {
    // TODO(developer): Replace these variables before running the sample.
    String datasetName = "MY_DATASET_NAME";
    updateDatasetExpiration(datasetName);
  }

  public static void updateDatasetExpiration(String datasetName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Update dataset expiration to one day
      Long newExpiration = TimeUnit.MILLISECONDS.convert(1, TimeUnit.DAYS);
      Dataset dataset = bigquery.getDataset(datasetName);
      bigquery.update(dataset.toBuilder().setDefaultTableLifetime(newExpiration).build());
      System.out.println("Dataset expiration updated successfully to " + newExpiration);
    } catch (BigQueryException e) {
      System.out.println("Dataset expiration was not updated \n" + e.toString());
    }
  }
}
```

Node.js
Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```javascript
// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function updateDatasetExpiration() {
  // Updates the lifetime of all tables in the dataset, in milliseconds.

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // const datasetId = "my_dataset";

  // Retrieve current dataset metadata
  const dataset = bigquery.dataset(datasetId);
  const [metadata] = await dataset.getMetadata();

  // Set new dataset metadata
  const expirationTime = 24 * 60 * 60 * 1000;
  metadata.defaultTableExpirationMs = expirationTime.toString();

  const [apiResponse] = await dataset.setMetadata(metadata);
  const newExpirationTime = apiResponse.defaultTableExpirationMs;

  console.log(`${datasetId} expiration: ${newExpirationTime}`);
}
```
Python
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation. To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

```python
from google.cloud import bigquery

# Construct a BigQuery client object.
client = bigquery.Client()

# TODO(developer): Set dataset_id to the ID of the dataset to fetch.
# dataset_id = 'your-project.your_dataset'

dataset = client.get_dataset(dataset_id)  # Make an API request.
dataset.default_table_expiration_ms = 24 * 60 * 60 * 1000  # In milliseconds.

dataset = client.update_dataset(
    dataset, ["default_table_expiration_ms"]
)  # Make an API request.

full_dataset_id = "{}.{}".format(dataset.project, dataset.dataset_id)
print(
    "Updated dataset {} with new expiration {}".format(
        full_dataset_id, dataset.default_table_expiration_ms
    )
)
```
Update default partition expiration times
You can update a dataset's default partition expiration in the following ways:
- Using the bq command-line tool's bq update command.
- Calling the datasets.patch API method.
- Using the client libraries.
Setting or updating a dataset's default partition expiration isn't supported by the Google Cloud console.
You can set a default partition expiration time at the dataset level that affects all newly created partitioned tables, or you can set a partition expiration time for individual tables when the partitioned tables are created. If you set the default partition expiration at the dataset level, and you set the default table expiration at the dataset level, new partitioned tables will only have a partition expiration. If both options are set, the default partition expiration overrides the default table expiration.
If you set the partition expiration time when the partitioned table is created, that value overrides the dataset-level default partition expiration if it exists.
If you do not set a default partition expiration at the dataset level, and you do not set a partition expiration when the table is created, the partitions never expire and you must delete the partitions manually.
When you set a default partition expiration on a dataset, the expiration applies to all partitions in all partitioned tables created in the dataset. When you set the partition expiration on a table, the expiration applies to all partitions created in the specified table. You cannot apply different expiration times to different partitions in the same table.
When you update a dataset's default partition expiration setting:
- If you change the value from never to a defined expiration time, any partitions that already exist in partitioned tables in the dataset will not expire unless the partition expiration time was set on the table when it was created.
- If you are changing the value for the default partition expiration, any partitions in existing partitioned tables expire according to the original default partition expiration. Any new partitioned tables created in the dataset have the new default partition expiration setting applied unless you specify a different partition expiration on the table when it is created.
The value for default partition expiration is expressed differently depending on where the value is set. Use the method that gives you the appropriate level of granularity:
- In the bq command-line tool, expiration is expressed in seconds.
- In the API, expiration is expressed in milliseconds.
To update the default partition expiration time for a dataset:
Console
Updating a dataset's default partition expiration is not supported by the Google Cloud console.
SQL
To update the default partition expiration time, use the ALTER SCHEMA SET OPTIONS statement to set the default_partition_expiration_days option.

The following example updates the default partition expiration for a dataset named mydataset:

In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
```sql
ALTER SCHEMA mydataset
SET OPTIONS (
    default_partition_expiration_days = 3.75);
```
Click Run.

For more information about how to run queries, see Run an interactive query.
bq
To update the default expiration time for a dataset, enter the bq update command with the --default_partition_expiration flag. If you are updating a dataset in a project other than your default project, add the project ID to the dataset name in the following format: project_id:dataset.
```shell
bq update \
    --default_partition_expiration integer \
    project_id:dataset
```
Replace the following:
- integer: the default lifetime, in seconds, for partitions in newly created partitioned tables. This flag has no minimum value. Specify 0 to remove the existing expiration time. Any partitions in newly created partitioned tables are deleted integer seconds after the partition's UTC date. This value is applied if you do not set a partition expiration on the table when it is created.
- project_id: your project ID.
- dataset: the name of the dataset that you're updating.
Examples:
Enter the following command to set the default partition expiration for new partitioned tables created in mydataset to 26 hours (93,600 seconds). The dataset is in your default project.

```shell
bq update --default_partition_expiration 93600 mydataset
```

Enter the following command to set the default partition expiration for new partitioned tables created in mydataset to 26 hours (93,600 seconds). The dataset is in myotherproject, not your default project.

```shell
bq update --default_partition_expiration 93600 myotherproject:mydataset
```

API
Call datasets.patch and update the defaultPartitionExpirationMs property in the dataset resource. The expiration is expressed in milliseconds. Because the datasets.update method replaces the entire dataset resource, the datasets.patch method is preferred.
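The client libraries can patch this property as well, although this section doesn't include a sample for it. The following Python sketch mirrors the datasets.patch call described above; it assumes the Python client exposes the property as default_partition_expiration_ms, and the dataset ID is a placeholder:

```python
# Sketch (assumed property name): patch a dataset's default partition
# expiration with the Python client library.

PARTITION_EXPIRATION_MS = 26 * 60 * 60 * 1000  # 26 hours, in milliseconds.

def update_default_partition_expiration(dataset_id: str, expiration_ms: int):
    # Import deferred so the sketch can be read without the dependency.
    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = client.get_dataset(dataset_id)  # Make an API request.
    dataset.default_partition_expiration_ms = expiration_ms
    # Listing the field updates only that property, like datasets.patch.
    return client.update_dataset(dataset, ["default_partition_expiration_ms"])

# update_default_partition_expiration("your-project.your_dataset",
#                                     PARTITION_EXPIRATION_MS)
```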
Update rounding mode
You can update a dataset's default rounding mode by using the ALTER SCHEMA SET OPTIONS DDL statement. The following example updates the default rounding mode for mydataset to ROUND_HALF_EVEN.
```sql
ALTER SCHEMA mydataset
SET OPTIONS (
    default_rounding_mode = "ROUND_HALF_EVEN");
```
This sets the default rounding mode for new tables created in the dataset. It has no impact on new columns added to existing tables. Setting the default rounding mode on a table in the dataset overrides this option.
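ROUND_HALF_EVEN is banker's rounding: exact ties round toward the nearest even digit. Python's decimal module implements the same rule, so it can be used to preview the behavior locally (an illustration only, not a BigQuery call):

```python
from decimal import Decimal, ROUND_HALF_EVEN

def round_half_even(value: str, places: str = "0.1") -> Decimal:
    """Round a decimal string with exact ties going to the even digit."""
    return Decimal(value).quantize(Decimal(places), rounding=ROUND_HALF_EVEN)

# Both ties land on an even final digit:
print(round_half_even("2.25"))  # 2.2
print(round_half_even("2.35"))  # 2.4
```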
Update time travel windows
You can update a dataset's time travel window in the following ways:
- Using the Google Cloud console.
- Using the ALTER SCHEMA SET OPTIONS statement.
- Using the bq command-line tool's bq update command.
- Calling the datasets.patch or datasets.update API method. The update method replaces the entire dataset resource, whereas the patch method only replaces fields that are provided in the submitted dataset resource.
For more information on the time travel window, see Configure the time travel window.
To update the time travel window for a dataset:
Console
In the left pane, click Explorer:

In the Explorer pane, expand your project, click Datasets, and then click a dataset.

In the Details tab, click Edit details.

Expand Advanced options, then select the Time travel window to use.

Click Save.
SQL
Use the ALTER SCHEMA SET OPTIONS statement with the max_time_travel_hours option to specify the time travel window when altering a dataset. The max_time_travel_hours value must be an integer expressed in multiples of 24 (48, 72, 96, 120, 144, 168) between 48 (2 days) and 168 (7 days).

In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
```sql
ALTER SCHEMA DATASET_NAME
SET OPTIONS (
    max_time_travel_hours = HOURS);
```
Replace the following:
- DATASET_NAME: the name of the dataset that you're updating
- HOURS: the time travel window's duration in hours
Click Run.

For more information about how to run queries, see Run an interactive query.
bq
Use the bq update command with the --max_time_travel_hours flag to specify the time travel window when altering a dataset. The --max_time_travel_hours value must be an integer expressed in multiples of 24 (48, 72, 96, 120, 144, 168) between 48 (2 days) and 168 (7 days).

```shell
bq update \
    --dataset=true --max_time_travel_hours=HOURS \
    PROJECT_ID:DATASET_NAME
```

Replace the following:

- PROJECT_ID: your project ID
- DATASET_NAME: the name of the dataset that you're updating
- HOURS: the time travel window's duration in hours
API
Call the datasets.patch or datasets.update method with a defined dataset resource in which you have specified a value for the maxTimeTravelHours field. The maxTimeTravelHours value must be an integer expressed in multiples of 24 (48, 72, 96, 120, 144, 168) between 48 (2 days) and 168 (7 days).
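Whichever interface you use, the window accepts only six values. A small guard like the following hypothetical helper (not part of any client library) can validate the value before you send the request:

```python
# Allowed time travel windows: multiples of 24 hours from 48 to 168.
ALLOWED_TIME_TRAVEL_HOURS = range(48, 169, 24)  # 48, 72, 96, 120, 144, 168

def validate_time_travel_hours(hours: int) -> int:
    """Raise ValueError unless hours is a valid maxTimeTravelHours value."""
    if hours not in ALLOWED_TIME_TRAVEL_HOURS:
        raise ValueError(
            "maxTimeTravelHours must be one of "
            f"{list(ALLOWED_TIME_TRAVEL_HOURS)}, got {hours}"
        )
    return hours
```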
Update storage billing models
You can alter the storage billing model for a dataset. Set the storage_billing_model value to PHYSICAL to use physical bytes when calculating storage charges, or to LOGICAL to use logical bytes. LOGICAL is the default.
When you change a dataset's billing model, it takes 24 hours for the change to take effect.

Once you change a dataset's storage billing model, you must wait 14 days before you can change the storage billing model again.
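Taken together, the two notes above mean a change becomes effective after about a day and then locks the setting for two weeks. A local helper (an illustration, not an API call) can track the earliest date you may switch again:

```python
from datetime import date, timedelta

# After a storage billing model change, the model is locked for 14 days.
BILLING_MODEL_LOCK_DAYS = 14

def next_allowed_change(last_change: date) -> date:
    """Return the earliest date the billing model can be changed again."""
    return last_change + timedelta(days=BILLING_MODEL_LOCK_DAYS)

print(next_allowed_change(date(2024, 1, 1)))  # 2024-01-15
```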
Console
In the left pane, click Explorer:

In the Explorer pane, expand your project, click Datasets, and then click a dataset.

In the Details tab, click Edit details.

Expand Advanced options.

In the Storage billing model menu, select Physical to use physical storage billing, or select Logical to use logical storage billing. You can also select Storage_billing_model_unspecified.

Click Save.
SQL
To update the billing model for a dataset, use the ALTER SCHEMA SET OPTIONS statement and set the storage_billing_model option:

In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
```sql
ALTER SCHEMA DATASET_NAME
SET OPTIONS (
    storage_billing_model = 'BILLING_MODEL');
```
Replace the following:
- DATASET_NAME: the name of the dataset that you're changing
- BILLING_MODEL: the type of storage you want to use, either LOGICAL or PHYSICAL

Click Run.

For more information about how to run queries, see Run an interactive query.

To update the storage billing model for all datasets in a project, use the following SQL query in every region where datasets are located:
```sql
FOR record IN
 (SELECT CONCAT(catalog_name, '.', schema_name) AS dataset_path
  FROM PROJECT_ID.region-REGION.INFORMATION_SCHEMA.SCHEMATA)
DO
  EXECUTE IMMEDIATE
    "ALTER SCHEMA `" || record.dataset_path || "` SET OPTIONS(storage_billing_model = 'BILLING_MODEL')";
END FOR;
```
Replace the following:
- PROJECT_ID: your project ID
- REGION: a region qualifier
- BILLING_MODEL: the type of storage you want to use, either LOGICAL or PHYSICAL
bq
To update the billing model for a dataset, use the bq update command and set the --storage_billing_model flag:

```shell
bq update -d --storage_billing_model=BILLING_MODEL PROJECT_ID:DATASET_NAME
```

Replace the following:

- PROJECT_ID: your project ID
- DATASET_NAME: the name of the dataset that you're updating
- BILLING_MODEL: the type of storage you want to use, either LOGICAL or PHYSICAL
API
Call the datasets.update method with a defined dataset resource where the storageBillingModel field is set.

The following example shows how to call datasets.update using curl:

```shell
curl -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -L -X PUT \
  https://bigquery.googleapis.com/bigquery/v2/projects/PROJECT_ID/datasets/DATASET_ID \
  -d '{"datasetReference": {"projectId": "PROJECT_ID", "datasetId": "DATASET_NAME"}, "storageBillingModel": "BILLING_MODEL"}'
```
Replace the following:
- PROJECT_ID: your project ID
- DATASET_NAME: the name of the dataset that you're updating
- BILLING_MODEL: the type of storage you want to use, either LOGICAL or PHYSICAL
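The same update can also be issued from the Python client. This is a sketch under the assumption that recent google-cloud-bigquery versions expose a storage_billing_model dataset property; the dataset ID is a placeholder:

```python
# Sketch (assumed property name): set a dataset's storage billing model
# via the Python client library; the value is validated before any call.

VALID_BILLING_MODELS = {"LOGICAL", "PHYSICAL"}

def update_storage_billing_model(dataset_id: str, billing_model: str):
    if billing_model not in VALID_BILLING_MODELS:
        raise ValueError(f"billing_model must be one of {VALID_BILLING_MODELS}")

    # Import deferred so the sketch can be read without the dependency.
    from google.cloud import bigquery

    client = bigquery.Client()
    dataset = client.get_dataset(dataset_id)  # Make an API request.
    dataset.storage_billing_model = billing_model
    return client.update_dataset(dataset, ["storage_billing_model"])

# update_storage_billing_model("your-project.your_dataset", "PHYSICAL")
```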
Update access controls
To control access to datasets in BigQuery, see Controlling access to datasets. For information about data encryption, see Encryption at rest.
What's next
- For more information about creating datasets, see Creating datasets.
- For more information about managing datasets, see Managing datasets.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2026-02-18 UTC.