Configure log buckets

This document describes how to create and manage Cloud Logging buckets by using the Google Cloud console, the Google Cloud CLI, and the Logging API. It provides instructions for creating and managing log buckets at the Google Cloud project level. You can't create log buckets at the folder or organization level; however, Cloud Logging automatically creates _Default and _Required log buckets at the folder and organization level for you.

You can upgrade log buckets to use Log Analytics. Log Analytics lets you run SQL queries on your log data, helping you troubleshoot application, security, and networking issues.

To use BigQuery to analyze your log data, you have two choices:

  • Upgrade a log bucket to use Log Analytics and then create a linked BigQuery dataset. In this scenario, Logging stores your log data but BigQuery can read the log data.

  • Export your log entries to BigQuery. In this scenario, you must create a sink, BigQuery stores and manages the data, and you have the option to use partitioned tables.

When your log data is available to BigQuery, you can join your log data with other data stored in BigQuery, and you can access this data from other tools like Looker Studio and Looker.

For a conceptual overview of buckets, see Routing and storage overview: Log buckets.

This document doesn't describe how to create a log bucket that uses a customer-managed encryption key (CMEK). If you are interested in that topic, then see Configure CMEK for logs storage.

Before you begin

To get started with buckets, do the following:

  • Configure your Google Cloud project:
    1. Verify that billing is enabled for your Google Cloud project.

    2. To get the permissions that you need to create, upgrade, and link a log bucket, ask your administrator to grant you the Logs Configuration Writer (roles/logging.configWriter) IAM role on your project. For more information about granting roles, see Manage access to projects, folders, and organizations.

      You might also be able to get the required permissions through custom roles or other predefined roles.

      For the full list of permissions and roles, see Access control with IAM.

    3. Optional: To use BigQuery to view the data stored in a log bucket, do the following:
      1. Make sure that the BigQuery API is enabled. You can verify that the API is enabled by listing available services; an example command follows this list.
      2. Verify that your Identity and Access Management role includes the permissions that let you create a linked dataset. For more information, see Permissions for linked BigQuery datasets.
  • Understand the supported regions in which you can store your logs.
  • Select the tab for how you plan to use the samples on this page:

    Console

    When you use the Google Cloud console to access Google Cloud services and APIs, you don't need to set up authentication.

    gcloud

    In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

    At the bottom of the Google Cloud console, a Cloud Shell session starts and displays a command-line prompt. Cloud Shell is a shell environment with the Google Cloud CLI already installed and with values already set for your current project. It can take a few seconds for the session to initialize.

    REST

    To use the REST API samples on this page in a local development environment, you use the credentials you provide to the gcloud CLI.

      Install the Google Cloud CLI. After installation, initialize the Google Cloud CLI by running the following command:

      gcloud init

      If you're using an external identity provider (IdP), you must first sign in to the gcloud CLI with your federated identity.

    For more information, see Authenticate for using REST in the Google Cloud authentication documentation.

  • If you plan to use the Google Cloud CLI or Cloud Logging API to create or manage your log buckets, then understand the LogBucket formatting requirements.
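
As noted in the list above, one way to verify that the BigQuery API is enabled is to list the enabled services for your project. A minimal sketch, assuming the gcloud CLI is already configured for your project:

gcloud services list --enabled | grep bigquery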

Create a bucket

You can create a maximum of 100 buckets per Google Cloud project.

To create a user-defined log bucket for your Google Cloud project, do the following:

Google Cloud console

To create a log bucket in your Google Cloud project, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. Click Create log bucket.

  3. Enter a Name and Description for your bucket.

  4. Optional: Upgrade your bucket to use Log Analytics.

    Note: After a log bucket has been upgraded to use Log Analytics, you can't reconfigure the log bucket to change or remove the use of Log Analytics.

    1. Select Upgrade to use Log Analytics.

      When you upgrade a bucket to use Log Analytics, you can query your logs in the Log Analytics page by using SQL queries. You can also continue to view your logs by using the Logs Explorer.

    2. Optional: To view your logs in BigQuery, select Create a new BigQuery dataset that links to this bucket and enter a unique dataset name.

      When you select this option, BigQuery can read the data stored in your log bucket. You can then query your log data in the BigQuery interface, join it with other data, and access that data from other tools like Looker Studio and Looker.

  5. To select the storage region for your logs, click the Select log bucket region menu and select a region.

    Note: Log buckets in the global region or in multi-regions don't offer added resiliency compared to log buckets created in single regions like us-west1.

    Note: After you create your log bucket, you can't change your bucket's region. If you need a bucket in a different region, you must create a new bucket in that region, redirect the appropriate sinks to the new bucket, and then delete the old bucket.

  6. Optional: To set a custom retention period for the logs in the bucket, click Next.

    In the Retention period field, enter the number of days, between 1 day and 3650 days, that you want Cloud Logging to retain your logs. If you don't customize the retention period, the default is 30 days.

    You can also update your bucket to apply custom retention after you create it.

  7. Click Create bucket.

    After the log bucket is created, Logging upgrades the bucket and creates the dataset link, if those options were selected.

    It might take a moment for these steps to complete.

gcloud

Note: You can't create a linked BigQuery dataset when you create the log bucket unless you use the Google Cloud console. However, if you create and upgrade a log bucket, then you can use the gcloud logging links create command to create a linked dataset.

To create a log bucket only, run the gcloud logging buckets create command. If you want to upgrade the log bucket to use Log Analytics, then include the --enable-analytics and --async flags, and make sure that you set the variable LOCATION to a supported region:

gcloud logging buckets create BUCKET_ID --location=LOCATION --enable-analytics --async OPTIONAL_FLAGS

The --async flag makes the command asynchronous. An asynchronous method returns an Operation object, which contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.

If you don't want to upgrade the log bucket to use Log Analytics, then omit the --enable-analytics and --async flags.

For example, if you want to create a bucket with the BUCKET_ID my-bucket in the global region, your command would look like the following:

gcloud logging buckets create my-bucket --location global --description "My first bucket"

For example, to create a bucket with the BUCKET_ID my-upgraded-bucket in the global location, and then upgrade the log bucket to use Log Analytics, your command would look like the following:

gcloud logging buckets create my-upgraded-bucket --location=global \
    --description "My first upgraded bucket" \
    --enable-analytics --retention-days=45

Note: After you create your log bucket, you can't change your bucket's region. If you need a bucket in a different region, you must create a new bucket in that region, redirect the appropriate sinks to the new bucket, and then delete the old bucket.

REST

Note: You can't create a linked BigQuery dataset as part of the create operation unless you use the Google Cloud console. However, if you create and upgrade a log bucket, then you can use the projects.locations.buckets.links.create method to create a linked dataset.

To create a bucket, use the projects.locations.buckets.create or the projects.locations.buckets.createAsync method. Prepare the arguments to the method as follows:

  1. Set the parent parameter to be the resource in which to create the bucket: projects/PROJECT_ID/locations/LOCATION

    The variable LOCATION refers to the region in which you want your logs to be stored.

    For example, if you want to create a bucket for project my-project in the global region, your parent parameter would look like this: projects/my-project/locations/global

  2. Set the bucketId parameter; for example, my-bucket.

  3. Do one of the following:

    • To create the bucket, call the projects.locations.buckets.create method.
    • To create the bucket and also upgrade it to use Log Analytics, set the LogBucket.analyticsEnabled boolean to true in the request body, and then call the projects.locations.buckets.createAsync method.
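
For example, the following curl sketch calls the create method for a hypothetical bucket named my-bucket in the global region of project my-project; it assumes that the gcloud CLI is available to supply an access token:

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"description": "My first bucket", "retentionDays": 30}' \
    "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets?bucketId=my-bucket"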

After creating a bucket, create a sink to route log entries to your bucket, and configure log views to control who can access the logs in your new bucket and which logs are accessible to them. You can also update the bucket to configure custom retention and restricted fields.

Track volume of logs stored in log buckets

The Logs Storage page in the Google Cloud console tracks the volume of logs data stored in log buckets:

In the Google Cloud console, go to the Logs Storage page:

Go to Logs Storage

If you use the search bar to find this page, then select the result whose subheading is Logging.

The Logs Storage page displays a summary of statistics for your Google Cloud project:

The summary statistics report the amount of log data stored in log buckets for the selected project.

The following statistics are reported:

  • Current month ingestion: The amount of logs data that your Google Cloud project has stored in log buckets since the first day of the current calendar month.

  • Previous month ingestion: The amount of logs data that your Google Cloud project stored in log buckets in the previous calendar month.

  • Projected ingestion by EOM: The estimated amount of logs data that your Google Cloud project will store in log buckets by the end of the current calendar month, based on current usage.

  • Current month billable storage: The amount of logs data that has been retained for more than 30 days and is therefore billable.

The previous statistics don't include logs in the _Required bucket. The logs in that bucket can't be excluded or disabled.

The Log Router page in the Google Cloud console gives you tools that you can use to minimize any charges for storing logs in log buckets or for storage that exceeds your monthly allotment. You can do the following:

  • Disable logs from being stored at the bucket level.
  • Exclude certain log entries from being stored in log buckets.

For more information, see Manage sinks.

Manage buckets

This section describes how to manage your log buckets using the Google Cloud CLI or the Google Cloud console.

Update a bucket

To update the properties of your bucket, such as the description or retention period, do the following:

Google Cloud console

To update your bucket's properties, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. For the bucket you want to update, click More.

  3. Select Edit bucket.

  4. Edit your bucket as needed.

  5. Click Update bucket.

gcloud

To update your bucket's properties, run the gcloud logging buckets update command:

gcloud logging buckets update BUCKET_ID --location=LOCATION UPDATED_ATTRIBUTES

For example:

gcloud logging buckets update my-bucket --location=global --description "Updated description"

REST

To update your bucket's properties, use projects.locations.buckets.patch in the Logging API.
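
For example, a curl sketch that updates the description of a hypothetical bucket named my-bucket in project my-project; the updateMask query parameter names the fields being changed:

curl -X PATCH \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"description": "Updated description"}' \
    "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket?updateMask=description"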

Upgrade a bucket to use Log Analytics

After you upgrade a bucket to use Log Analytics, any new log entries that arrive are available to analyze in the Log Analytics interface. Cloud Logging also initiates a backfill operation, which lets you analyze older log entries written before the upgrade. The backfill process might take several hours. You can't undo an upgrade operation on a bucket.

To upgrade an existing bucket to use Log Analytics, the following restrictions apply:

  • The log bucket was created at the Google Cloud project level.
  • The log bucket is unlocked unless it is the _Required bucket.
  • There aren't pending updates to the bucket.

Google Cloud console

To upgrade an existing bucket to use Log Analytics, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. Locate the bucket that you want to upgrade.

  3. When the Log Analytics available column displays Upgrade, you can upgrade the log bucket to use Log Analytics. Click Upgrade.

    A dialog opens. Click Confirm.

gcloud

To upgrade your log bucket to use Log Analytics, run the gcloud logging buckets update command. You must set the --enable-analytics flag, and we recommend that you also include the --async flag:

gcloud logging buckets update BUCKET_ID --location=LOCATION --enable-analytics --async

The --async flag makes the command asynchronous. An asynchronous method returns an Operation object, which contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.

Note: Don't update any other property of a log bucket when you upgrade the log bucket to use Log Analytics. For example, a call to the update command that changes the retention period and upgrades the log bucket fails.

REST

To upgrade a log bucket to use Log Analytics, use the projects.locations.buckets.updateAsync method of the Cloud Logging API.

Prepare the arguments to the method as follows:

  1. Set the LogBucket.analyticsEnabled boolean to true.
  2. For the query parameter of the command, use updateMask=analyticsEnabled.

The response to the asynchronous methods is an Operation object. This object contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.

The updateAsync method might take several minutes to complete.
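
For example, a curl sketch of this call for a hypothetical bucket named my-bucket in project my-project:

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"analyticsEnabled": true}' \
    "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket:updateAsync?updateMask=analyticsEnabled"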

Create a linked BigQuery dataset

When you want to use the capabilities of BigQuery to analyze your log data, upgrade a log bucket to use Log Analytics, and then create a linked dataset. With this configuration, Logging stores your log data but BigQuery can read the log data.

Note: A log bucket can be linked to at most one BigQuery dataset.

Google Cloud console

To create a link to a BigQuery dataset for an existing log bucket, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. Locate the log bucket and verify that the Log Analytics available column displays Open.

    If this column displays Upgrade, then the log bucket hasn't been upgraded to use Log Analytics. Configure Log Analytics:

    1. Click Upgrade.
    2. Click Confirm in the dialog.

    After the upgrade completes, proceed to the next step.

  3. On the log bucket, click More, and then click Edit bucket.

    The Edit log bucket dialog opens.

  4. Select Create a new BigQuery dataset that links to this bucket and enter the name for the new dataset.

    The dataset name must be unique for each Google Cloud project. If you enter the name of an existing dataset, then you receive the following error: Dataset name must be unique in the selected region.

  5. Click Done and then click Update bucket.

    After Logging displays the linked dataset name on the Logs Storage page, it might take several minutes before BigQuery recognizes the dataset.

gcloud

To create a linked dataset for a log bucket that is upgraded to use Log Analytics, run the gcloud logging links create command:

gcloud logging links create LINK_ID --bucket=BUCKET_ID --location=LOCATION

The LINK_ID that you provide is used as the name of the BigQuery dataset, and the value of this field must be unique for your Google Cloud project.

The links create command is asynchronous. An asynchronous method returns an Operation object, which contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.

The links create command takes several minutes to complete.

For example, the following command creates a linked dataset named mylink for the log bucket named my-bucket:

gcloud logging links create mylink --bucket=my-bucket --location=global

The dataset name must be unique for each Google Cloud project. If you attempt to create a dataset with the same name as an existing dataset, then you receive the following error:

BigQuery dataset with name "LINK_ID" already exists.

If you attempt to create a linked dataset for a log bucket that isn't upgraded to use Log Analytics, then the following error is reported:

A link can only be created for an analytics-enabled bucket.

REST

To create a linked BigQuery dataset for an existing log bucket that is upgraded to use Log Analytics, call the asynchronous projects.locations.buckets.links.create method of the Cloud Logging API.

Prepare the arguments to the method as follows:

  1. Construct the request body for the create command. The request body is formatted as a Link object.
  2. For the query parameter of the command, use linkId=LINK_ID. The LINK_ID that you provide is used as the name of the BigQuery dataset, and the value of this field must be unique for your Google Cloud project.

The response to the asynchronous methods is an Operation object. This object contains information about the progress of the method. When the method completes, the Operation object contains the status. For more information, see Asynchronous API methods.

The links.create method takes several minutes to complete.

The dataset name must be unique for each Google Cloud project. If you attempt to create a dataset with the same name as an existing dataset, then you receive the following error:

BigQuery dataset with name "LINK_ID" already exists.

If you attempt to create a linked dataset for a log bucket that isn't upgraded to use Log Analytics, then the following error is reported:

A link can only be created for an analytics-enabled bucket.
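
For example, a curl sketch of this call, assuming a hypothetical bucket named my-bucket in project my-project and a dataset name of mylink:

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"description": "Linked dataset for my-bucket"}' \
    "https://logging.googleapis.com/v2/projects/my-project/locations/global/buckets/my-bucket/links?linkId=mylink"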

Lock a bucket

Warning: Locking a log bucket is irreversible.

When you lock a bucket against updates, you also lock the bucket's retention policy. After a retention policy is locked, you can't delete the bucket until every log entry in the bucket has fulfilled the bucket's retention period. If you want to prevent the accidental deletion of a project that contains a locked log bucket, then add a lien to the project. To learn more, see Protecting projects with liens.

To prevent anyone from updating or deleting a log bucket, lock the bucket. To lock the bucket, do the following:

Google Cloud console

The Google Cloud console doesn't support locking a log bucket.

gcloud

To lock your bucket, run the gcloud logging buckets update command with the --locked flag:

gcloud logging buckets update BUCKET_ID --location=LOCATION --locked

For example:

gcloud logging buckets update my-bucket --location=global --locked

REST

To lock your bucket's attributes, use projects.locations.buckets.patch in the Logging API. Set the locked parameter to true.

List buckets

To list the log buckets associated with a Google Cloud project, and to see details such as retention settings, do the following:

Google Cloud console

In the Google Cloud console, go to the Logs Storage page:

Go to Logs Storage

If you use the search bar to find this page, then select the result whose subheading is Logging.

A table named Log buckets lists the buckets associated with the current Google Cloud project.

The table lists the following attributes for each log bucket:

  • Name: The name of the log bucket.
  • Description: The description of the bucket.
  • Retention period: The number of days that the bucket's data will be stored by Cloud Logging.
  • Region: The geographic location in which the bucket's data is stored.
  • Status: Whether the bucket is locked or unlocked.

If a bucket is pending deletion by Cloud Logging, then its table entry is annotated with a warning symbol.

gcloud

Run the gcloud logging buckets list command:

gcloud logging buckets list

You see the following attributes for the log buckets:

  • LOCATION: The region in which the bucket's data is stored.
  • BUCKET_ID: The name of the log bucket.
  • RETENTION_DAYS: The number of days that the bucket's data will be stored by Cloud Logging.
  • LIFECYCLE_STATE: Indicates whether the bucket is pending deletion by Cloud Logging.
  • LOCKED: Whether the bucket is locked or unlocked.
  • CREATE_TIME: A timestamp that indicates when the bucket was created.
  • UPDATE_TIME: A timestamp that indicates when the bucket was last modified.

You can also view the attributes for just one bucket. For example, to view the details for the _Default log bucket in the global region, run the gcloud logging buckets describe command:

gcloud logging buckets describe _Default --location=global

REST

To list the log buckets associated with a Google Cloud project, use projects.locations.buckets.list in the Logging API.

View a bucket's details

To view the details of a single log bucket, do the following:

Google Cloud console

In the Google Cloud console, go to the Logs Storage page:

Go to Logs Storage

If you use the search bar to find this page, then select the result whose subheading is Logging.

On the log bucket, click More and then select View bucket details.

The dialog lists the following attributes for the log bucket:

  • Name: The name of the log bucket.
  • Description: The description of the log bucket.
  • Retention period: The number of days that the bucket's data will be stored by Cloud Logging.
  • Region: The geographic location in which the bucket's data is stored.
  • Log Analytics: Indicates whether your bucket is upgraded to use Log Analytics.
  • BigQuery analysis: Indicates whether a BigQuery dataset is linked to your bucket.
  • BigQuery dataset: Provides a link to your BigQuery dataset, which opens in the BigQuery Studio page. The date that BigQuery linking was enabled is also shown.

gcloud

Run the gcloud logging buckets describe command.

For example, the following command reports the details of the _Default bucket:

gcloud logging buckets describe _Default --location=global

You see the following attributes for the log bucket:

  • createTime: A timestamp that indicates when the bucket was created.
  • description: The description of the log bucket.
  • lifecycleState: Indicates whether the bucket is pending deletion by Cloud Logging.
  • name: The name of the log bucket.
  • retentionDays: The number of days that the bucket's data will be stored by Cloud Logging.
  • updateTime: A timestamp that indicates when the bucket was last modified.

REST

To view the details of a single log bucket, use projects.locations.buckets.get in the Logging API.

Delete a bucket

You can delete log buckets that satisfy one of the following:

  • The log bucket is unlocked.
  • The log bucket is locked and all log entries in the log bucket have fulfilled the bucket's retention period.

You can't delete a log bucket that is locked against updates when that log bucket stores log entries that haven't fulfilled the bucket's retention period.

After you issue the delete command, the log bucket transitions to the DELETE_REQUESTED state, and it stays in that state for 7 days. During this time period, Logging continues to route logs to the log bucket. You can stop routing logs to the log bucket by deleting or modifying the log sinks that route log entries to the bucket.

You can't create a new log bucket that uses the same name as a log bucket that is in the DELETE_REQUESTED state.

Note: Log buckets that are in the DELETE_REQUESTED state count toward the quota of buckets per Google Cloud project until they are fully deleted.

To delete a log bucket, do the following:

Google Cloud console

To delete a log bucket, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. Locate the bucket that you want to delete, and click More.

  3. If the Linked dataset in BigQuery column displays a link, then delete the linked BigQuery dataset:

    1. Click Edit bucket.

    2. Clear Create a new BigQuery dataset that links to this bucket, click Done, and then click Update bucket.

      After you return to the Logs Storage page, click More for the bucket you want to delete, then proceed to the next steps.

  4. Select Delete bucket.

  5. On the confirmation panel, click Delete.

  6. On the Logs Storage page, your bucket has an indicator that it's pending deletion. The bucket, including all the logs in it, is deleted after 7 days.

gcloud

To delete a log bucket, run the gcloud logging buckets delete command:

gcloud logging buckets delete BUCKET_ID --location=LOCATION
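
For example, to delete a hypothetical bucket named my-bucket in the global region:

gcloud logging buckets delete my-bucket --location=global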

You can't delete a log bucket when that bucket has a linked BigQuery dataset. You must delete the linked dataset before deleting the log bucket.

REST

To delete a bucket, use projects.locations.buckets.delete in the Logging API.

It is an error to delete a log bucket if that bucket has a linked BigQuery dataset. You must delete the linked dataset before deleting the log bucket.

Restore a deleted bucket

You can restore, or undelete, a log bucket that's in the pending deletion state. To restore a log bucket, do the following:

Google Cloud console

To restore a log bucket that is pending deletion, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. For the bucket you want to restore, click More, and then select Restore deleted bucket.

  3. On the confirmation panel, click Restore.

  4. On the Logs Storage page, the pending-deletion indicator is removed from your log bucket.

gcloud

To restore a log bucket that is pending deletion, run the gcloud logging buckets undelete command:

gcloud logging buckets undelete BUCKET_ID --location=LOCATION
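
For example:

gcloud logging buckets undelete my-bucket --location=global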

REST

To restore a bucket that is pending deletion, use projects.locations.buckets.undelete in the Logging API.

Alert on monthly log bytes stored in log buckets

To create an alerting policy, on the Logs Storage page in the Google Cloud console, click Create usage alert. This button opens the Create alerting policy page in Monitoring, and populates the metric type field with logging.googleapis.com/billing/bytes_ingested.

To create an alerting policy that triggers when the number of log bytes written to your log buckets exceeds your user-defined limit for Cloud Logging, use the following settings.


To create an alerting policy, do the following:

  1. In the Google Cloud console, go to the Alerting page:

    Go to Alerting

    If you use the search bar to find this page, then select the result whose subheading is Monitoring.

  2. If you haven't created your notification channels and if you want to be notified, then click Edit Notification Channels and add your notification channels. Return to the Alerting page after you add your channels.
  3. From the Alerting page, select Create policy.
  4. To select the resource, metric, and filters, expand the Select a metric menu and then use the values in the New condition table:
    1. Optional: To limit the menu to relevant entries, enter the resource or metric name in the filter bar.
    2. Select a Resource type. For example, select VM instance.
    3. Select a Metric category. For example, select instance.
    4. Select a Metric. For example, select CPU Utilization.
    5. Select Apply.
  5. Click Next and then configure the alerting policy trigger. To complete these fields, use the values in the Configure alert trigger table.
  6. Click Next.
  7. Optional: To add notifications to your alerting policy, click Notification channels. In the dialog, select one or more notification channels from the menu, and then click OK.

    To be notified when incidents are opened and closed, check Notify on incident closure. By default, notifications are sent only when incidents are opened.

  8. Optional: Update the Incident autoclose duration. This field determines when Monitoring closes incidents in the absence of metric data.
  9. Optional: Click Documentation, and then add any information that you want included in a notification message.
  10. Click Alert name and enter a name for the alerting policy.
  11. Click Create Policy.
New condition

  • Resource and Metric: In the Resources menu, select Global. In the Metric categories menu, select Logs-based metric. In the Metrics menu, select Monthly log bytes ingested.
  • Filter: None.
  • Across time series > Time series aggregation: sum
  • Rolling window: 60 m
  • Rolling window function: max

Configure alert trigger

  • Condition type: Threshold
  • Alert trigger: Any time series violates
  • Threshold position: Above threshold
  • Threshold value: You determine the acceptable value.
  • Retest window: Minimum acceptable value is 30 minutes.

For more information about alerting policies, see Alerting overview.

Write to a bucket

You don't directly write logs to a log bucket. Rather, you write logs to a Google Cloud resource: a Google Cloud project, folder, or organization. The sinks in the parent resource then route the logs to destinations, including log buckets. A sink routes logs to a log bucket destination when the logs match the sink's filter and the sink has permission to route the logs to the log bucket.

Read from a bucket

Each log bucket has a set of log views. To read logs from a log bucket, you need access to a log view on the log bucket. Log views let you grant a user access to only a subset of the logs stored in a log bucket. For information about how to configure log views, and how to grant access to specific log views, see Configure log views on a log bucket.

To read logs from a log bucket, do the following:

Google Cloud console

  1. In the Google Cloud console, go to the Logs Explorer page:

    Go to Logs Explorer

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. To customize which logs are displayed in the Logs Explorer, click Refine scope, and then select an option. For example, you can view logs stored in a project or by log view.

  3. Click Apply. The Query results pane reloads with logs that match the option you selected.

For more information, see Logs Explorer overview: Refine scope.

gcloud

To read logs from a log bucket, use the gcloud logging read command and add a LOG_FILTER to select data:

gcloud logging read LOG_FILTER --bucket=BUCKET_ID --location=LOCATION --view=LOG_VIEW_ID
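
For example, the following command reads ERROR-level entries from a hypothetical bucket named my-bucket through _AllLogs, the default view that Logging creates on each bucket:

gcloud logging read "severity>=ERROR" --bucket=my-bucket --location=global --view=_AllLogs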

REST

To read logs from a log bucket, use the entries.list method. Set resourceNames to specify the appropriate bucket and log view, and set filter to select data.
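
For example, a curl sketch of this call, assuming a hypothetical project my-project, a bucket my-bucket, and the bucket's default _AllLogs view:

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d '{"resourceNames": ["projects/my-project/locations/global/buckets/my-bucket/views/_AllLogs"], "filter": "severity>=ERROR"}' \
    "https://logging.googleapis.com/v2/entries:list"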

For detailed information about the filtering syntax, see Logging query language.

Configure custom retention

Note: Effective April 1, 2023, retention costs apply to logs data retained longer than the default retention period of the _Default bucket and user-defined log buckets. For pricing details, see the Cloud Logging sections of the Google Cloud Observability pricing page. To review the billable storage for your log buckets, go to the Logs Storage page of the Google Cloud console.

When you create a log bucket, you have the option to customize the period for how long Cloud Logging stores the bucket's logs. You can configure the retention period for any user-defined log bucket and also for the _Default log bucket. You can't change the retention period of the _Required log bucket.

If you shorten a bucket's retention period, then there is a 7-day grace period in which expired logs aren't deleted. You can't query or view those expired logs, but, in those 7 days, you can restore full access by extending the bucket's retention period. Logs stored during the grace period count towards your retention costs.

Retention enforcement is an eventually consistent process. If you write log entries to a log bucket when the log entries are older than the bucket's retention period, then you might briefly be able to see these log entries. For example, if you send log entries that are 10 days old to a log bucket with a retention period of 7 days, then those log entries are stored and then eventually purged. These log entries don't contribute to your retention costs, but they do contribute to your storage costs. To minimize your storage costs, don't write log entries that are older than your bucket's retention period.

To update the retention period for a custom log bucket or for the _Default log bucket, do the following:

Google Cloud console

To update a log bucket's retention period, do the following:

  1. In the Google Cloud console, go to the Logs Storage page:

    Go to Logs Storage

    If you use the search bar to find this page, then select the result whose subheading is Logging.

  2. For the bucket you want to update, click More, and then select Edit bucket.

  3. In the Retention field, enter the number of days, between 1 day and 3650 days, that you want Cloud Logging to retain your logs.

  4. Click Update bucket. Your new retention duration appears in the Logs bucket list.

gcloud

To update the retention period for a log bucket, run the gcloud logging buckets update command, after setting a value for RETENTION_DAYS:

gcloud logging buckets update BUCKET_ID --location=LOCATION --retention-days=RETENTION_DAYS

For example, to retain the logs in the _Default bucket in the global location for a year, your command would look like the following:

gcloud logging buckets update _Default --location=global --retention-days=365

If you extend a bucket's retention period, then the retention rules apply going forward and not retroactively. Logs can't be recovered after the applicable retention period ends.

Asynchronous API methods

The response of an asynchronous method like projects.locations.buckets.createAsync is an Operation object.

Applications that call an asynchronous API method should poll the operation.get endpoint until the value of the Operation.done field is true:

  • When done is false, the operation is in progress.

    To refresh the status information, send a GET request to the operation.get endpoint.

  • When done is true, the operation is complete and either the error or response field is set:

    • error: When set, the asynchronous operation failed. The value of this field is a Status object that contains a gRPC error code and an error message.
    • response: When set, the asynchronous operation completed successfully, and the value reflects the result.

To poll an asynchronous command by using the Google Cloud CLI, run the following command:

gcloud logging operations describe OPERATION_ID --location=LOCATION --project=PROJECT_ID

For more information, see gcloud logging operations describe.
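
As an illustration, a minimal shell sketch that polls until an operation completes; OPERATION_ID, LOCATION, and PROJECT_ID are placeholders, and the --format flag extracts only the done field:

while true; do
  # Print the done field of the Operation object; "True" when complete.
  done_state=$(gcloud logging operations describe OPERATION_ID \
      --location=LOCATION --project=PROJECT_ID --format="value(done)")
  if [ "$done_state" = "True" ]; then break; fi
  sleep 10  # Wait before polling again.
done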

Troubleshoot common issues

If you encounter problems when using log buckets, refer to the followingtroubleshooting steps and answers to common questions.

Why can't I delete this bucket?

If you're trying to delete a bucket, do the following:

  • Verify that you have the correct permissions to delete the bucket. For the list of the permissions that you need, see Access control with IAM.

  • Determine whether the bucket is locked by listing the bucket's attributes. If the bucket is locked, check the bucket's retention period. You can't delete a locked bucket until all of the logs in the bucket have fulfilled the bucket's retention period.

  • Verify that the log bucket doesn't have a linked BigQuery dataset. You can't delete a log bucket with a linked dataset.

    The following error is shown in response to a delete command on a log bucket that has a linked dataset:

    FAILED_PRECONDITION: This bucket is used for advanced analytics and has an active link. The link must be deleted first before deleting the bucket

    To list the links associated with a log bucket, run the gcloud logging links list command or run the projects.locations.buckets.links.list API method.

Which service accounts are routing logs to my bucket?

To determine if any service accounts have IAM permissions to route logs to your bucket, do the following:

  1. In the Google Cloud console, go to the IAM page:

    Go to IAM

    If you use the search bar to find this page, then select the result whose subheading is IAM & Admin.

  2. From the Permissions tab, view by Roles. You see a table with all the IAM roles and principals associated with your Google Cloud project.

  3. In the table's Filter text box, enter Logs Bucket Writer.

    You see any principals with the Logs Bucket Writer role. If a principal is a service account, its ID contains the string gserviceaccount.com.

  4. Optional: If you want to remove a service account from being able to route logs to your Google Cloud project, select the check box for the service account and click Remove.

Why do I see logs for a Google Cloud project even though I excluded them from my _Default sink?

You might be viewing logs in a log bucket in a centralized Google Cloud project, which aggregates logs from across your organization.

If you're using the Logs Explorer to access these logs and see logs that you excluded from the _Default sink, then your view might be set to the Google Cloud project level.

To fix this issue, select Log view in the Refine scope menu and then select the log view associated with the _Default bucket in your Google Cloud project. You shouldn't see the excluded logs anymore.

What's next

For information on the log bucket API methods, refer to the LogBucket reference documentation.

If you manage an organization or a folder, then you can specify the location of the _Default and _Required log buckets of child resources. You can also configure whether log buckets use CMEK and the behavior of the _Default log sink. For more information, see Configure default settings for organizations and folders.

For information on addressing common use cases with log buckets, see the following topics:
