Usage logs & storage logs

This document discusses how to download and review usage logs and storage information for your Cloud Storage buckets, and how to analyze the logs using Google BigQuery.

Introduction

Cloud Storage offers usage logs and storage logs in the form of CSV files that you can download and view. Usage logs provide information for all of the requests made on a specified bucket and are created hourly. Storage logs provide information about the storage consumption of that bucket for the last day and are created daily.

Once set up, both usage logs and storage logs are automatically generated for the specified bucket and stored as new objects in a bucket that you specify.

Usage logs and storage logs are subject to the same pricing as other objects stored in Cloud Storage.

Caution: Timeliness and completeness of usage and storage log delivery is not guaranteed.

Should you use usage logs or Cloud Audit Logs?

In most cases, Cloud Audit Logs is the recommended method for generating logs that track API operations performed in Cloud Storage:

  • Cloud Audit Logs tracks access on a continuous basis, with delivery of events within seconds of their occurrence.
  • Cloud Audit Logs produces logs that are easier to work with.
  • Cloud Audit Logs can monitor many of your Google Cloud services, not just Cloud Storage.
  • Cloud Audit Logs can, optionally, log detailed request and response information.

In some cases, you might want to use usage logs instead of, or in addition to, Cloud Audit Logs. You most likely want to use usage logs if:

  • You want to track access that occurs because a resource has allUsers or allAuthenticatedUsers in its access control settings, such as access to assets in a bucket that you've configured to be a static website.
  • You want to track changes made by the Object Lifecycle Management or Autoclass features.
  • You want your logs to include latency information, the request and response size of individual HTTP requests, or the full URL path and every query parameter.
  • You want to track access to only certain buckets in your project and so do not want to enable Data Access audit logs, which track access to all buckets in your project.
Note: While usage logs are generated hourly, they might be delayed, especially for buckets that experience high request rates. Logs are generated at least every six hours.

Should you use storage logs or Monitoring?

Generally, you should not use storage logs. The recommended tool for measuring storage consumption is Monitoring, which provides visualization tools as well as additional metrics related to storage consumption that storage logs do not. See the Console tab for determining a bucket's size for step-by-step instructions on using Monitoring.

Set up log delivery

Before setting up log delivery, you must have a bucket for storing logs. This bucket must meet the following requirements, or else logging fails:

  • The bucket storing the logs must exist within the same organization as the bucket being logged.

    • If the bucket being logged is not contained in any organization, the bucket storing the logs must exist within the same project as the bucket being logged.
  • If you use or enable VPC Service Controls, the bucket storing the logs must reside within the same security perimeter as the bucket being logged.

If you don't already have a bucket that meets these requirements, create the bucket.

The following steps describe how to set up log delivery for a bucket:

Command line

  1. Grant Cloud Storage the roles/storage.objectCreator role for the bucket:

    gcloud storage buckets add-iam-policy-binding gs://example-logs-bucket \
      --member=group:cloud-storage-analytics@google.com \
      --role=roles/storage.objectCreator

    The role gives Cloud Storage, in the form of the group cloud-storage-analytics@google.com, permission to create and store your logs as new objects.

    Log objects have the default object ACL of the log bucket, unless uniform bucket-level access is enabled on the bucket.

  2. Enable logging for your bucket using the --log-bucket flag:

    gcloud storage buckets update gs://example-bucket \
      --log-bucket=gs://example-logs-bucket \
      [--log-object-prefix=log_object_prefix]

    Optionally, you can set an object prefix for your log objects by using the --log-object-prefix flag. The object prefix forms the beginning of the log object name. It can be at most 900 characters and must be a valid object name. By default, the object prefix is the name of the bucket for which the logs are enabled.

REST APIs

JSON API

  1. Grant Cloud Storage the roles/storage.objectCreator role for the bucket. If there are additional bucket-level IAM bindings for the bucket, be sure to include them in the request.

    POST /storage/v1/b/example-logs-bucket/iam HTTP/1.1
    Host: storage.googleapis.com

    {
      "bindings": [
        {
          "role": "roles/storage.objectCreator",
          "members": [
            "group-cloud-storage-analytics@google.com"
          ]
        }
      ]
    }

    The role gives Cloud Storage, in the form of the group cloud-storage-analytics@google.com, permission to create and store your logs as new objects.

    Log objects have the default object ACL of the log bucket, unless uniform bucket-level access is enabled on the bucket.

  2. Enable logging for your bucket using the following request:

    PATCH /storage/v1/b/example-bucket HTTP/1.1
    Host: storage.googleapis.com

    {
      "logging": {
        "logBucket": "example-logs-bucket",
        "logObjectPrefix": "log_object_prefix"
      }
    }

XML API

Note: If you use uniform bucket-level access on the bucket storing your logs, you cannot use the XML API to set access permissions on it. Use a different tool instead.
  1. Set permissions to allow Cloud Storage WRITE permission to the bucket in order to create and store your logs as new objects. You must add an ACL entry for the bucket that grants the group cloud-storage-analytics@google.com write access. Be sure to include all existing ACLs for the bucket, in addition to the new ACL, in the request.

    PUT /example-logs-bucket?acl HTTP/1.1
    Host: storage.googleapis.com

    <AccessControlList>
      <Entries>
        <Entry>
          <Scope type="GroupByEmail">
            <EmailAddress>cloud-storage-analytics@google.com</EmailAddress>
          </Scope>
          <Permission>WRITE</Permission>
        </Entry>
        <!-- include other existing ACL entries here -->
      </Entries>
    </AccessControlList>
  2. Enable logging for your bucket using the logging query parameter:

    PUT /example-bucket?logging HTTP/1.1
    Host: storage.googleapis.com

    <Logging>
      <LogBucket>example-logs-bucket</LogBucket>
      <LogObjectPrefix>log_object_prefix</LogObjectPrefix>
    </Logging>

Check logging status

Command line

Check logging by using the buckets describe command with the --format flag:

gcloud storage buckets describe gs://example-bucket --format="default(logging_config)"

You can also save the logging configuration to a file:

gcloud storage buckets describe gs://example-bucket --format="default(logging_config)" > your_logging_configuration_file

If logging is enabled, the server returns the logging configuration inthe response:

logging:
  logBucket: example-logs-bucket
  logObjectPrefix: log_object_prefix

If logging is not enabled, the following is returned:

null

REST APIs

JSON API

Send a GET request for the bucket's logging configuration as shown in the following example:

GET /storage/v1/b/example-bucket?fields=logging HTTP/1.1
Host: storage.googleapis.com

If logging is enabled, the server sends the configuration in the response. A response might look similar to the following:

{
  "logging": {
    "logBucket": "example-logs-bucket",
    "logObjectPrefix": "log_object_prefix"
  }
}

If logging is not enabled, an empty configuration is returned:

{}

XML API

Send a GET Bucket request for the bucket's logging configuration as shown in the following example:

GET /example-bucket?logging HTTP/1.1
Host: storage.googleapis.com

If logging is enabled, the server sends the configuration in the response. A response might look similar to the following:

<?xml version="1.0" ?>
<Logging>
    <LogBucket>example-logs-bucket</LogBucket>
    <LogObjectPrefix>log_object_prefix</LogObjectPrefix>
</Logging>

If logging is not enabled, an empty configuration is returned:

<?xml version="1.0" ?>
<Logging/>

Download logs

Storage logs are generated once a day and contain the amount of storage used for the previous day. They are typically created before 10:00 am PST.

Usage logs are generated hourly when there is activity to report in the monitored bucket. Usage logs are typically created 15 minutes after the end of the hour.

Note:
  • Any log processing of usage logs should take into account the possibility that they may be delivered later than 15 minutes after the end of an hour.
  • Usually, hourly usage log object(s) contain records for all access that occurred during that hour. Occasionally, an hourly usage log object contains records for an earlier hour, but never for a later hour.
  • Cloud Storage may write multiple log objects for the same hour.
  • Occasionally, a single record may appear twice in the usage logs. While we make our best effort to remove duplicate records, your log processing should be able to remove them if it is critical to your log analysis. You can use the s_request_id field to detect duplicates.
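The duplicate-removal step in the last point can be sketched in code. The following Python sketch assumes you have already downloaded an hourly usage log as CSV; the sample rows and their values are hypothetical, but the s_request_id and cs_method field names come from the usage log format described later in this document.

```python
import csv
import io

def dedupe_usage_records(csv_text):
    """Drop usage log records whose s_request_id was already seen."""
    reader = csv.DictReader(io.StringIO(csv_text))
    seen = set()
    unique = []
    for row in reader:
        request_id = row["s_request_id"]
        if request_id not in seen:
            seen.add(request_id)
            unique.append(row)
    return unique

# Hypothetical sample: the first record appears twice.
sample = (
    '"time_micros","cs_method","s_request_id"\n'
    '"1655560800000000","GET","req-001"\n'
    '"1655560800000000","GET","req-001"\n'
    '"1655560801000000","PUT","req-002"\n'
)
records = dedupe_usage_records(sample)  # two unique records remain
```

In practice you would stream each downloaded log object through a function like this before loading it into BigQuery.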

The easiest way to download your usage logs and storage logs from the bucket in which they are stored is either through the Google Cloud console or the gcloud storage CLI. Your usage logs are in CSV format and have the following naming convention:

OBJECT_PREFIX_usage_TIMESTAMP_ID_v0

Similarly, storage logs are named using the following convention:

OBJECT_PREFIX_storage_TIMESTAMP_ID_v0

For example, the following is the name of a usage log object that uses the default object prefix, reports usage for the bucket named example-bucket, and was created on June 18, 2022 at 14:00 UTC:

example-bucket_usage_2022_06_18_14_00_00_1702e6_v0

Similarly, the following is the name of the storage log object that uses the default object prefix and was created on June 18, 2022 for the same bucket:

example-bucket_storage_2022_06_18_07_00_00_1702e6_v0
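If you process many log objects, the naming convention above can be parsed programmatically. A minimal Python sketch; the regular expression is an assumption inferred from the examples in this section (in particular, it presumes the ID segment is lowercase hexadecimal):

```python
import re

# OBJECT_PREFIX_{usage|storage}_YYYY_MM_DD_HH_MM_SS_ID_v0
LOG_NAME_RE = re.compile(
    r"^(?P<prefix>.+)_(?P<kind>usage|storage)_"
    r"(?P<timestamp>\d{4}_\d{2}_\d{2}_\d{2}_\d{2}_\d{2})_"
    r"(?P<id>[0-9a-f]+)_v0$"
)

def parse_log_object_name(name):
    """Split a log object name into its prefix, kind, timestamp, and ID."""
    match = LOG_NAME_RE.match(name)
    if match is None:
        raise ValueError(f"not a usage/storage log object name: {name}")
    return match.groupdict()

info = parse_log_object_name("example-bucket_usage_2022_06_18_14_00_00_1702e6_v0")
# info["prefix"] == "example-bucket", info["kind"] == "usage"
```

A helper like this makes it easy to group downloaded logs by bucket or by hour before further processing.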

To download logs:

Console

  1. In the Google Cloud console, go to the Cloud StorageBuckets page.

    Go to Buckets

  2. Select the bucket in which your logs are stored.

  3. Download or view your logs by clicking on the appropriate log object.

Command line

Run the following command:

gcloud storage cp gs://BUCKET_NAME/LOGS_OBJECT DESTINATION

Where:

  • BUCKET_NAME is the name of the bucket in which the logs are stored. For example, example-logs-bucket.

  • LOGS_OBJECT is the name of the usage log or storage log that you are downloading. For example, example-bucket_usage_2022_06_18_14_00_00_1702e6_v0.

  • DESTINATION is the location to which the log is being downloaded. For example, Desktop/Logs.

Analyze logs in BigQuery

To query your Cloud Storage usage and storage logs, you can use Google BigQuery, which enables fast, SQL-like queries against append-only tables. The BigQuery command-line tool, bq, is a Python-based tool that allows you to access BigQuery from the command line. For information about downloading and using bq, see the bq command-line tool reference page.

Load logs into BigQuery

  1. Select a default project.

    For details about selecting a project, see Working with Projects.

  2. Create a new dataset.

    $ bq mk storageanalysis
    Dataset 'storageanalysis' successfully created.
  3. List the datasets in the project:

    $ bq ls
     datasetId
    -----------------
     storageanalysis
  4. Save the usage and storage schemas to your local computer for use in the load command.

    You can find the schemas to use at these locations: cloud_storage_usage_schema_v0 and cloud_storage_storage_schema_v0. The schemas are also described in the section Usage and storage log format.

  5. Load the usage logs into the dataset.

    $ bq load --skip_leading_rows=1 storageanalysis.usage \
      gs://example-logs-bucket/example-bucket_usage_2014_01_15_14_00_00_1702e6_v0 \
      ./cloud_storage_usage_schema_v0.json
    $ bq load --skip_leading_rows=1 storageanalysis.storage \
      gs://example-logs-bucket/example-bucket_storage_2014_01_05_14_00_00_091c5f_v0 \
      ./cloud_storage_storage_schema_v0.json

    These commands do the following:

    • Load usage and storage logs from the bucket example-logs-bucket.
    • Create tables usage and storage in the dataset storageanalysis.
    • Read schema data (.json file) from the same directory where the bq command runs.
    • Skip the first row of each log file because it contains column descriptions.

    Because this was the first time you ran the load command in this example, the tables usage and storage were created. You could continue to append to these tables with subsequent load commands with different usage log file names or using wildcards. For example, the following command appends data from all logs that start with "bucket_usage_2014" to the usage table:

    $ bq load --skip_leading_rows=1 storageanalysis.usage \
      gs://example-logs-bucket/bucket_usage_2014* \
      ./cloud_storage_usage_schema_v0.json

    When using wildcards, you might want to move logs already uploaded to BigQuery to another directory (for example, gs://example-logs-bucket/processed) to avoid uploading data from a log more than once.

BigQuery functionality can also be accessed through the BigQuery browser tool. With the browser tool, you can load data through the create table process.

For additional information about loading data from Cloud Storage, including programmatically loading data, see Loading data from Cloud Storage.

Modify the usage log schema

In some scenarios, you may find it useful to pre-process usage logs before loading them into BigQuery. For example, you can add additional information to the usage logs to make your query analysis easier in BigQuery. In this section, we'll show how you can add the file name of each storage log to the log itself. This requires modifying the existing schema and each log file.

  1. Modify the existing schema, cloud_storage_storage_schema_v0, to add the file name as shown below. Give the new schema a new name, for example, cloud_storage_storage_schema_custom.json, to distinguish it from the original.

    [
      {"name": "bucket", "type": "string", "mode": "REQUIRED"},
      {"name": "storage_byte_hours", "type": "integer", "mode": "REQUIRED"},
      {"name": "filename", "type": "string", "mode": "REQUIRED"}
    ]
  2. Pre-process the storage log files based on the new schema, before loading them into BigQuery.

    For example, the following commands can be used in a Linux, macOS, or Windows (Cygwin) environment:

    gcloud storage cp gs://example-logs-bucket/example-bucket_storage\* .
    for f in example-bucket_storage\*; do sed -i -e "1s/$/,\"filename\"/" -e "2s/$/,\""$f"\"/" $f; done

    The gcloud storage command copies the files into your working directory. The second command loops through the log files and adds "filename" to the description row (first row) and the actual file name to the data row (second row). Here's an example of a modified log file:

    "bucket","storage_byte_hours","filename"
    "example-bucket","5532482018","example-bucket_storage_2014_01_05_08_00_00_021fd_v0"
  3. When you load the storage usage logs into BigQuery, load your locallymodified logs and use the customized schema.

    for f in example-bucket_storage\*; \
    do ./bq.py load --skip_leading_rows=1 storageanalysis.storage $f ./cloud_storage_storage_schema_custom.json; done

Query logs in BigQuery

Once your logs are loaded into BigQuery, you can query your usage logs to return information about your logged bucket(s). The following example shows you how to use the bq tool in a scenario where you have usage logs for a bucket over several days and you have loaded the logs as shown in Load logs into BigQuery. You can also execute the queries below using the BigQuery browser tool.

  1. In the bq tool, enter the interactive mode.

    $ bq shell
  2. Run a query against the storage log table.

    For example, the following query shows how the storage of a logged bucket changes over time. It assumes that you modified the storage logs as described in Modify the usage log schema and that the log files are named "logstorage*".

    project-name> SELECT SUBSTRING(filename, 13, 10) as day, storage_byte_hours/24 as size FROM [storageanalysis.storage] ORDER BY filename LIMIT 100

    Example output from the query:

    Waiting on bqjob_r36fbf5c164a966e8_0000014379bc199c_1 ... (0s) Current status: DONE
    +------------+----------------------+
    |    day     |         size         |
    +------------+----------------------+
    | 2014_01_05 | 2.3052008408333334E8 |
    | 2014_01_06 | 2.3012297245833334E8 |
    | 2014_01_07 | 3.3477797120833334E8 |
    | 2014_01_08 | 4.4183686058333334E8 |
    +------------+----------------------+

    If you did not modify the schema and are using the default schema, you canrun the following query:

    project-name> SELECT storage_byte_hours FROM [storageanalysis.storage] LIMIT 100
  3. Run a query against the usage log table.

    For example, the following query shows how to summarize the request methods that clients use to access resources in the logged bucket.

    project-name> SELECT cs_method, COUNT(*) AS count FROM [storageanalysis.usage] GROUP BY cs_method

    Example output from the query:

    Waiting on bqjob_r1a6b4596bd9c29fb_000001437d6f8a52_1 ... (0s) Current status: DONE
    +-----------+-------+
    | cs_method | count |
    +-----------+-------+
    | PUT       |  8002 |
    | GET       | 12631 |
    | POST      |  2737 |
    | HEAD      |  2173 |
    | DELETE    |  7290 |
    +-----------+-------+
  4. Quit the interactive shell of the bq tool.

    project-name> quit

Disable logging

Command line

Disable logging with the --clear-log-bucket flag in the buckets update command:

gcloud storage buckets update gs://example-bucket --clear-log-bucket

To check that logging was successfully disabled, use the buckets describe command:

gcloud storage buckets describe gs://example-bucket --format="default(logging_config)"

If logging is disabled, the following is returned:

null

REST APIs

JSON API

Disable logging by sending a PATCH request to the bucket's logging configuration as shown in the following example.

PATCH /storage/v1/b/example-bucket HTTP/1.1
Host: storage.googleapis.com

{"logging": null}

XML API

Disable logging by sending a PUT request to the bucket's logging configuration as shown in the following example:

PUT /example-bucket?logging HTTP/1.1
Host: storage.googleapis.com

<Logging/>

Usage and storage log format

The usage logs and storage logs can provide an overwhelming amount of information. You can use the following tables to help you identify all the information provided in these logs.

Usage log fields:

time_micros (integer)
  The time that the request was completed, in microseconds since the Unix epoch.
c_ip (string)
  The IP address from which the request was made. The "c" prefix indicates that this is information about the client.
c_ip_type (integer)
  The type of IP in the c_ip field:
  • A value of 1 indicates an IPV4 address.
  • A value of 2 indicates an IPV6 address.
c_ip_region (string)
  Reserved for future use.
cs_method (string)
  The HTTP method of this request. The "cs" prefix indicates that this information was sent from the client to the server.
cs_uri (string)
  The URI of the request.
sc_status (integer)
  The HTTP status code the server sent in response. The "sc" prefix indicates that this information was sent from the server to the client.
cs_bytes (integer)
  The number of bytes sent in the request.
sc_bytes (integer)
  The number of bytes sent in the response.
time_taken_micros (integer)
  The time it took to serve the request in microseconds, measured from when the first byte is received to when the response is sent. Note that for resumable uploads, the ending point is determined by the response to the final upload request that was part of the resumable upload.
cs_host (string)
  The host in the original request.
cs_referer (string)
  The HTTP referrer for the request.
cs_user_agent (string)
  The User-Agent of the request. The value is GCS Lifecycle Management for requests made by lifecycle management.
s_request_id (string)
  The request identifier.
cs_operation (string)
  The Cloud Storage operation, for example GET_Object. This can be null.
cs_bucket (string)
  The bucket specified in the request.
cs_object (string)
  The object specified in this request. This can be null.
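For quick checks, these fields can also be consumed directly from a downloaded usage log CSV without loading it into BigQuery. A small Python sketch that tallies records by cs_method, the same summary the GROUP BY query shown earlier produces; the sample rows here are hypothetical:

```python
import csv
import io
from collections import Counter

def count_request_methods(csv_text):
    """Tally usage log records by their cs_method field."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter(row["cs_method"] for row in reader)

# Hypothetical sample rows with only the fields relevant here.
sample = (
    '"cs_method","sc_status"\n'
    '"GET","200"\n'
    '"GET","404"\n'
    '"PUT","200"\n'
)
counts = count_request_methods(sample)  # Counter({'GET': 2, 'PUT': 1})
```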

Storage log fields:

bucket (string)
  The name of the bucket.
storage_byte_hours (integer)
  Average size in byte-hours over a 24 hour period of the bucket. To get the total size of the bucket, divide byte-hours by 24.
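The byte-hours conversion can be sketched as follows; the example value is taken from the modified storage log row shown earlier in this document.

```python
def average_bucket_size_bytes(storage_byte_hours):
    """Convert a daily storage_byte_hours value to the bucket's size in bytes."""
    # The log sums the bucket's size over a 24-hour period, so dividing
    # by 24 yields the size for that day.
    return storage_byte_hours / 24

# Value from the example storage log row for example-bucket:
avg_bytes = average_bucket_size_bytes(5532482018)  # roughly 230.5 million bytes
```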

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-19 UTC.