Cloud Logging for Storage Transfer Service

This page describes how to configure and view Cloud Logging for Storage Transfer Service logs.

Cloud Logging for Storage Transfer Service is supported for all transfers. FIND operations are not logged for agent-based transfers.

File system transfers can additionally configure file system transfer logs.

Before you begin

Verify that you have access to Cloud Logging. We recommend the Logs Viewer (roles/logging.viewer) Identity and Access Management role. For more information on Logging access, see Access control with IAM.


Loggable actions

The following actions can be logged:

  • FIND: Finding work to do, such as listing files in a directory, listing objects in a bucket, or listing managed folders in a bucket. Not supported for agent-based transfers.
  • COPY: Copying files or objects to Cloud Storage.
  • DELETE: Deleting files or objects at the source or the destination. For transfers between two file systems, also logs the deletion of files from the intermediary Cloud Storage bucket.

Loggable states

For each action, you can choose to log one or more of the following states:

  • SUCCEEDED: The action was successful.
  • FAILED: The action failed.
  • SKIPPED: Only applies to the COPY action, and is only supported for agent-based transfer jobs. Must be set using the gcloud CLI or the REST API. This state means that the copy was skipped because the file already exists in the sink and your transfer job is configured to ignore existing files.

Enable logging

To enable logging, specify the actions and the states to log.

gcloud CLI

When creating a transfer job with gcloud transfer jobs create, use the following flags to enable logging:

gcloud transfer jobs create SOURCE DESTINATION \
  --log-actions=copy,delete,find \
  --log-action-states=succeeded,failed,skipped

You must specify at least one value for each flag.
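
For example, to record only failed copies, you can narrow both flags to a single value. SOURCE and DESTINATION are placeholders, as in the command above:

gcloud transfer jobs create SOURCE DESTINATION \
  --log-actions=copy \
  --log-action-states=failed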

REST

To create a logging configuration, use transferJobs.create with a LoggingConfig:

{"name":"transferJobs/myFirstTransfer","status":"ENABLED","projectId":"test-id-001","loggingConfig":{"logActions":["FIND","DELETE","COPY"],"logActionStates":["SUCCEEDED","FAILED","SKIPPED"],#SKIPPEDisonlysupportedforagent-basedtransfers},"transferSpec":{"awsS3DataSource":{"bucketName":"AWS_SOURCE_NAME","awsAccessKey":{"accessKeyId":"AWS_ACCESS_KEY_ID","secretAccessKey":"AWS_SECRET_ACCESS_KEY"}},"gcsDataSink":{"bucketName":"destination_bucket","path":"foo/bar/"},}}

Adjust loggingConfig to include the specific logActions and logActionStates to log. For example, to log when copy and find actions fail, provide the following loggingConfig:

"loggingConfig":{"logActions":["COPY","FIND"],"logActionStates":["FAILED"],}

Update a logging configuration

gcloud CLI

To update an existing job's logging configuration, use the appropriate flags with the gcloud transfer jobs update command:

gcloud transfer jobs update NAME \
  --log-actions=copy,delete,find \
  --log-action-states=succeeded,failed,skipped

To disable logging for this job, specify --clear-log-config:

gcloud transfer jobs update NAME --clear-log-config

REST

To update an existing transfer job's logging configuration, use transferJobs.patch with a LoggingConfig:

{"projectId":"test-id-001","transferJob":{"loggingConfig":{"logActions":["FIND","DELETE","COPY"],"logActionStates":["SUCCEEDED","FAILED","SKIPPED"],#SKIPPEDisonlysupportedforagent-basedtransfers},},"updateTransferJobFieldMask":"loggingConfig"}

The updateTransferJobFieldMask specifies the field that is being updated in this request and is required.
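
One way to send this request is with curl. The following is a rough sketch that assumes the request body above is saved as request.json (a hypothetical file name) and that the job being updated is the transferJobs/myFirstTransfer job from the earlier example:

curl -X PATCH \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @request.json \
  "https://storagetransfer.googleapis.com/v1/transferJobs/myFirstTransfer"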

To disable logging for this job, send a loggingConfig with empty lists for logActions and logActionStates:

{"projectId":"test-id-001","transferJob":{"loggingConfig":{"logActions":[],"logActionStates":[],},},"updateTransferJobFieldMask":"loggingConfig"}

View logs

To view transfer logs, do the following:

Google Cloud console

  1. Go to the Google Cloud navigation menu and select Logging > Logs Explorer:

    Go to the Logs Explorer

  2. Select a Google Cloud project.

  3. From the Upgrade menu, switch from Legacy Logs Viewer to Logs Explorer.

  4. To filter your logs to show only Storage Transfer Service entries, type storage_transfer_job into the query field and click Run query.

  5. In the Query results pane, click Edit time to change the time period for which to return results.

For more information on using the Logs Explorer, see Using the Logs Explorer.
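
To narrow the results to a single job, you can also filter on the job ID label described later on this page. A minimal sketch of such a query, where JOB_ID is a placeholder for your job's numeric ID:

resource.type="storage_transfer_job"
resource.labels.job_id="transferJobs/JOB_ID"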

gcloud CLI

To use the gcloud CLI to search for Storage Transfer Service logs, use the gcloud logging read command.

Specify a filter to limit your results to Storage Transfer Service logs.

gcloudloggingread"resource.type=storage_transfer_job"

Cloud Logging API

Use the entries.list Cloud Logging API method.

To filter your results to include only Storage Transfer Service-related entries, use the filter field. A sample JSON request object follows.

{"resourceNames":["projects/my-project-name"],"orderBy":"timestamp desc","filter":"resource.type=\"storage_transfer_job\""}

Transfer log format

The following section describes the fields for Storage Transfer Service logs.

All Storage Transfer Service-specific fields are contained within a jsonPayload object.

FIND actions

{"jsonPayload":{"@type":"type.googleapis.com/google.storagetransfer.logging.TransferActivityLog","action":"FIND","completeTime":"2021-12-16T18:58:49.344509695Z","destinationContainer":{"gcsBucket":{"bucket":"my-bucket-2",},"type":"GCS",},"operation":"transferOperations/transferJobs-7876027868280507149--3019866490856027148","sourceContainer":{"gcsBucket":{"bucket":"my-bucket-1"},"type":"GCS"},"status":{"statusCode":"OK"}}}

COPY and DELETE actions

{"jsonPayload":{"@type":"type.googleapis.com/google.storagetransfer.logging.TransferActivityLog","action":"COPY","completeTime":"2021-12-16T18:59:00.510509049Z","destinationObject":{"gcsObject":{"bucket":"my-bucket-2","objectKey":"README.md"},"type":"GCS",},"operation":"transferOperations/transferJobs-7876027868280507149--3019866490856027148","sourceObject":{"gcsObject":{"bucket":"my-bucket-1","lastModifiedTime":"2021-12-07T16:41:09.456Z","md5":"WgnCOIdfCXNTUDpQJSKb2w==","objectKey":"README.md",},"type":"GCS",},"status":{"statusCode":"OK"}}}
These log entries contain the following fields:

@type

The value is always type.googleapis.com/google.storagetransfer.logging.TransferActivityLog.
action

Describes the action of this particular task. One of the following:

  • FIND: Finding work to do, such as listing files in a directory or listing objects in a bucket. Not reported for agent-based transfers.
  • COPY: Copying files or objects to Cloud Storage.
  • DELETE: Deleting files or objects at the source, destination, or intermediary bucket.
findAction

Specifies whether the subject of the find action was an object or a managed folder.

completeTime

The ISO 8601-compliant timestamp at which the operation completed.
destinationContainer

Only present for FIND operations. FIND operations are not logged for agent-based transfers.

The destination container for this transfer. Contains two sub-fields:

  • gcsBucket.bucket: The destination Cloud Storage bucket name.
  • type: Always GCS.
destinationObject

Only present for COPY and DELETE operations.

Information about the object at the destination. Contains two sub-fields:

  • One of gcsObject, gcsManagedFolder, or posixFile, depending on the destination. All options contain multiple sub-fields that specify name, location, date/time info, and the object or file's hash.
  • type is one of GCS or POSIX_FS.

For example:

"destinationObject":{"type":"POSIX_FS","posixFile":{"crc32c":"0","path":"/tmp/data/filename.txt","lastModifiedTime":"2022-09-22T04:33:45Z"}}
operation

The fully qualified transferOperations name.
sourceContainer

Only present for FIND operations. FIND operations are not logged for agent-based transfers.

The source container for this transfer. Contains two sub-fields:

  • An entry specifying the source location. The field is named according to the source type. Possible fields are as follows.
    • awsS3Bucket.bucket: The AWS S3 bucket name.
    • azureBlobContainer: Contains sub-fields account and container, which together define the Microsoft Azure Blob Storage URI.
    • gcsBucket.bucket: The Cloud Storage bucket name.
    • httpManifest.url: The URL of a URL list that specifies publicly available files to download from an HTTP(S) server.
  • type is one of AWS_S3, AZURE_BLOB, GCS, or HTTP.

For example:

"sourceContainer":{"gcsBucket":{"bucket":"my-bucket-1"}type:"GCS"}
sourceObject

Only present for COPY and DELETE operations.

Information about the source object. Contains two sub-fields:

  • An entry specific to the source object's host. The field is named according to the source type and contains subfields for metadata. Possible fields are as follows.
    • awsS3Object: An AWS S3 object.
    • azureBlob: A file in Azure Blob Storage.
    • gcsObject: A Cloud Storage object.
    • gcsManagedFolder: A Cloud Storage managed folder.
    • httpFile: A file specified by a URL list.
    • posixFile: A file on a POSIX file system.
  • type is one of AWS_S3, AZURE_BLOB, GCS, HTTP, or POSIX_FS.

For example:

"sourceObject":{"gcsObject":{"bucket":"my-bucket-1""lastModifiedTime":"2021-12-07T16:41:09.456Z""md5":"WgnCOIdfCXNTUDpQJSKb2w==""objectKey":"README.md"}type:"GCS"}
status

The status of the action. If status.statusCode is OK, the action succeeded. Otherwise, the action failed. The status.errorType and status.errorMessage fields are only populated if the status is not OK.

In addition, the top-level resource field contains the following fields.

"resource":{"labels":{"job_id":"transferJobs/7876027868280507149""project_id":"my-project-id"}"type":"storage_transfer_job"}
  • resource.labels.job_id: The Storage Transfer Service job name to which this log belongs.
  • resource.labels.project_id: The Google Cloud project ID for this transfer.
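
These fields are also useful for building filters. For example, the following is a minimal sketch of a gcloud logging read command that returns only failed actions, assuming the jsonPayload format shown above:

gcloud logging read \
  'resource.type=storage_transfer_job AND jsonPayload.status.statusCode!="OK"' \
  --limit=50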
