Event-driven transfers from AWS S3

Storage Transfer Service can listen to event notifications in AWS to automatically transfer data that has been added or updated in the source location into a Cloud Storage bucket. Learn more about the benefits of event-driven transfers.

Event-driven transfers listen to Amazon S3 Event Notifications sent to Amazon SQS to know when objects in the source bucket have been modified or added. Object deletions are not detected; deleting an object at the source does not delete the associated object in the destination bucket.

Event-driven transfers always use a Cloud Storage bucket as the destination.

Before you begin

Follow the instructions to grant the required permissions on your destination Cloud Storage bucket.

Create an SQS queue

  1. In the AWS console, go to the Simple Queue Service page.

  2. Click Create queue.

  3. Enter a Name for this queue.

  4. In the Access policy section, select Advanced. A JSON object is displayed:

    {"Version":"2008-10-17","Id":"__default_policy_ID","Statement":[{"Sid":"__owner_statement","Effect":"Allow","Principal":{"AWS":"01234567890"},"Action":["SQS:*"],"Resource":"arn:aws:sqs:us-west-2:01234567890:test"}]}

    The values of AWS and Resource are unique for each project.

  5. Copy your specific values of AWS and Resource from the displayed JSON into the following JSON snippet:

    {"Version":"2012-10-17","Id":"example-ID","Statement":[{"Sid":"example-statement-ID","Effect":"Allow","Principal":{"Service":"s3.amazonaws.com"},"Action":"SQS:SendMessage","Resource":"RESOURCE","Condition":{"StringEquals":{"aws:SourceAccount":"AWS"},"ArnLike":{"aws:SourceArn":"S3_BUCKET_ARN"}}}]}

    The values of the placeholders in the preceding JSON use the following format:

    • AWS is a numeric value representing your Amazon Web Services account. For example, "aws:SourceAccount": "1234567890".
    • RESOURCE is an Amazon Resource Name (ARN) that identifies this queue. For example, "Resource": "arn:aws:sqs:us-west-2:01234567890:test".
    • S3_BUCKET_ARN is an ARN that identifies the source bucket. For example, "aws:SourceArn": "arn:aws:s3:::example-aws-bucket". You can find a bucket's ARN from the Properties tab of the bucket details page in the AWS console.

  6. Replace the JSON displayed in the Access policy section with the updated JSON above.

  7. Click Create queue.

Once complete, note the Amazon Resource Name (ARN) of the queue. The ARN has the following format:

arn:aws:sqs:us-east-1:1234567890:event-queue
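The console steps above can also be scripted. This is a minimal sketch using boto3, assuming AWS credentials are configured in the environment; the helper names (`build_queue_access_policy`, `create_event_queue`) are hypothetical, not part of any product API:

```python
import json


def build_queue_access_policy(queue_arn: str, aws_account_id: str, s3_bucket_arn: str) -> dict:
    """Build the access policy from step 5: allow S3 to send messages to the queue."""
    return {
        "Version": "2012-10-17",
        "Id": "example-ID",
        "Statement": [
            {
                "Sid": "example-statement-ID",
                "Effect": "Allow",
                "Principal": {"Service": "s3.amazonaws.com"},
                "Action": "SQS:SendMessage",
                "Resource": queue_arn,
                "Condition": {
                    "StringEquals": {"aws:SourceAccount": aws_account_id},
                    "ArnLike": {"aws:SourceArn": s3_bucket_arn},
                },
            }
        ],
    }


def create_event_queue(queue_name: str, aws_account_id: str, s3_bucket_arn: str) -> str:
    """Create the queue, attach the policy, and return the queue's ARN."""
    import boto3  # requires AWS credentials in the environment

    sqs = boto3.client("sqs")
    queue_url = sqs.create_queue(QueueName=queue_name)["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]
    policy = build_queue_access_policy(queue_arn, aws_account_id, s3_bucket_arn)
    sqs.set_queue_attributes(
        QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)}
    )
    return queue_arn
```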

Enable notifications on your S3 bucket

  1. In the AWS console, go to the S3 page.

  2. In the Buckets list, select your source bucket.

  3. Select the Properties tab.

  4. In the Event notifications section, click Create event notification.

  5. Specify a name for this event.

  6. In the Event types section, select All object create events.

  7. As the Destination, select SQS queue and select the queue you created for this transfer.

  8. Click Save changes.
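The same notification configuration can be applied programmatically. A minimal sketch with boto3, assuming credentials are configured in the environment; `build_notification_config` and `enable_bucket_notifications` are hypothetical helper names:

```python
def build_notification_config(queue_arn: str) -> dict:
    """Notification configuration matching the console steps:
    send all object create events to the SQS queue."""
    return {
        "QueueConfigurations": [
            {
                "QueueArn": queue_arn,
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    }


def enable_bucket_notifications(bucket_name: str, queue_arn: str) -> None:
    """Attach the notification configuration to the source bucket."""
    import boto3  # requires AWS credentials in the environment

    s3 = boto3.client("s3")
    s3.put_bucket_notification_configuration(
        Bucket=bucket_name,
        NotificationConfiguration=build_notification_config(queue_arn),
    )
```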

Configure permissions

Follow the instructions in Configure access to a source: Amazon S3 to create either an access key ID and secret key, or a Federated Identity role.

Replace the custom permissions JSON with the following:

{"Version":"2012-10-17","Statement":[{"Effect":"Allow","Action":["sqs:DeleteMessage","sqs:ChangeMessageVisibility","sqs:ReceiveMessage","s3:GetObject","s3:ListBucket"],"Resource":["arn:aws:s3:::S3_BUCKET_NAME","arn:aws:s3:::S3_BUCKET_NAME/*","AWS_QUEUE_ARN"]}]}

Once created, note the following information:

  • For a user, note the access key ID and secret key.
  • For a Federated Identity role, note the Amazon Resource Name (ARN), which has the format arn:aws:iam::AWS_ACCOUNT:role/ROLE_NAME.
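The permissions JSON above can be generated for a given bucket and queue with a small helper. A sketch with a hypothetical function name; it only builds the policy document, leaving it to you to attach the policy to your AWS user or role:

```python
def build_transfer_permissions_policy(s3_bucket_name: str, queue_arn: str) -> dict:
    """Build the custom permissions policy: read access to the source bucket
    plus the SQS actions Storage Transfer Service needs on the queue."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "sqs:DeleteMessage",
                    "sqs:ChangeMessageVisibility",
                    "sqs:ReceiveMessage",
                    "s3:GetObject",
                    "s3:ListBucket",
                ],
                "Resource": [
                    f"arn:aws:s3:::{s3_bucket_name}",
                    f"arn:aws:s3:::{s3_bucket_name}/*",
                    queue_arn,
                ],
            }
        ],
    }
```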

Create a transfer job

You can use the REST API or the Google Cloud console to create an event-driven transfer job.

Cloud console

  1. Go to the Create transfer job page in the Google Cloud console.


  2. Select Amazon S3 as the source type, and Cloud Storage as the destination.

  3. As the Scheduling mode, select Event-driven and click Next step.

  4. Enter your S3 bucket name. The bucket name is the name as it appears in the AWS Management Console. For example, my-aws-bucket.

  5. Select your authentication method and enter the requested information, which you created and noted in the previous section.

  6. Enter the Amazon SQS queue ARN that you created earlier. It uses the following format:

    arn:aws:sqs:us-east-1:1234567890:event-queue

  7. Optionally, define any filters, then click Next step.

  8. Select the destination Cloud Storage bucket and, optionally, path.

  9. Optionally, enter a start and end time for the transfer. If you don't specify a time, the transfer starts immediately and runs until manually stopped.

  10. Specify any transfer options. More information is available from the Create transfers page.

  11. Click Create.

Once created, the transfer job starts running and an event listener waits for notifications on the SQS queue. The job details page shows one operation each hour, and includes details on data transferred for each job.

REST

To create an event-driven transfer using the REST API, send the following JSON object to the transferJobs.create endpoint:

{
  "description": "YOUR DESCRIPTION",
  "status": "ENABLED",
  "projectId": "PROJECT_ID",
  "transferSpec": {
    "awsS3DataSource": {
      "bucketName": "AWS_SOURCE_NAME",
      "roleArn": "arn:aws:iam::1234567891011:role/role_for_federated_auth"
    },
    "gcsDataSink": {
      "bucketName": "GCS_SINK_NAME"
    }
  },
  "eventStream": {
    "name": "arn:aws:sqs:us-east-1:1234567891011:s3-notification-queue",
    "eventStreamStartTime": "2022-12-02T01:00:00+00:00",
    "eventStreamExpirationTime": "2023-01-31T01:00:00+00:00"
  }
}

The eventStreamStartTime and eventStreamExpirationTime fields are optional. If the start time is omitted, the transfer starts immediately; if the end time is omitted, the transfer continues until manually stopped.
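As a sketch of the same REST call from Python, the request body can be built as a plain dict and posted with Application Default Credentials. The helper names are hypothetical, and this assumes the google-auth library is installed:

```python
def build_transfer_job_body(
    project_id: str,
    s3_bucket: str,
    role_arn: str,
    gcs_bucket: str,
    sqs_queue_arn: str,
    description: str = "Event-driven transfer from S3",
) -> dict:
    """Build a transferJobs.create request body mirroring the JSON example."""
    return {
        "description": description,
        "status": "ENABLED",
        "projectId": project_id,
        "transferSpec": {
            "awsS3DataSource": {"bucketName": s3_bucket, "roleArn": role_arn},
            "gcsDataSink": {"bucketName": gcs_bucket},
        },
        "eventStream": {"name": sqs_queue_arn},
    }


def create_transfer_job(body: dict) -> dict:
    """POST the body to the transferJobs.create endpoint."""
    import google.auth
    from google.auth.transport.requests import AuthorizedSession

    credentials, _ = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )
    session = AuthorizedSession(credentials)
    resp = session.post(
        "https://storagetransfer.googleapis.com/v1/transferJobs", json=body
    )
    resp.raise_for_status()
    return resp.json()
```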

Client libraries

Go

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Go API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"
	"os"

	storagetransfer "cloud.google.com/go/storagetransfer/apiv1"
	"cloud.google.com/go/storagetransfer/apiv1/storagetransferpb"
)

func createEventDrivenAWSTransfer(w io.Writer, projectID string, s3SourceBucket string, gcsSinkBucket string, sqsQueueARN string) (*storagetransferpb.TransferJob, error) {
	// Your Google Cloud Project ID.
	// projectID := "my-project-id"

	// The name of the source AWS S3 bucket.
	// s3SourceBucket := "my-source-bucket"

	// The name of the GCS bucket to transfer objects to.
	// gcsSinkBucket := "my-sink-bucket"

	// The Amazon Resource Name (ARN) of the AWS SQS queue to subscribe the event-driven transfer to.
	// sqsQueueARN := "arn:aws:sqs:us-east-1:1234567891011:s3-notification-queue"

	// The AWS access key credential; should be accessed via environment variable for security.
	awsAccessKeyID := os.Getenv("AWS_ACCESS_KEY_ID")

	// The AWS secret key credential; should be accessed via environment variable for security.
	awsSecretKey := os.Getenv("AWS_SECRET_ACCESS_KEY")

	ctx := context.Background()
	client, err := storagetransfer.NewClient(ctx)
	if err != nil {
		return nil, fmt.Errorf("storagetransfer.NewClient: %w", err)
	}
	defer client.Close()

	req := &storagetransferpb.CreateTransferJobRequest{
		TransferJob: &storagetransferpb.TransferJob{
			ProjectId: projectID,
			TransferSpec: &storagetransferpb.TransferSpec{
				DataSource: &storagetransferpb.TransferSpec_AwsS3DataSource{
					AwsS3DataSource: &storagetransferpb.AwsS3Data{
						BucketName: s3SourceBucket,
						AwsAccessKey: &storagetransferpb.AwsAccessKey{
							AccessKeyId:     awsAccessKeyID,
							SecretAccessKey: awsSecretKey,
						},
					},
				},
				DataSink: &storagetransferpb.TransferSpec_GcsDataSink{
					GcsDataSink: &storagetransferpb.GcsData{BucketName: gcsSinkBucket},
				},
			},
			EventStream: &storagetransferpb.EventStream{Name: sqsQueueARN},
			Status:      storagetransferpb.TransferJob_ENABLED,
		},
	}

	resp, err := client.CreateTransferJob(ctx, req)
	if err != nil {
		return nil, fmt.Errorf("failed to create transfer job: %w", err)
	}

	fmt.Fprintf(w, "Created an event driven transfer job from %v to %v subscribed to %v with name %v", s3SourceBucket, gcsSinkBucket, sqsQueueARN, resp.Name)
	return resp, nil
}

Java

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Java API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import com.google.storagetransfer.v1.proto.StorageTransferServiceClient;
import com.google.storagetransfer.v1.proto.TransferProto;
import com.google.storagetransfer.v1.proto.TransferTypes;

public class CreateEventDrivenAwsTransfer {
  public static void main(String[] args) throws Exception {
    // Your Google Cloud Project ID
    String projectId = "your-project-id";

    // The name of the source AWS bucket to transfer data from
    String s3SourceBucket = "yourS3SourceBucket";

    // The name of the GCS bucket to transfer data to
    String gcsSinkBucket = "your-gcs-bucket";

    // The ARN of the SQS queue to subscribe to
    String sqsQueueArn = "arn:aws:sqs:us-east-1:1234567891011:s3-notification-queue";

    createEventDrivenAwsTransfer(projectId, s3SourceBucket, gcsSinkBucket, sqsQueueArn);
  }

  public static void createEventDrivenAwsTransfer(
      String projectId, String s3SourceBucket, String gcsSinkBucket, String sqsQueueArn)
      throws Exception {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources,
    // or use a try-with-resources statement to do this automatically.
    try (StorageTransferServiceClient storageTransfer = StorageTransferServiceClient.create()) {
      // The ID used to access your AWS account. Should be accessed via environment variable.
      String awsAccessKeyId = System.getenv("AWS_ACCESS_KEY_ID");

      // The Secret Key used to access your AWS account. Should be accessed via environment
      // variable.
      String awsSecretAccessKey = System.getenv("AWS_SECRET_ACCESS_KEY");

      TransferTypes.TransferJob transferJob =
          TransferTypes.TransferJob.newBuilder()
              .setProjectId(projectId)
              .setTransferSpec(
                  TransferTypes.TransferSpec.newBuilder()
                      .setAwsS3DataSource(
                          TransferTypes.AwsS3Data.newBuilder()
                              .setBucketName(s3SourceBucket)
                              .setAwsAccessKey(
                                  TransferTypes.AwsAccessKey.newBuilder()
                                      .setAccessKeyId(awsAccessKeyId)
                                      .setSecretAccessKey(awsSecretAccessKey))
                              .build())
                      .setGcsDataSink(
                          TransferTypes.GcsData.newBuilder().setBucketName(gcsSinkBucket)))
              .setStatus(TransferTypes.TransferJob.Status.ENABLED)
              .setEventStream(TransferTypes.EventStream.newBuilder().setName(sqsQueueArn).build())
              .build();

      TransferTypes.TransferJob response =
          storageTransfer.createTransferJob(
              TransferProto.CreateTransferJobRequest.newBuilder()
                  .setTransferJob(transferJob)
                  .build());

      System.out.println(
          "Created a transfer job from "
              + s3SourceBucket
              + " to "
              + gcsSinkBucket
              + " subscribed to "
              + sqsQueueArn
              + " with name "
              + response.getName());
    }
  }
}

Node.js

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Node.js API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud client library
const {
  StorageTransferServiceClient,
} = require('@google-cloud/storage-transfer');

/**
 * TODO(developer): Uncomment the following lines before running the sample.
 */
// The ID of the Google Cloud Platform Project that owns the job
// projectId = 'my-project-id'

// AWS S3 source bucket name
// s3SourceBucket = 'my-s3-source-bucket'

// Google Cloud Storage destination bucket name
// gcsSinkBucket = 'my-gcs-destination-bucket'

// The ARN of the SQS queue to subscribe to
// sqsQueueArn = 'arn:aws:sqs:us-east-1:1234567891011:s3-notification-queue'

// AWS Access Key ID. Should be accessed via environment variable for security.
// awsAccessKeyId = 'AKIA...'

// AWS Secret Access Key. Should be accessed via environment variable for security.
// awsSecretAccessKey = 'HEAoMK2.../...ku8'

// Creates a client
const client = new StorageTransferServiceClient();

/**
 * Creates an event driven transfer that tracks an SQS queue.
 */
async function createEventDrivenAwsTransfer() {
  const [transferJob] = await client.createTransferJob({
    transferJob: {
      projectId,
      status: 'ENABLED',
      transferSpec: {
        awsS3DataSource: {
          bucketName: s3SourceBucket,
          awsAccessKey: {
            accessKeyId: awsAccessKeyId,
            secretAccessKey: awsSecretAccessKey,
          },
        },
        gcsDataSink: {
          bucketName: gcsSinkBucket,
        },
      },
      eventStream: {
        name: sqsQueueArn,
      },
    },
  });
  console.log(
    `Created an event driven transfer from '${s3SourceBucket}' to '${gcsSinkBucket}' with name ${transferJob.name}`
  );
}

createEventDrivenAwsTransfer();

Python

To learn how to install and use the client library for Storage Transfer Service, see Storage Transfer Service client libraries. For more information, see the Storage Transfer Service Python API reference documentation.

To authenticate to Storage Transfer Service, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

from google.cloud import storage_transfer


def create_event_driven_aws_transfer(
    project_id: str,
    description: str,
    source_s3_bucket: str,
    sink_gcs_bucket: str,
    sqs_queue_arn: str,
    aws_access_key_id: str,
    aws_secret_access_key: str,
):
    """Create an event driven transfer from an AWS S3 bucket to a GCS bucket
    that tracks an AWS SQS queue."""

    client = storage_transfer.StorageTransferServiceClient()

    # The ID of the Google Cloud Platform Project that owns the job
    # project_id = 'my-project-id'

    # A description of this job
    # description = 'Creates an event-driven transfer that tracks an SQS queue'

    # AWS S3 source bucket name
    # source_s3_bucket = 'my-s3-source-bucket'

    # Google Cloud Storage destination bucket name
    # sink_gcs_bucket = 'my-gcs-destination-bucket'

    # The ARN of the SQS queue to subscribe to
    # sqs_queue_arn = 'arn:aws:sqs:us-east-1:1234567891011:s3-notification-queue'

    # AWS Access Key ID. Should be accessed via environment variable for security purposes.
    # aws_access_key_id = 'AKIA...'

    # AWS Secret Access Key. Should be accessed via environment variable for security purposes.
    # aws_secret_access_key = 'HEAoMK2.../...ku8'

    transfer_job_request = storage_transfer.CreateTransferJobRequest(
        {
            "transfer_job": {
                "project_id": project_id,
                "description": description,
                "status": storage_transfer.TransferJob.Status.ENABLED,
                "transfer_spec": {
                    "aws_s3_data_source": {
                        "bucket_name": source_s3_bucket,
                        "aws_access_key": {
                            "access_key_id": aws_access_key_id,
                            "secret_access_key": aws_secret_access_key,
                        },
                    },
                    "gcs_data_sink": {
                        "bucket_name": sink_gcs_bucket,
                    },
                },
                "event_stream": {
                    "name": sqs_queue_arn,
                },
            },
        }
    )

    result = client.create_transfer_job(transfer_job_request)
    print(f"Created transferJob: {result.name}")

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-19 UTC.