Streaming uploads
Cloud Storage supports streaming data to a bucket without requiring that the data first be saved to a file. This is useful when you want to upload data but don't know the final size at the start of the upload, such as when generating the upload data from a process, or when compressing an object on-the-fly.
Using checksum validation when streaming
Because a checksum can only be supplied in the initial request of an upload, it's often not feasible to use Cloud Storage's checksum validation when streaming. It's recommended that you always use checksum validation, and you can manually do so after a streaming upload completes; however, validating after the transfer completes means that any corrupted data is accessible during the time it takes to confirm the corruption and remove it.
If you require checksum validation prior to the upload completing and the data becoming accessible, then you shouldn't use a streaming upload. You should use a different upload option that performs checksum validation prior to finalizing the object.
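If you do validate manually after a streaming upload, the following is a minimal Python sketch of one way to do so, assuming the google-cloud-storage and google-crc32c packages; the function name and arguments are hypothetical.

import base64

import google_crc32c
from google.cloud import storage


def stream_and_verify(bucket_name, blob_name, data_chunks):
    """Streams chunks to an object, then verifies its CRC32C after upload."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)

    # Compute a local CRC32C while streaming the data to Cloud Storage.
    checksum = google_crc32c.Checksum()
    with blob.open("wb") as writer:
        for chunk in data_chunks:
            checksum.update(chunk)
            writer.write(chunk)

    # The stored object's crc32c metadata is the base64-encoded, big-endian
    # CRC32C digest; compare it with the locally computed value.
    blob.reload()
    local_crc32c = base64.b64encode(checksum.digest()).decode("utf-8")
    if blob.crc32c != local_crc32c:
        # The corrupted object is accessible until this point; remove it.
        blob.delete()
        raise ValueError(f"CRC32C mismatch for {blob_name}; object deleted.")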
Required roles
To get the permissions that you need to stream uploads, ask your administrator to grant you one of the following roles:
- For uploads that include an Object Retention Lock, ask your administrator to grant you the Storage Object Admin (roles/storage.objectAdmin) IAM role for the bucket.
- For all other cases, ask your administrator to grant you the Storage Object User (roles/storage.objectUser) IAM role for the bucket.
These predefined roles contain the permissions required to stream uploads to Cloud Storage. To see the exact permissions that are required, expand the Required permissions section:
Required permissions
- storage.objects.create
- storage.objects.delete - This permission is only required for uploads that overwrite an existing object.
- storage.objects.list - This permission is only required for using the Google Cloud CLI to perform the instructions on this page.
- storage.objects.setRetention - This permission is only required for uploads that include an Object Retention Lock.
You can also get these permissions with other predefined roles or custom roles.
For information about granting roles on buckets, see Set and manage IAM policies on buckets.
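For example, one way an administrator can grant the Storage Object User role at the bucket level is with the following gcloud command; the bucket name and user email are placeholders:

gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member=user:USER_EMAIL \
  --role=roles/storage.objectUser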
Stream an upload
The following examples show how to perform a streaming upload from a process to a Cloud Storage object:
Console
The Google Cloud console does not support streaming uploads. Use the gcloud CLI instead.
Command line
Pipe the data to the gcloud storage cp command and use a dash for the source URL:

PROCESS_NAME | gcloud storage cp - gs://BUCKET_NAME/OBJECT_NAME

Where:

- PROCESS_NAME is the name of the process from which you are collecting data. For example, collect_measurements.
- BUCKET_NAME is the name of the bucket containing the object. For example, my_app_bucket.
- OBJECT_NAME is the name of the object that is created from the data. For example, data_measurements.
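For example, using the sample values above, the command is:

collect_measurements | gcloud storage cp - gs://my_app_bucket/data_measurements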
Client libraries
For more information, see the Cloud Storage API reference documentation for your language: C++, C#, Go, Java, Node.js, PHP, Python, or Ruby. To authenticate to Cloud Storage, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

C++
namespace gcs = ::google::cloud::storage;
using ::google::cloud::StatusOr;
[](gcs::Client client, std::string const& bucket_name,
   std::string const& object_name, int desired_line_count) {
  std::string const text = "Lorem ipsum dolor sit amet";
  gcs::ObjectWriteStream stream =
      client.WriteObject(bucket_name, object_name);

  for (int lineno = 0; lineno != desired_line_count; ++lineno) {
    // Add 1 to the counter, because it is conventional to number lines
    // starting at 1.
    stream << (lineno + 1) << ": " << text << "\n";
  }

  stream.Close();

  StatusOr<gcs::ObjectMetadata> metadata = std::move(stream).metadata();
  if (!metadata) throw std::move(metadata).status();
  std::cout << "Successfully wrote to object " << metadata->name()
            << " its size is: " << metadata->size()
            << "\nFull metadata: " << *metadata << "\n";
}

C#
using Google.Cloud.Storage.V1;
using System;
using System.IO;

public class UploadFileSample
{
    public void UploadFile(
        string bucketName = "your-unique-bucket-name",
        string localPath = "my-local-path/my-file-name",
        string objectName = "my-file-name")
    {
        var storage = StorageClient.Create();
        using var fileStream = File.OpenRead(localPath);
        storage.UploadObject(bucketName, objectName, null, fileStream);
        Console.WriteLine($"Uploaded {objectName}.");
    }
}

Go
import("bytes""context""fmt""io""time""cloud.google.com/go/storage")// streamFileUpload uploads an object via a stream.funcstreamFileUpload(wio.Writer,bucket,objectstring)error{// bucket := "bucket-name"// object := "object-name"ctx:=context.Background()client,err:=storage.NewClient(ctx)iferr!=nil{returnfmt.Errorf("storage.NewClient: %w",err)}deferclient.Close()b:=[]byte("Hello world.")buf:=bytes.NewBuffer(b)ctx,cancel:=context.WithTimeout(ctx,time.Second*50)defercancel()// Upload an object with storage.Writer.wc:=client.Bucket(bucket).Object(object).NewWriter(ctx)wc.ChunkSize=0// note retries are not supported for chunk size 0.if_,err=io.Copy(wc,buf);err!=nil{returnfmt.Errorf("io.Copy: %w",err)}// Data can continue to be added to the file until the writer is closed.iferr:=wc.Close();err!=nil{returnfmt.Errorf("Writer.Close: %w",err)}fmt.Fprintf(w,"%v uploaded to %v.\n",object,bucket)returnnil}Java
import com.google.cloud.WriteChannel;
import com.google.cloud.storage.BlobId;
import com.google.cloud.storage.BlobInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class StreamObjectUpload {

  public static void streamObjectUpload(
      String projectId, String bucketName, String objectName, String contents) throws IOException {
    // The ID of your GCP project
    // String projectId = "your-project-id";

    // The ID of your GCS bucket
    // String bucketName = "your-unique-bucket-name";

    // The ID of your GCS object
    // String objectName = "your-object-name";

    // The string of contents you wish to upload
    // String contents = "Hello world!";

    Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
    BlobId blobId = BlobId.of(bucketName, objectName);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).build();
    byte[] content = contents.getBytes(StandardCharsets.UTF_8);
    try (WriteChannel writer = storage.writer(blobInfo)) {
      writer.write(ByteBuffer.wrap(content));
      System.out.println(
          "Wrote to " + objectName + " in bucket " + bucketName + " using a WriteChannel.");
    }
  }
}

Node.js
/**
 * TODO(developer): Uncomment the following lines before running the sample
 */
// The ID of your GCS bucket
// const bucketName = 'your-unique-bucket-name';

// The new ID for your GCS file
// const destFileName = 'your-new-file-name';

// The content to be uploaded in the GCS file
// const contents = 'your file content';

// Imports the Google Cloud client library
const {Storage} = require('@google-cloud/storage');

// Import Node.js stream
const stream = require('stream');

// Creates a client
const storage = new Storage();

// Get a reference to the bucket
const myBucket = storage.bucket(bucketName);

// Create a reference to a file object
const file = myBucket.file(destFileName);

// Create a pass through stream from a string
const passthroughStream = new stream.PassThrough();
passthroughStream.write(contents);
passthroughStream.end();

async function streamFileUpload() {
  passthroughStream.pipe(file.createWriteStream()).on('finish', () => {
    // The file upload is complete
  });

  console.log(`${destFileName} uploaded to ${bucketName}`);
}

streamFileUpload().catch(console.error);

PHP
use Google\Cloud\Storage\StorageClient;
use Google\Cloud\Storage\WriteStream;

/**
 * Upload a chunked file stream.
 *
 * @param string $bucketName The name of your Cloud Storage bucket.
 *        (e.g. 'my-bucket')
 * @param string $objectName The name of your Cloud Storage object.
 *        (e.g. 'my-object')
 * @param string $contents The contents to upload via stream chunks.
 *        (e.g. 'these are my contents')
 */
function upload_object_stream(string $bucketName, string $objectName, string $contents): void
{
    $storage = new StorageClient();
    $bucket = $storage->bucket($bucketName);
    $writeStream = new WriteStream(null, [
        'chunkSize' => 1024 * 256, // 256KB
    ]);
    $uploader = $bucket->getStreamableUploader($writeStream, [
        'name' => $objectName,
    ]);
    $writeStream->setUploader($uploader);
    $stream = fopen('data://text/plain,' . $contents, 'r');
    while (($line = stream_get_line($stream, 1024 * 256)) !== false) {
        $writeStream->write($line);
    }
    $writeStream->close();

    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, $contents, $bucketName, $objectName);
}

Python
from google.cloud import storage


def upload_blob_from_stream(bucket_name, file_obj, destination_blob_name):
    """Uploads bytes from a stream or other file-like object to a blob."""
    # The ID of your GCS bucket
    # bucket_name = "your-bucket-name"

    # The stream or file (file-like object) from which to read
    # import io
    # file_obj = io.BytesIO()
    # file_obj.write(b"This is test data.")

    # The desired name of the uploaded GCS object (blob)
    # destination_blob_name = "storage-object-name"

    # Construct a client-side representation of the blob.
    storage_client = storage.Client()
    bucket = storage_client.bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)

    # Rewind the stream to the beginning. This step can be omitted if the input
    # stream will always be at a correct position.
    file_obj.seek(0)

    # Upload data from the stream to your bucket.
    blob.upload_from_file(file_obj)

    print(
        f"Stream data uploaded to {destination_blob_name} in bucket {bucket_name}."
    )

Ruby
# The ID of your GCS bucket
# bucket_name = "your-unique-bucket-name"

# The stream or file (file-like object) from which to read
# local_file_obj = StringIO.new "This is test data."

# Name of a file in the Storage bucket
# file_name = "some_file.txt"

require "google/cloud/storage"

storage = Google::Cloud::Storage.new
bucket = storage.bucket bucket_name

local_file_obj.rewind
bucket.create_file local_file_obj, file_name

puts "Stream data uploaded to #{file_name} in bucket #{bucket_name}"
REST APIs
JSON API
To perform a streaming upload, use one of the following methods:
- A resumable upload, with the following adjustments:
  - When uploading the file data itself, use a multiple chunk upload.
  - Since you don't know the total file size until you get to the final chunk, use a * for the total file size in the Content-Range header of intermediate chunks. For example, if the first chunk you upload has a size of 512 KiB, the Content-Range header for the chunk is bytes 0-524287/*. If your upload has 64000 bytes remaining after the first chunk, you then send a final chunk that contains the remaining bytes and has a Content-Range header with the value bytes 524288-588287/588288. A minimal Python sketch of this flow appears after this list.
- A single-request upload, with the following adjustments:
  - Include a Transfer-Encoding: chunked header, and exclude the Content-Length header.
  - Construct the request according to the specification, sending the object data in chunks as it becomes available.
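The following is a minimal Python sketch of the resumable, chunked flow described above, using the requests library. The access token, bucket, object, and chunk source are hypothetical placeholders, and error handling is kept to the essentials.

import requests

ACCESS_TOKEN = "OAUTH2_ACCESS_TOKEN"  # hypothetical placeholder
BUCKET_NAME = "my_app_bucket"
OBJECT_NAME = "data_measurements"
CHUNK_SIZE = 256 * 1024  # intermediate chunks must be multiples of 256 KiB

# Start a resumable upload session and capture the session URI.
start = requests.post(
    f"https://storage.googleapis.com/upload/storage/v1/b/{BUCKET_NAME}/o",
    params={"uploadType": "resumable", "name": OBJECT_NAME},
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
start.raise_for_status()
session_uri = start.headers["Location"]


def upload_stream(stream):
    """Sends the stream in chunks, using * until the total size is known."""
    offset = 0
    chunk = stream.read(CHUNK_SIZE)
    while chunk:
        next_chunk = stream.read(CHUNK_SIZE)
        last_byte = offset + len(chunk) - 1
        # Intermediate chunks use *; the final chunk reports the real total.
        total = "*" if next_chunk else str(offset + len(chunk))
        response = requests.put(
            session_uri,
            data=chunk,
            headers={"Content-Range": f"bytes {offset}-{last_byte}/{total}"},
        )
        # Intermediate chunks return 308; the final chunk returns 200 or 201.
        if response.status_code not in (200, 201, 308):
            response.raise_for_status()
        offset += len(chunk)
        chunk = next_chunk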
XML API
To perform a streaming upload, use one of the following methods:
- A resumable upload, with the following adjustments:
  - When uploading the file data itself, use a multiple chunk upload.
  - Since you don't know the total file size until you get to the final chunk, use a * for the total file size in the Content-Range header of intermediate chunks. For example, if the first chunk you upload has a size of 512 KiB, the Content-Range header for the chunk is bytes 0-524287/*. If your upload has 64000 bytes remaining after the first chunk, you then send a final chunk that contains the remaining bytes and has a Content-Range header with the value bytes 524288-588287/588288.
- A single-request upload, with the following adjustments:
  - Include a Transfer-Encoding: chunked header, and exclude the Content-Length header.
  - Construct the request according to the specification, sending the object data in chunks as it becomes available. A minimal Python sketch of this option appears after this list.

Note that you cannot perform a streaming upload using this method if the request uses a signature in its Authorization header.
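As an illustration of the single-request option, here is a minimal Python sketch that relies on the requests library sending a generator body with Transfer-Encoding: chunked and no Content-Length header; the access token, names, and data generator are hypothetical placeholders.

import requests

ACCESS_TOKEN = "OAUTH2_ACCESS_TOKEN"  # hypothetical placeholder
BUCKET_NAME = "my_app_bucket"
OBJECT_NAME = "data_measurements"


def generate_data():
    # Yield object data in chunks as it becomes available, for example from
    # another process or an on-the-fly compressor.
    for i in range(10):
        yield f"measurement {i}\n".encode("utf-8")


# A generator body causes requests to send the data with
# Transfer-Encoding: chunked and to omit the Content-Length header.
response = requests.put(
    f"https://storage.googleapis.com/{BUCKET_NAME}/{OBJECT_NAME}",
    data=generate_data(),
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
)
response.raise_for_status()

Because this sketch authenticates with a bearer token rather than a signature in the Authorization header, it stays within the restriction noted above.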
What's next
- Stream a download.
- Learn more about decompressive transcoding.
- Learn more about uploads and downloads.