Process images from Cloud Storage tutorial

This tutorial demonstrates using Cloud Run, the Cloud Vision API, and ImageMagick to detect and blur offensive images uploaded to a Cloud Storage bucket. This tutorial builds on the tutorial Use Pub/Sub with Cloud Run.

This tutorial walks through modifying an existing sample app. You can also download the completed sample if you want.

Objectives

  • Write, build, and deploy an asynchronous data processing service to Cloud Run.
  • Invoke the service by uploading a file to Cloud Storage, creating a Pub/Sub message.
  • Use the Cloud Vision API to detect violent or adult content.
  • Use ImageMagick to blur offensive images.
  • Test the service by uploading an image of a flesh-eating zombie.

Costs

In this document, you use billable components of Google Cloud, including Cloud Run, Cloud Storage, Pub/Sub, the Cloud Vision API, Artifact Registry, and Cloud Build.

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  3. Verify that billing is enabled for your Google Cloud project.

  4. Enable the Artifact Registry, Cloud Build, Pub/Sub, Cloud Run, Cloud Storage, and Cloud Vision APIs.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.

    Enable the APIs

  5. Install and initialize the gcloud CLI.
  6. Update components:
    gcloud components update
  7. Set up a Pub/Sub topic, a secure push subscription, and an initial Cloud Run service to handle messages by following the Use Pub/Sub tutorial. A condensed sketch of that setup follows this list.
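If you want a quick reminder of the shape of that prerequisite setup, the following gcloud sketch shows it in outline. It is not a substitute for the Use Pub/Sub tutorial: the topic name myRunTopic matches the topic referenced later in this tutorial, but the subscription name, the invoker service-account name, and the SERVICE_URL and PROJECT_ID placeholders are illustrative assumptions.

# Create the Pub/Sub topic (myRunTopic is reused later in this tutorial).
gcloud pubsub topics create myRunTopic

# Create a service account that Pub/Sub uses to invoke the service
# (the account name here is illustrative).
gcloud iam service-accounts create cloud-run-pubsub-invoker \
    --display-name "Cloud Run Pub/Sub Invoker"

# Allow that service account to invoke the Cloud Run service deployed in the
# Use Pub/Sub tutorial.
gcloud run services add-iam-policy-binding pubsub-tutorial \
    --member=serviceAccount:cloud-run-pubsub-invoker@PROJECT_ID.iam.gserviceaccount.com \
    --role=roles/run.invoker

# Create a push subscription that authenticates to the service as that account.
gcloud pubsub subscriptions create myRunSubscription --topic myRunTopic \
    --push-endpoint=SERVICE_URL/ \
    --push-auth-service-account=cloud-run-pubsub-invoker@PROJECT_ID.iam.gserviceaccount.com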

Required roles

To get the permissions that you need to complete the tutorial, ask your administrator to grant you the following IAM roles on your project:

For more information about granting roles, see Manage access to projects, folders, and organizations.

You might also be able to get the required permissions through custom roles or other predefined roles.

Note: IAM basic roles might also contain permissions to complete the tutorial. You shouldn't grant basic roles in a production environment, but you can grant them in a development or test environment.

Setting up gcloud defaults

To configure gcloud with defaults for your Cloud Run service:

  1. Set your default project:

    gcloud config set project PROJECT_ID

    Replace PROJECT_ID with the ID of the project you created for this tutorial.

  2. Configure gcloud for your chosen region:

    gcloud config set run/region REGION

    Replace REGION with the supported Cloud Run region of your choice.

Cloud Run locations

Cloud Run is regional, which means the infrastructure that runs your Cloud Run services is located in a specific region and is managed by Google to be redundantly available across all the zones within that region.

Meeting your latency, availability, or durability requirements is the primary factor for selecting the region where your Cloud Run services run. You can generally select the region nearest to your users, but you should consider the location of the other Google Cloud products that are used by your Cloud Run service. Using Google Cloud products together across multiple locations can affect your service's latency as well as cost.

Cloud Run is available in the following regions:

Subject to Tier 1 pricing

  • asia-east1 (Taiwan)
  • asia-northeast1 (Tokyo)
  • asia-northeast2 (Osaka)
  • asia-south1 (Mumbai, India)
  • europe-north1 (Finland) Low CO2
  • europe-north2 (Stockholm) Low CO2
  • europe-southwest1 (Madrid) Low CO2
  • europe-west1 (Belgium) Low CO2
  • europe-west4 (Netherlands) Low CO2
  • europe-west8 (Milan)
  • europe-west9 (Paris) Low CO2
  • me-west1 (Tel Aviv)
  • northamerica-south1 (Mexico)
  • us-central1 (Iowa) Low CO2
  • us-east1 (South Carolina)
  • us-east4 (Northern Virginia)
  • us-east5 (Columbus)
  • us-south1 (Dallas) Low CO2
  • us-west1 (Oregon) Low CO2

Subject to Tier 2 pricing

  • africa-south1 (Johannesburg)
  • asia-east2 (Hong Kong)
  • asia-northeast3 (Seoul, South Korea)
  • asia-southeast1 (Singapore)
  • asia-southeast2 (Jakarta)
  • asia-south2 (Delhi, India)
  • australia-southeast1 (Sydney)
  • australia-southeast2 (Melbourne)
  • europe-central2 (Warsaw, Poland)
  • europe-west10 (Berlin)
  • europe-west12 (Turin)
  • europe-west2 (London, UK) Low CO2
  • europe-west3 (Frankfurt, Germany)
  • europe-west6 (Zurich, Switzerland) Low CO2
  • me-central1 (Doha)
  • me-central2 (Dammam)
  • northamerica-northeast1 (Montreal) Low CO2
  • northamerica-northeast2 (Toronto) Low CO2
  • southamerica-east1 (Sao Paulo, Brazil) Low CO2
  • southamerica-west1 (Santiago, Chile) Low CO2
  • us-west2 (Los Angeles)
  • us-west3 (Salt Lake City)
  • us-west4 (Las Vegas)

If you already created a Cloud Run service, you can view the region in the Cloud Run dashboard in the Google Cloud console.

Understanding the sequence of operations

The flow of data in this tutorial follows these steps:

  1. A user uploads an image to a Cloud Storage bucket.
  2. Cloud Storage publishes a message about the new file to Pub/Sub.
  3. Pub/Sub pushes the message to the Cloud Run service.
  4. The Cloud Run service retrieves the image file referenced in the Pub/Sub message.
  5. The Cloud Run service uses the Cloud Vision API to analyze the image.
  6. If violent or adult content is detected, the Cloud Run service uses ImageMagick to blur the image.
  7. The Cloud Run service uploads the blurred image to another Cloud Storage bucket for use.

Subsequent use of the blurred image is left as an exercise for the reader.

Create an Artifact Registry standard repository

Create an Artifact Registry standard repository to store your container image:

gcloud artifacts repositories create REPOSITORY \
    --repository-format=docker \
    --location=REGION

Replace:

  • REPOSITORY with a unique name for the repository.
  • REGION with the Google Cloud region to be used for the Artifact Registry repository.
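Optionally, you can confirm that the repository exists before you push images to it. This check is not part of the original tutorial, but it is a quick way to catch a typo in the repository name or location:

# Describe the repository you just created; an error here means the name or
# location does not match what you passed to the create command.
gcloud artifacts repositories describe REPOSITORY --location=REGION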

Set up Cloud Storage buckets

gcloud

  1. Create a Cloud Storage bucket for uploading images, where INPUT_BUCKET_NAME is a globally unique bucket name:

    gcloud storage buckets create gs://INPUT_BUCKET_NAME

    The Cloud Run service only reads from this bucket.

  2. Create a second Cloud Storage bucket to receive blurred images, where BLURRED_BUCKET_NAME is a globally unique bucket name:

    gcloud storage buckets create gs://BLURRED_BUCKET_NAME

    The Cloud Run service uploads blurred images to this bucket. Using a separate bucket prevents processed images from re-triggering the service.

    By default, Cloud Run revisions execute as the Compute Engine default service account.

    If, instead, you are using a user-managed service account, ensure that you have assigned the required IAM roles so that it has storage.objects.get permission for reading from INPUT_BUCKET_NAME and storage.objects.create permission for uploading to BLURRED_BUCKET_NAME. One way to grant those permissions is sketched after this step.
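
    The following commands are an illustrative sketch, not part of the original tutorial: SA_EMAIL is a placeholder for your user-managed service account's email address, and the roles shown (roles/storage.objectViewer and roles/storage.objectCreator) are predefined roles that contain the storage.objects.get and storage.objects.create permissions, respectively.

    # Grant read access on the input bucket (SA_EMAIL is a placeholder).
    gcloud storage buckets add-iam-policy-binding gs://INPUT_BUCKET_NAME \
        --member=serviceAccount:SA_EMAIL \
        --role=roles/storage.objectViewer

    # Grant write access on the output bucket.
    gcloud storage buckets add-iam-policy-binding gs://BLURRED_BUCKET_NAME \
        --member=serviceAccount:SA_EMAIL \
        --role=roles/storage.objectCreator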

Terraform

To learn how to apply or remove a Terraform configuration, seeBasic Terraform commands.

Create two Cloud Storage buckets: one for uploading original imagesand another for the Cloud Run service to upload blurred images.

To create both Cloud Storage buckets with globally unique names, add the following to your existing main.tf file:

resource"random_id""bucket_suffix"{byte_length=8}resource"google_storage_bucket""imageproc_input"{name="input-bucket-${random_id.bucket_suffix.hex}"location="us-central1"}output"input_bucket_name"{value=google_storage_bucket.imageproc_input.name}resource"google_storage_bucket""imageproc_output"{name="output-bucket-${random_id.bucket_suffix.hex}"location="us-central1"}output"blurred_bucket_name"{value=google_storage_bucket.imageproc_output.name}

By default, Cloud Run revisions execute as the Compute Engine default service account.

If, instead, you are using a user-managed service account, ensure that you have assigned the required IAM roles so that it has storage.objects.get permission for reading from google_storage_bucket.imageproc_input and storage.objects.create permission for uploading to google_storage_bucket.imageproc_output.

In the following steps, you create and deploy a service that processes notification of file uploads to the INPUT_BUCKET_NAME. You turn on notification delivery after you deploy and test the service, to avoid premature invocation of the new service.

Modify the Pub/Sub tutorial sample code

This tutorial builds on the code assembled in the Use Pub/Sub tutorial. If you have not yet completed that tutorial, do so now, skipping the cleanup steps, then return here to add image processing behavior.

Add image processing code

The image processing code is separated from request handling for readability and ease of testing. To add image processing code:

  1. Change to the directory of the Pub/Sub tutorial sample code.

  2. Add code to import the image processing dependencies, including libraries to integrate with Google Cloud services, ImageMagick, and the file system.

    Node.js

    Open a new image.js file in your editor, and copy in the following:
    const gm = require('gm').subClass({imageMagick: true});
    const fs = require('fs');
    const {promisify} = require('util');
    const path = require('path');

    const vision = require('@google-cloud/vision');
    const {Storage} = require('@google-cloud/storage');
    const storage = new Storage();
    const client = new vision.ImageAnnotatorClient();

    const {BLURRED_BUCKET_NAME} = process.env;

    Python

    Open a new image.py file in your editor, and copy in the following:
    import os
    import tempfile

    from google.cloud import storage, vision
    from wand.image import Image

    storage_client = storage.Client()
    vision_client = vision.ImageAnnotatorClient()

    Go

    Open a new imagemagick/imagemagick.go file in your editor, and copy in the following:
    // Package imagemagick contains an example of using ImageMagick to process a
    // file uploaded to Cloud Storage.
    package imagemagick

    import (
      "context"
      "errors"
      "fmt"
      "log"
      "os"
      "os/exec"

      "cloud.google.com/go/storage"
      vision "cloud.google.com/go/vision/apiv1"
      "cloud.google.com/go/vision/v2/apiv1/visionpb"
    )

    // Global API clients used across function invocations.
    var (
      storageClient *storage.Client
      visionClient  *vision.ImageAnnotatorClient
    )

    func init() {
      // Declare a separate err variable to avoid shadowing the client variables.
      var err error

      storageClient, err = storage.NewClient(context.Background())
      if err != nil {
        log.Fatalf("storage.NewClient: %v", err)
      }

      visionClient, err = vision.NewImageAnnotatorClient(context.Background())
      if err != nil {
        log.Fatalf("vision.NewAnnotatorClient: %v", err)
      }
    }

    Java

    Open a new src/main/java/com/example/cloudrun/ImageMagick.java file in your editor, and copy in the following:
    import com.google.cloud.storage.Blob;
    import com.google.cloud.storage.BlobId;
    import com.google.cloud.storage.BlobInfo;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageOptions;
    import com.google.cloud.vision.v1.AnnotateImageRequest;
    import com.google.cloud.vision.v1.AnnotateImageResponse;
    import com.google.cloud.vision.v1.BatchAnnotateImagesResponse;
    import com.google.cloud.vision.v1.Feature;
    import com.google.cloud.vision.v1.Feature.Type;
    import com.google.cloud.vision.v1.Image;
    import com.google.cloud.vision.v1.ImageAnnotatorClient;
    import com.google.cloud.vision.v1.ImageSource;
    import com.google.cloud.vision.v1.SafeSearchAnnotation;
    import com.google.gson.JsonObject;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.ArrayList;
    import java.util.List;

    public class ImageMagick {

      private static final String BLURRED_BUCKET_NAME = System.getenv("BLURRED_BUCKET_NAME");
      private static Storage storage = StorageOptions.getDefaultInstance().getService();

  3. Add code that receives a Pub/Sub message as an event object and controls the image processing.

    The event contains data about the originally uploaded image. This code determines whether the image needs to be blurred by checking the results of a Cloud Vision analysis for violent or adult content.

    Node.js

    // Blurs uploaded images that are flagged as Adult or Violence.
    exports.blurOffensiveImages = async event => {
      // This event represents the triggering Cloud Storage object.
      const object = event;

      const file = storage.bucket(object.bucket).file(object.name);
      const filePath = `gs://${object.bucket}/${object.name}`;

      console.log(`Analyzing ${file.name}.`);

      try {
        const [result] = await client.safeSearchDetection(filePath);
        const detections = result.safeSearchAnnotation || {};

        if (
          // Levels are defined in https://cloud.google.com/vision/docs/reference/rest/v1/AnnotateImageResponse#likelihood
          detections.adult === 'VERY_LIKELY' ||
          detections.violence === 'VERY_LIKELY'
        ) {
          console.log(`Detected ${file.name} as inappropriate.`);
          return blurImage(file, BLURRED_BUCKET_NAME);
        } else {
          console.log(`Detected ${file.name} as OK.`);
        }
      } catch (err) {
        console.error(`Failed to analyze ${file.name}.`, err);
        throw err;
      }
    };

    Python

    def blur_offensive_images(data):
        """Blurs uploaded images that are flagged as Adult or Violence.

        Args:
            data: Pub/Sub message data
        """
        file_data = data

        file_name = file_data["name"]
        bucket_name = file_data["bucket"]

        blob = storage_client.bucket(bucket_name).get_blob(file_name)
        blob_uri = f"gs://{bucket_name}/{file_name}"
        blob_source = vision.Image(source=vision.ImageSource(image_uri=blob_uri))

        # Ignore already-blurred files
        if file_name.startswith("blurred-"):
            print(f"The image {file_name} is already blurred.")
            return

        print(f"Analyzing {file_name}.")

        result = vision_client.safe_search_detection(image=blob_source)
        detected = result.safe_search_annotation

        # Process image
        if detected.adult == 5 or detected.violence == 5:
            print(f"The image {file_name} was detected as inappropriate.")
            return __blur_image(blob)
        else:
            print(f"The image {file_name} was detected as OK.")

    Go

    // GCSEvent is the payload of a GCS event.
    type GCSEvent struct {
      Bucket string `json:"bucket"`
      Name   string `json:"name"`
    }

    // BlurOffensiveImages blurs offensive images uploaded to GCS.
    func BlurOffensiveImages(ctx context.Context, e GCSEvent) error {
      outputBucket := os.Getenv("BLURRED_BUCKET_NAME")
      if outputBucket == "" {
        return errors.New("BLURRED_BUCKET_NAME must be set")
      }

      img := vision.NewImageFromURI(fmt.Sprintf("gs://%s/%s", e.Bucket, e.Name))

      resp, err := visionClient.DetectSafeSearch(ctx, img, nil)
      if err != nil {
        return fmt.Errorf("AnnotateImage: %w", err)
      }

      if resp.GetAdult() == visionpb.Likelihood_VERY_LIKELY ||
        resp.GetViolence() == visionpb.Likelihood_VERY_LIKELY {
        return blur(ctx, e.Bucket, outputBucket, e.Name)
      }
      log.Printf("The image %q was detected as OK.", e.Name)
      return nil
    }

    Java

    // Blurs uploaded images that are flagged as Adult or Violence.
    public static void blurOffensiveImages(JsonObject data) {
      String fileName = data.get("name").getAsString();
      String bucketName = data.get("bucket").getAsString();
      BlobInfo blobInfo = BlobInfo.newBuilder(bucketName, fileName).build();
      // Construct URI to GCS bucket and file.
      String gcsPath = String.format("gs://%s/%s", bucketName, fileName);
      System.out.println(String.format("Analyzing %s", fileName));

      // Construct request.
      List<AnnotateImageRequest> requests = new ArrayList<>();
      ImageSource imgSource = ImageSource.newBuilder().setImageUri(gcsPath).build();
      Image img = Image.newBuilder().setSource(imgSource).build();
      Feature feature = Feature.newBuilder().setType(Type.SAFE_SEARCH_DETECTION).build();
      AnnotateImageRequest request =
          AnnotateImageRequest.newBuilder().addFeatures(feature).setImage(img).build();
      requests.add(request);

      // Send request to the Vision API.
      try (ImageAnnotatorClient client = ImageAnnotatorClient.create()) {
        BatchAnnotateImagesResponse response = client.batchAnnotateImages(requests);
        List<AnnotateImageResponse> responses = response.getResponsesList();
        for (AnnotateImageResponse res : responses) {
          if (res.hasError()) {
            System.out.println(String.format("Error: %s\n", res.getError().getMessage()));
            return;
          }
          // Get Safe Search Annotations
          SafeSearchAnnotation annotation = res.getSafeSearchAnnotation();
          if (annotation.getAdultValue() == 5 || annotation.getViolenceValue() == 5) {
            System.out.println(String.format("Detected %s as inappropriate.", fileName));
            blur(blobInfo);
          } else {
            System.out.println(String.format("Detected %s as OK.", fileName));
          }
        }
      } catch (Exception e) {
        System.out.println(String.format("Error with Vision API: %s", e.getMessage()));
      }
    }

  4. Retrieve the referenced image from the Cloud Storage input bucket created above, use ImageMagick to transform the image with a blur effect, and upload the result to the output bucket.

    Node.js

    // Blurs the given file using ImageMagick, and uploads it to another bucket.
    const blurImage = async (file, blurredBucketName) => {
      const tempLocalPath = `/tmp/${path.parse(file.name).base}`;

      // Download file from bucket.
      try {
        await file.download({destination: tempLocalPath});

        console.log(`Downloaded ${file.name} to ${tempLocalPath}.`);
      } catch (err) {
        throw new Error(`File download failed: ${err}`);
      }

      await new Promise((resolve, reject) => {
        gm(tempLocalPath)
          .blur(0, 16)
          .write(tempLocalPath, (err, stdout) => {
            if (err) {
              console.error('Failed to blur image.', err);
              reject(err);
            } else {
              console.log(`Blurred image: ${file.name}`);
              resolve(stdout);
            }
          });
      });

      // Upload result to a different bucket, to avoid re-triggering this function.
      const blurredBucket = storage.bucket(blurredBucketName);

      // Upload the Blurred image back into the bucket.
      const gcsPath = `gs://${blurredBucketName}/${file.name}`;
      try {
        await blurredBucket.upload(tempLocalPath, {destination: file.name});
        console.log(`Uploaded blurred image to: ${gcsPath}`);
      } catch (err) {
        throw new Error(`Unable to upload blurred image to ${gcsPath}: ${err}`);
      }

      // Delete the temporary file.
      const unlink = promisify(fs.unlink);
      return unlink(tempLocalPath);
    };

    Python

    def __blur_image(current_blob):
        """Blurs the given file using ImageMagick.

        Args:
            current_blob: a Cloud Storage blob
        """
        file_name = current_blob.name
        _, temp_local_filename = tempfile.mkstemp()

        # Download file from bucket.
        current_blob.download_to_filename(temp_local_filename)
        print(f"Image {file_name} was downloaded to {temp_local_filename}.")

        # Blur the image using ImageMagick.
        with Image(filename=temp_local_filename) as image:
            image.resize(*image.size, blur=16, filter="hamming")
            image.save(filename=temp_local_filename)

        print(f"Image {file_name} was blurred.")

        # Upload result to a second bucket, to avoid re-triggering the function.
        # You could instead re-upload it to the same bucket + tell your function
        # to ignore files marked as blurred (e.g. those with a "blurred" prefix)
        blur_bucket_name = os.getenv("BLURRED_BUCKET_NAME")
        blur_bucket = storage_client.bucket(blur_bucket_name)
        new_blob = blur_bucket.blob(file_name)
        new_blob.upload_from_filename(temp_local_filename)
        print(f"Blurred image uploaded to: gs://{blur_bucket_name}/{file_name}")

        # Delete the temporary file.
        os.remove(temp_local_filename)

    Go

    // blur blurs the image stored at gs://inputBucket/name and stores the result in
    // gs://outputBucket/name.
    func blur(ctx context.Context, inputBucket, outputBucket, name string) error {
      inputBlob := storageClient.Bucket(inputBucket).Object(name)
      r, err := inputBlob.NewReader(ctx)
      if err != nil {
        return fmt.Errorf("NewReader: %w", err)
      }

      outputBlob := storageClient.Bucket(outputBucket).Object(name)
      w := outputBlob.NewWriter(ctx)
      defer w.Close()

      // Use - as input and output to use stdin and stdout.
      cmd := exec.Command("convert", "-", "-blur", "0x8", "-")
      cmd.Stdin = r
      cmd.Stdout = w

      if err := cmd.Run(); err != nil {
        return fmt.Errorf("cmd.Run: %w", err)
      }

      log.Printf("Blurred image uploaded to gs://%s/%s", outputBlob.BucketName(), outputBlob.ObjectName())

      return nil
    }

    Java

    // Blurs the file described by blobInfo using ImageMagick,
    // and uploads it to the blurred bucket.
    public static void blur(BlobInfo blobInfo) throws IOException {
      String bucketName = blobInfo.getBucket();
      String fileName = blobInfo.getName();
      // Download image
      Blob blob = storage.get(BlobId.of(bucketName, fileName));
      Path download = Paths.get("/tmp/", fileName);
      blob.downloadTo(download);

      // Construct the command.
      List<String> args = new ArrayList<>();
      args.add("convert");
      args.add(download.toString());
      args.add("-blur");
      args.add("0x8");
      Path upload = Paths.get("/tmp/", "blurred-" + fileName);
      args.add(upload.toString());
      try {
        ProcessBuilder pb = new ProcessBuilder(args);
        Process process = pb.start();
        process.waitFor();
      } catch (Exception e) {
        System.out.println(String.format("Error: %s", e.getMessage()));
      }

      // Upload image to blurred bucket.
      BlobId blurredBlobId = BlobId.of(BLURRED_BUCKET_NAME, fileName);
      BlobInfo blurredBlobInfo =
          BlobInfo.newBuilder(blurredBlobId).setContentType(blob.getContentType()).build();
      try {
        byte[] blurredFile = Files.readAllBytes(upload);
        Blob blurredBlob = storage.create(blurredBlobInfo, blurredFile);
        System.out.println(
            String.format("Blurred image uploaded to: gs://%s/%s", BLURRED_BUCKET_NAME, fileName));
      } catch (Exception e) {
        System.out.println(String.format("Error in upload: %s", e.getMessage()));
      }

      // Remove images from fileSystem
      Files.delete(download);
      Files.delete(upload);
    }
    }

Integrate image processing into the Pub/Sub sample code

To modify the existing service to incorporate the image processing code:

  1. Add new dependencies for your service, including the Cloud Vision and Cloud Storage client libraries:

    Node.js

    npm install gm @google-cloud/storage @google-cloud/vision

    Python

    Add the necessary client libraries so that your requirements.txt will look something like this:
    Flask==3.0.3
    google-cloud-storage==2.12.0
    google-cloud-vision==3.8.1
    gunicorn==23.0.0
    Wand==0.6.13
    Werkzeug==3.0.3

    Go

    The Go sample application uses Go modules. The new dependencies added above in the imagemagick/imagemagick.go import statement are downloaded automatically by the next command that needs them.

    Java

    Add the following dependency under <dependencyManagement> in the pom.xml:
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>spring-cloud-gcp-dependencies</artifactId>
      <version>4.9.2</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
    Add the following dependencies under <dependencies> in the pom.xml:
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <scope>compile</scope>
    </dependency>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>spring-cloud-gcp-starter-vision</artifactId>
    </dependency>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>spring-cloud-gcp-starter-storage</artifactId>
    </dependency>

  2. Add the ImageMagick system package to your container by modifying the Dockerfile below the FROM statement. If using a "multi-stage" Dockerfile, place this in the final stage.

    Debian/Ubuntu
    # Install Imagemagick into the container image.
    # For more on system packages review the system packages tutorial.
    # https://cloud.google.com/run/docs/tutorials/system-packages#dockerfile
    RUN set -ex; \
      apt-get -y update; \
      apt-get -y install imagemagick; \
      rm -rf /var/lib/apt/lists/*
    Alpine
    # Install Imagemagick into the container image.
    # For more on system packages review the system packages tutorial.
    # https://cloud.google.com/run/docs/tutorials/system-packages#dockerfile
    RUN apk add --no-cache imagemagick

    Read more about working with system packages in your Cloud Run service in the Using system packages tutorial.

  3. Replace the existing Pub/Sub message handling code with a function call to our new blurring logic.

    Node.js

    The app.js file defines the Express.js app and prepares received Pub/Sub messages for use. Make the following changes:

    • Add code to import the new image.js file
    • Remove the existing "Hello World" code from the route
    • Add code to further validate the Pub/Sub message
    • Add code to call the new image processing function

      When you are finished, the code will look like this:

    const express = require('express');
    const app = express();

    // This middleware is available in Express v4.16.0 onwards
    app.use(express.json());

    const image = require('./image');

    app.post('/', async (req, res) => {
      if (!req.body) {
        const msg = 'no Pub/Sub message received';
        console.error(`error: ${msg}`);
        res.status(400).send(`Bad Request: ${msg}`);
        return;
      }
      if (!req.body.message || !req.body.message.data) {
        const msg = 'invalid Pub/Sub message format';
        console.error(`error: ${msg}`);
        res.status(400).send(`Bad Request: ${msg}`);
        return;
      }

      // Decode the Pub/Sub message.
      const pubSubMessage = req.body.message;
      let data;
      try {
        data = Buffer.from(pubSubMessage.data, 'base64').toString().trim();
        data = JSON.parse(data);
      } catch (err) {
        const msg =
          'Invalid Pub/Sub message: data property is not valid base64 encoded JSON';
        console.error(`error: ${msg}: ${err}`);
        res.status(400).send(`Bad Request: ${msg}`);
        return;
      }

      // Validate the message is a Cloud Storage event.
      if (!data.name || !data.bucket) {
        const msg =
          'invalid Cloud Storage notification: expected name and bucket properties';
        console.error(`error: ${msg}`);
        res.status(400).send(`Bad Request: ${msg}`);
        return;
      }

      try {
        await image.blurOffensiveImages(data);
        res.status(204).send();
      } catch (err) {
        console.error(`error: Blurring image: ${err}`);
        res.status(500).send();
      }
    });

    Python

    The main.py file defines the Flask app and prepares received Pub/Sub messages for use. Make the following changes:

    • Add code to import the new image.py file
    • Remove the existing "Hello World" code from the route
    • Add code to further validate the Pub/Sub message
    • Add code to call the new image processing function

      When you are finished, the code will look like this:

    import base64
    import json
    import os

    from flask import Flask, request

    import image

    app = Flask(__name__)


    @app.route("/", methods=["POST"])
    def index():
        """Receive and parse Pub/Sub messages containing Cloud Storage event data."""
        envelope = request.get_json()
        if not envelope:
            msg = "no Pub/Sub message received"
            print(f"error: {msg}")
            return f"Bad Request: {msg}", 400

        if not isinstance(envelope, dict) or "message" not in envelope:
            msg = "invalid Pub/Sub message format"
            print(f"error: {msg}")
            return f"Bad Request: {msg}", 400

        # Decode the Pub/Sub message.
        pubsub_message = envelope["message"]

        if isinstance(pubsub_message, dict) and "data" in pubsub_message:
            try:
                data = json.loads(base64.b64decode(pubsub_message["data"]).decode())

            except Exception as e:
                msg = (
                    "Invalid Pub/Sub message: "
                    "data property is not valid base64 encoded JSON"
                )
                print(f"error: {e}")
                return f"Bad Request: {msg}", 400

            # Validate the message is a Cloud Storage event.
            if not data["name"] or not data["bucket"]:
                msg = (
                    "Invalid Cloud Storage notification: "
                    "expected name and bucket properties"
                )
                print(f"error: {msg}")
                return f"Bad Request: {msg}", 400

            try:
                image.blur_offensive_images(data)
                return ("", 204)

            except Exception as e:
                print(f"error: {e}")
                return ("", 500)

        return ("", 500)

    Go

    The main.go file defines the HTTP service and prepares received Pub/Sub messages for use. Make the following changes:

    • Add code to import the new imagemagick.go file
    • Remove the existing "Hello World" code from the handler
    • Add code to further validate the Pub/Sub message
    • Add code to call the new image processing function

    // Sample image-processing is a Cloud Run service which performs asynchronous processing on images.
    package main

    import (
      "encoding/json"
      "io"
      "log"
      "net/http"
      "os"

      "github.com/GoogleCloudPlatform/golang-samples/run/image-processing/imagemagick"
    )

    func main() {
      http.HandleFunc("/", HelloPubSub)
      // Determine port for HTTP service.
      port := os.Getenv("PORT")
      if port == "" {
        port = "8080"
      }
      // Start HTTP server.
      log.Printf("Listening on port %s", port)
      if err := http.ListenAndServe(":"+port, nil); err != nil {
        log.Fatal(err)
      }
    }

    // PubSubMessage is the payload of a Pub/Sub event.
    // See the documentation for more details:
    // https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage
    type PubSubMessage struct {
      Message struct {
        Data []byte `json:"data,omitempty"`
        ID   string `json:"id"`
      } `json:"message"`
      Subscription string `json:"subscription"`
    }

    // HelloPubSub receives and processes a Pub/Sub push message.
    func HelloPubSub(w http.ResponseWriter, r *http.Request) {
      var m PubSubMessage
      body, err := io.ReadAll(r.Body)
      if err != nil {
        log.Printf("ioutil.ReadAll: %v", err)
        http.Error(w, "Bad Request", http.StatusBadRequest)
        return
      }
      if err := json.Unmarshal(body, &m); err != nil {
        log.Printf("json.Unmarshal: %v", err)
        http.Error(w, "Bad Request", http.StatusBadRequest)
        return
      }

      var e imagemagick.GCSEvent
      if err := json.Unmarshal(m.Message.Data, &e); err != nil {
        log.Printf("json.Unmarshal: %v", err)
        http.Error(w, "Bad Request", http.StatusBadRequest)
        return
      }

      if e.Name == "" || e.Bucket == "" {
        log.Printf("invalid GCSEvent: expected name and bucket")
        http.Error(w, "Bad Request", http.StatusBadRequest)
        return
      }

      if err := imagemagick.BlurOffensiveImages(r.Context(), e); err != nil {
        log.Printf("imagemagick.BlurOffensiveImages: %v", err)
        http.Error(w, "Internal Server Error", http.StatusInternalServerError)
      }
    }

    Java

    The PubSubController.java file defines the controller that handles HTTP requests and prepares received Pub/Sub messages for use. Make the following changes:

    • Add the new imports
    • Remove the existing "Hello World" code from the controller
    • Add code to further validate the Pub/Sub message
    • Add code to call the new image processing function

    import com.google.gson.JsonObject;
    import com.google.gson.JsonParser;
    import java.util.Base64;
    import org.springframework.http.HttpStatus;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.RequestBody;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RequestMethod;
    import org.springframework.web.bind.annotation.RestController;

    // PubsubController consumes a Pub/Sub message.
    @RestController
    public class PubSubController {
      @RequestMapping(value = "/", method = RequestMethod.POST)
      public ResponseEntity<String> receiveMessage(@RequestBody Body body) {
        // Get PubSub message from request body.
        Body.Message message = body.getMessage();
        if (message == null) {
          String msg = "Bad Request: invalid Pub/Sub message format";
          System.out.println(msg);
          return new ResponseEntity<>(msg, HttpStatus.BAD_REQUEST);
        }
        // Decode the Pub/Sub message.
        String pubSubMessage = message.getData();
        JsonObject data;
        try {
          String decodedMessage = new String(Base64.getDecoder().decode(pubSubMessage));
          data = JsonParser.parseString(decodedMessage).getAsJsonObject();
        } catch (Exception e) {
          String msg = "Error: Invalid Pub/Sub message: data property is not valid base64 encoded JSON";
          System.out.println(msg);
          return new ResponseEntity<>(msg, HttpStatus.BAD_REQUEST);
        }
        // Validate the message is a Cloud Storage event.
        if (data.get("name") == null || data.get("bucket") == null) {
          String msg = "Error: Invalid Cloud Storage notification: expected name and bucket properties";
          System.out.println(msg);
          return new ResponseEntity<>(msg, HttpStatus.BAD_REQUEST);
        }
        try {
          ImageMagick.blurOffensiveImages(data);
        } catch (Exception e) {
          String msg = String.format("Error: Blurring image: %s", e.getMessage());
          System.out.println(msg);
          return new ResponseEntity<>(msg, HttpStatus.INTERNAL_SERVER_ERROR);
        }
        return new ResponseEntity<>(HttpStatus.OK);
      }
    }

Download the complete sample

To retrieve the complete Image Processing code sample for use:

  1. Clone the sample app repository to your local machine:

    Node.js

    git clone https://github.com/GoogleCloudPlatform/nodejs-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

    Python

    git clone https://github.com/GoogleCloudPlatform/python-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

    Go

    git clone https://github.com/GoogleCloudPlatform/golang-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

    Java

    git clone https://github.com/GoogleCloudPlatform/java-docs-samples.git

    Alternatively, you can download the sample as a zip file and extract it.

  2. Change to the directory that contains the Cloud Run sample code:

    Node.js

    cd nodejs-docs-samples/run/image-processing/

    Python

    cd python-docs-samples/run/image-processing/

    Go

    cd golang-samples/run/image-processing/

    Java

    cd java-docs-samples/run/image-processing/

Ship the code

Shipping code consists of three steps: building a container image with Cloud Build, uploading the container image to Artifact Registry, and deploying the container image to Cloud Run.

To ship your code:

  1. Build your container and publish on Artifact Registry:

    Node.js

    gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub

    Where pubsub is the name of your service.

    Replace:

    • PROJECT_ID with your Google Cloud project ID
    • REPOSITORY with the name of the Artifact Registry repository.
    • REGION with the Google Cloud region to be used for the Artifact Registry repository.

    Upon success, you will see a SUCCESS message containing the ID, creation time, and image name. The image is stored in Artifact Registry and can be re-used if required.

    Python

    gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub

    Where pubsub is the name of your service.

    Replace:

    • PROJECT_ID with your Google Cloud project ID
    • REPOSITORY with the name of the Artifact Registry repository.
    • REGION with the Google Cloud region to be used for the Artifact Registry repository.

    Upon success, you will see a SUCCESS message containing the ID, creation time, and image name. The image is stored in Artifact Registry and can be re-used if required.

    Go

    gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub

    Where pubsub is the name of your service.

    Replace:

    • PROJECT_ID with your Google Cloud project ID
    • REPOSITORY with the name of the Artifact Registry repository.
    • REGION with the Google Cloud region to be used for the Artifact Registry repository.

    Upon success, you will see a SUCCESS message containing the ID, creation time, and image name. The image is stored in Artifact Registry and can be re-used if required.

    Java

    This sample uses Jib to build Docker images using common Java tools. Jib optimizes container builds without the need for a Dockerfile or having Docker installed. Learn more about building Java containers with Jib.

    1. Using the Dockerfile, configure and build a base image with the system packages installed to override Jib's default base image:

      # Use eclipse-temurin for base image.
      # It's important to use JDK 8u191 or above that has container support enabled.
      # https://hub.docker.com/_/eclipse-temurin/
      # https://docs.docker.com/develop/develop-images/multistage-build/#use-multi-stage-builds
      FROM eclipse-temurin:17.0.16_8-jre

      # Install Imagemagick into the container image.
      # For more on system packages review the system packages tutorial.
      # https://cloud.google.com/run/docs/tutorials/system-packages#dockerfile
      RUN set -ex; \
        apt-get -y update; \
        apt-get -y install imagemagick; \
        rm -rf /var/lib/apt/lists/*

      gcloud builds submit --tag REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/imagemagick

      Replace:

      • PROJECT_ID with your Google Cloud project ID
      • REPOSITORY with the name of the Artifact Registry repository.
      • REGION with the Google Cloud region to be used for the Artifact Registry repository.
    2. Use the gcloud credential helper to authorize Docker to push to your Artifact Registry.

      gcloud auth configure-docker

    3. Build your final container with Jib and publish on Artifact Registry:

      <plugin>
        <groupId>com.google.cloud.tools</groupId>
        <artifactId>jib-maven-plugin</artifactId>
        <version>3.4.0</version>
        <configuration>
          <from>
            <image>gcr.io/PROJECT_ID/imagemagick</image>
          </from>
          <to>
            <image>gcr.io/PROJECT_ID/pubsub</image>
          </to>
        </configuration>
      </plugin>
      mvn compile jib:build \
          -Dimage=REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub \
          -Djib.from.image=REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/imagemagick

      Replace:

      • PROJECT_ID with your Google Cloud project ID
      • REPOSITORY with the name of the Artifact Registry repository.
      • REGION with the Google Cloud region to be used for the Artifact Registry repository.

  2. Run the following command to deploy your service, using the same service name you used in the Use Pub/Sub tutorial:

    Node.js

    gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --no-allow-unauthenticated

    Python

    gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --no-allow-unauthenticated

    Go

    gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --no-allow-unauthenticated

    Java

    gcloud run deploy pubsub-tutorial --image REGION-docker.pkg.dev/PROJECT_ID/REPOSITORY/pubsub --set-env-vars=BLURRED_BUCKET_NAME=BLURRED_BUCKET_NAME --memory 512M --no-allow-unauthenticated

    Where pubsub is the container name and pubsub-tutorial is the name of the service. Notice that the container image is deployed to the service and region (Cloud Run) that you configured previously under Setting up gcloud defaults. Replace:

    • PROJECT_ID with your Google Cloud project ID
    • REPOSITORY with the name of the Artifact Registry repository.
    • REGION with the Google Cloud region to be used for the Artifact Registry repository.
    • BLURRED_BUCKET_NAME with the Cloud Storage bucket you created earlier to receive blurred images; this sets the environment variable.

    The --no-allow-unauthenticated flag restricts unauthenticated access to the service. By keeping the service private, you can rely on Cloud Run's automatic Pub/Sub integration to authenticate requests. See Integrating with Pub/Sub for more details on how this is configured. See Managing Access for more details on IAM-based authentication.

    Wait until the deployment is complete: this can take about half a minute. On success, the command line displays the service URL.

Turn on notifications from Cloud Storage

Configure Cloud Storage to publish a message to a Pub/Sub topic whenever a file (known as an object) is uploaded or changed. Send the notification to the previously created topic so that any new file upload invokes the service.

gcloud

gcloud storage service-agent --project=PROJECT_ID

gcloud storage buckets notifications create gs://INPUT_BUCKET_NAME --topic=myRunTopic --payload-format=json

myRunTopic is the topic you created in the previous tutorial.

Replace INPUT_BUCKET_NAME with the name you used when you created the buckets.

For more details about storage bucket notifications, read object change notifications.

Terraform

To learn how to apply or remove a Terraform configuration, seeBasic Terraform commands.

In order to enable notifications, the Cloud Storage service account unique to the project must exist and have the IAM permission pubsub.publisher on the Pub/Sub topic. To grant this permission and create a Cloud Storage notification, add the following to your existing main.tf file:

data"google_storage_project_service_account""gcs_account"{}resource"google_pubsub_topic_iam_binding""binding"{topic=google_pubsub_topic.default.namerole="roles/pubsub.publisher"members=["serviceAccount:${data.google_storage_project_service_account.gcs_account.email_address}"]}resource"google_storage_notification""notification"{bucket=google_storage_bucket.imageproc_input.namepayload_format="JSON_API_V1"topic=google_pubsub_topic.default.iddepends_on=[google_pubsub_topic_iam_binding.binding]}

Try it out

  1. Upload an offensive image, such as this image of a flesh-eating zombie:

    curl -o zombie.jpg https://cdn.pixabay.com/photo/2015/09/21/14/24/zombie-949916_960_720.jpg
    gcloud storage cp zombie.jpg gs://INPUT_BUCKET_NAME

    where INPUT_BUCKET_NAME is the Cloud Storage bucket you created earlier for uploading images.

  2. Navigate to the service logs:

    1. Navigate to the Cloud Run page in the Google Cloud console.
    2. Click the pubsub-tutorial service.
    3. Select the Logs tab. Logs might take a few moments to appear. If you don't see them immediately, check again after a few moments.
  3. Look for the Blurred image: zombie.jpg message.

  4. You can view the blurred images in the BLURRED_BUCKET_NAME Cloud Storage bucket you created earlier: locate the bucket in the Cloud Storage page in the Google Cloud console. You can also check from the command line, as sketched after this list.

    Success: You deployed a Cloud Run service that uses the Cloud Vision API and ImageMagick to detect and blur offensive images uploaded to a Cloud Storage bucket.
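
If you prefer to verify from the command line instead of the console, the following check is an optional addition (not part of the original tutorial steps) that lists the contents of the output bucket:

# List the objects in the bucket that receives blurred images; the uploaded
# test file should appear here once processing has finished.
gcloud storage ls gs://BLURRED_BUCKET_NAME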

Clean up

To avoid additional charges to your Google Cloud account, delete all the resources you deployed with this tutorial.

Delete the project

If you created a new project for this tutorial, delete the project. If you used an existing project and need to keep it without the changes you added in this tutorial, delete resources that you created for the tutorial.

The easiest way to eliminate billing is to delete the project that you created for the tutorial.

To delete the project:

    Caution: Deleting a project has the following effects:
    • Everything in the project is deleted. If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.
    • Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project.

    If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects can help you avoid exceeding project quota limits.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then clickDelete.
  3. In the dialog, type the project ID, and then clickShut down to delete the project.

Delete tutorial resources

  1. Delete the Cloud Run service you deployed in this tutorial. Cloud Run services don't incur costs until they receive requests.

    To delete your Cloud Run service, run the following command:

    gcloud run services delete SERVICE-NAME

    Replace SERVICE-NAME with the name of your service.

    You can also delete Cloud Run services from the Google Cloud console.

  2. Remove the gcloud default region configuration you added during tutorial setup:

    gcloud config unset run/region
  3. Remove the project configuration:

     gcloud config unset project
  4. Delete other Google Cloud resources created in this tutorial, such as the Cloud Storage buckets, the Pub/Sub topic and subscription, and the Artifact Registry repository. A sketch of the corresponding commands follows this list.
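
The exact set of resources depends on what you created; the commands below are an illustrative sketch rather than part of the original tutorial. The subscription name is whatever you chose in the Use Pub/Sub tutorial, so treat myRunSubscription as a placeholder.

# Delete the input and output buckets, including the objects in them.
gcloud storage rm --recursive gs://INPUT_BUCKET_NAME
gcloud storage rm --recursive gs://BLURRED_BUCKET_NAME

# Delete the Pub/Sub subscription and topic (the subscription name is a placeholder).
gcloud pubsub subscriptions delete myRunSubscription
gcloud pubsub topics delete myRunTopic

# Delete the Artifact Registry repository and the images it contains.
gcloud artifacts repositories delete REPOSITORY --location=REGION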

What's next

  • Learn more about persisting data with Cloud Run using Cloud Storage.
  • Understand how to use the Cloud Vision API to detect things besides explicit content.
  • Explore reference architectures, diagrams, and best practices about Google Cloud. Take a look at our Cloud Architecture Center.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-15 UTC.