Send Sensitive Data Protection inspection job results to Security Command Center

This guide walks you through inspecting data in Cloud Storage, Firestore in Datastore mode (Datastore), or BigQuery, and sending the inspection results to Security Command Center.

To use this feature, your project must belong to an organization, and Security Command Center must be activated at the organization level. Otherwise, Sensitive Data Protection findings won't appear in Security Command Center. For more information, see Check the activation level of Security Command Center.

For BigQuery data, you can additionally perform profiling, which is different from an inspection operation. You can also send data profiles to Security Command Center. For more information, see Publish data profiles to Security Command Center.

Overview

Security Command Center enables you to gather data about, identify, and act on security threats before they can cause business damage or loss. With Security Command Center, you can perform several security-related actions from a single centralized dashboard.

Sensitive Data Protection has built-in integration with Security Command Center. When you use a Sensitive Data Protection action to inspect your Google Cloud storage repositories for sensitive data, it can send results directly to the Security Command Center dashboard, where they appear alongside other security metrics.

By completing the steps in this guide, you do the following:

  • Enable Security Command Center and Sensitive Data Protection.
  • Set up Sensitive Data Protection to inspect a Google Cloud storage repository: either a Cloud Storage bucket, a BigQuery table, or a Datastore kind.
  • Configure a Sensitive Data Protection scan to send the inspection job results to Security Command Center.

For more information about Security Command Center, see the Security Command Center documentation.

If you want to send the results of discovery scans (not inspection jobs) to Security Command Center, see the documentation for profiling an organization, folder, or project instead.

Costs

In this document, you use the following billable components of Google Cloud:

  • Sensitive Data Protection
  • Cloud Storage
  • BigQuery
  • Datastore

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

Before you begin

Before you can send Sensitive Data Protection scan results to Security Command Center, you must do each of the following:

  • Step 1: Set up Google Cloud storage repositories.
  • Step 2: Set Identity and Access Management (IAM) roles.
  • Step 3: Enable Security Command Center.
  • Step 4: Enable Sensitive Data Protection.
  • Step 5: Enable Sensitive Data Protection as an integrated service for Security Command Center.

The steps to set up these components are described in the following sections.

Step 1: Set up Google Cloud storage repositories

Choose whether you want to scan your own Google Cloud storage repository or an example one. This topic provides instructions for both scenarios.

Scan your own data

If you want to scan your own existing Cloud Storage bucket, BigQuery table, or Datastore kind, first open the project that the repository is in. In subsequent steps, you'll enable both Security Command Center and Sensitive Data Protection for this project and its organization.

After you open the project you want to use, proceed to Step 2 to set up IAM roles.

Scan sample data

If you want to scan a test set of data, first make sure that you have a billing account set up, and then create a new project. To complete this step, you must have the IAM Project Creator role. Learn more about IAM roles.

  1. If you don't already have billing configured, set up a billing account.

    Learn how to enable billing

  2. Go to the New Project page in the Google Cloud console.

    Go to New Project

  3. On the Billing account drop-down list, select the billing account that the project should be billed to.
  4. On the Organization drop-down list, select the organization that you want to create the project in.
  5. On the Location drop-down list, select the organization or folder that you want to create the project in.

Next, download and store the sample data:

  1. Go to the Cloud Run functions tutorials repository on GitHub.
  2. Click Clone or download, and then click Download ZIP.
  3. Extract the zip file that you downloaded.
  4. Go to the Storage Browser page in the Google Cloud console.

    Go to Cloud Storage

  5. Click Create bucket.
  6. On the Create a bucket page, give the bucket a unique name, and then click Create.
  7. On the Bucket details page, click Upload folder.
  8. Go to the dlp-cloud-functions-tutorials-master folder that you extracted, open it, and then select the sample_data folder. Click Upload to upload the folder's contents to Cloud Storage.

Note the name that you gave the Cloud Storage bucket; you'll need it later. After the file upload completes, you're ready to continue.

Step 2: Set IAM roles

To use Sensitive Data Protection to send scan results to Security Command Center, you need the Security Center Admin and Sensitive Data Protection Jobs Editor IAM roles. This section describes how to add the roles. To complete this section, you must have the Organization Administrator IAM role.

  1. Go to the IAM page.

    Go to IAM

  2. On the View by principals tab, find your Google Account and click Edit principal.

    Note: If you can't find your account, make sure that you're looking at an organization and not a project. To select the correct organization, use the project selector drop-down list.

    If an alert message displays "You do not have sufficient permissions to view this page," make sure that you have the Organization Administrator IAM role. For more information, see the IAM documentation.

  3. Add the Security Center Admin and Sensitive Data Protection Jobs Editor roles:

    1. In the Edit access panel, click Add another role.
    2. In the Select a role list, search for Security Center Admin, and then select it.
    3. Click Add another role.
    4. In the Select a role list, search for DLP Jobs Editor, and then select it.
    5. Click Save.

You now have the Sensitive Data Protection Jobs Editor and Security Center Admin roles for your organization. These roles let you complete the tasks in the remainder of this topic.
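If you prefer to grant these roles from the command line instead of the console, you can use the gcloud CLI. The following is a minimal sketch, not part of the original procedure; ORGANIZATION_ID and USER_EMAIL are placeholders for your organization's numeric ID and the account that should receive the roles.

# Grant the Security Center Admin role at the organization level.
gcloud organizations add-iam-policy-binding ORGANIZATION_ID \
    --member="user:USER_EMAIL" \
    --role="roles/securitycenter.admin"

# Grant the Sensitive Data Protection (DLP) Jobs Editor role at the organization level.
gcloud organizations add-iam-policy-binding ORGANIZATION_ID \
    --member="user:USER_EMAIL" \
    --role="roles/dlp.jobsEditor"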

Step 3: Enable Security Command Center

  1. Go to the Security Command Center page in the Google Cloud console.

    Go to Security Command Center

  2. On the Organization drop-down list, select the organization for which you want to enable Security Command Center, and then click Select.

  3. On the Enable asset discovery page that appears, select All current and future projects, and then click Enable. A message should display that Security Command Center is beginning asset discovery.

After asset discovery is complete, Security Command Center displays your supported Google Cloud assets. Asset discovery might take a few minutes, and you might need to refresh the page to see the assets.

For more information about enabling Security Command Center, see the Security Command Center documentation.

Step 4: Enable Sensitive Data Protection

Enable Sensitive Data Protection for the project that you want to scan. The project must be within the same organization for which you've enabled Security Command Center. To enable Sensitive Data Protection by using the Google Cloud console:

  1. In the Google Cloud console, go to the Enable access to API page.

    Enable the API

  2. On the toolbar, select the project from Step 1 of this guide. The project must contain the Cloud Storage bucket, BigQuery table, or Datastore kind you want to scan.
  3. Click Next.
  4. Click Enable.

Sensitive Data Protection is now enabled for your project.
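If you prefer the command line, you can also enable the API with the gcloud CLI. This is a minimal sketch; PROJECT_ID is a placeholder for the project ID from Step 1.

# Enable the Sensitive Data Protection (Cloud DLP) API for the project.
gcloud services enable dlp.googleapis.com --project=PROJECT_ID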

Step 5: Enable Sensitive Data Protection as an integrated service for Security Command Center

To view Sensitive Data Protection scan findings in Security Command Center, enable Sensitive Data Protection as an integrated service. For more information, see Add a Google Cloud integrated service in the Security Command Center documentation.

Findings for Sensitive Data Protection are displayed on the Findings page in Security Command Center.

Configure and run a Sensitive Data Protection inspection scan

In this section, you configure and run a Sensitive Data Protection inspection job.

Note: These instructions use APIs Explorer to send requests that contain JSON to Sensitive Data Protection. This tool is for learning, demonstration, and experimentation purposes only, and you should never use APIs Explorer for production work. You can use gcloud, curl, or httplib2 to send these requests instead. You can also use one of the available Sensitive Data Protection client libraries to access the API using Java, Python, Go, and more.
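For example, if you choose curl instead of APIs Explorer, a request similar to the following sketch creates the same kind of inspection job. It assumes that you have saved one of the JSON request bodies shown later in this guide to a file named request.json (a hypothetical filename) and that the gcloud CLI is installed for authentication; PROJECT_ID is the ID of the project that contains the data to scan.

# Create a DLP inspection job by calling the dlpJobs.create REST method.
curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    -d @request.json \
    "https://dlp.googleapis.com/v2/projects/PROJECT_ID/dlpJobs"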

The inspection job that you configure here instructs Sensitive Data Protection to scan either the sample data stored in Cloud Storage or your own data stored in Cloud Storage, Datastore, or BigQuery. The job configuration that you specify is also where you instruct Sensitive Data Protection to save its scan results to Security Command Center.

Step 1: Note your project identifier

  1. Go to the Google Cloud console.

    Go to the Google Cloud console

  2. Click Select.
  3. On the Select from drop-down list, select the organization for which you enabled Security Command Center.
  4. Under ID, copy the project ID for the project that contains the data you want to scan.
  5. Under Name, click the project to select it.

Step 2: Open APIs Explorer and configure the job

  1. Go to APIs Explorer on the reference page for the dlpJobs.create method by clicking the following button:

    Open APIs Explorer

  2. In the parent box, enter the following, where PROJECT_ID is the project ID you noted in Step 1:
    projects/PROJECT_ID

Replace the contents of the Request body field with the following JSON for the kind of data that you want to use: sample data in a Cloud Storage bucket, or your own data stored in Cloud Storage, Datastore, or BigQuery.

Sample data

If you created a Cloud Storage bucket to store sample data, copy the following JSON and then paste it into the Request body field. Replace BUCKET_NAME with the name that you gave your Cloud Storage bucket:

{  "inspectJob":{    "storageConfig":{      "cloudStorageOptions":{        "fileSet":{          "url":"gs://BUCKET_NAME/**"        }      }    },    "inspectConfig":{      "infoTypes":[        {          "name":"EMAIL_ADDRESS"        },        {          "name":"PERSON_NAME"        },        {          "name": "LOCATION"        },        {          "name":"PHONE_NUMBER"        }      ],      "includeQuote":true,      "minLikelihood":"UNLIKELY",      "limits":{        "maxFindingsPerRequest":100      }    },    "actions":[      {        "publishSummaryToCscc":{        }      }    ]  }}

Cloud Storage data

To scan your own Cloud Storage bucket, copy the following JSON and paste it into the Request body field.

Replace PATH_NAME with the path to the location that you want to scan. To scan recursively, end the path with two asterisks, for example, gs://path_to_files/**. To scan a specific directory and no deeper, end the path with one asterisk, for example, gs://path_to_files/*.

{  "inspectJob":{    "storageConfig":{      "cloudStorageOptions":{        "fileSet":{          "url":"gs://PATH_NAME"        }      }    },    "inspectConfig":{      "infoTypes":[        {          "name":"EMAIL_ADDRESS"        },        {          "name":"PERSON_NAME"        },        {          "name": "LOCATION"        },        {          "name":"PHONE_NUMBER"        }      ],      "includeQuote":true,      "minLikelihood":"UNLIKELY",      "limits":{        "maxFindingsPerRequest":100      }    },    "actions":[      {        "publishSummaryToCscc":{        }      }    ]  }}

To learn more about the available scan options, see Inspecting storage and databases for sensitive data.

Datastore data

To scan your own data kept in Datastore, copy the following JSON and paste it into the Request body field.

Replace DATASTORE_KIND with the name of the Datastore kind. You can also replace NAMESPACE_ID and PROJECT_ID with the namespace and project identifiers, respectively, or you can remove the "partitionId" object entirely if you want.

{  "inspectJob":{    "storageConfig":{      "datastoreOptions":{        "kind":{          "name":"DATASTORE_KIND"        },        "partitionId":{          "namespaceId":"NAMESPACE_ID",          "projectId":"PROJECT_ID"        }      }    },    "inspectConfig":{      "infoTypes":[        {          "name":"EMAIL_ADDRESS"        },        {          "name":"PERSON_NAME"        },        {          "name": "LOCATION"        },        {          "name":"PHONE_NUMBER"        }      ],      "includeQuote":true,      "minLikelihood":"UNLIKELY",      "limits":{        "maxFindingsPerRequest":100      }    },    "actions":[      {        "publishSummaryToCscc":{        }      }    ]  }}

To learn more about the available scan options, see Inspecting storage and databases for sensitive data.

BigQuery data

To scan your own BigQuery table, copy the following JSON and paste it into the Request body field.

Replace PROJECT_ID, BIGQUERY_DATASET_NAME, and BIGQUERY_TABLE_NAME with the project ID and the BigQuery dataset and table names, respectively.

{  "inspectJob":  {    "storageConfig":    {      "bigQueryOptions":      {        "tableReference":        {          "projectId": "PROJECT_ID",          "datasetId": "BIGQUERY_DATASET_NAME",          "tableId": "BIGQUERY_TABLE_NAME"        }      }    },    "inspectConfig":    {      "infoTypes":      [        {          "name": "EMAIL_ADDRESS"        },        {          "name": "PERSON_NAME"        },        {          "name": "LOCATION"        },        {          "name": "PHONE_NUMBER"        }      ],      "includeQuote": true,      "minLikelihood": "UNLIKELY",      "limits":      {        "maxFindingsPerRequest": 100      }    },    "actions":    [      {        "publishSummaryToCscc":        {        }      }    ]  }}

To learn more about the available scan options, see Inspecting storage and databases for sensitive data.

Note: For a full list of information types that Sensitive Data Protection can scan for and detect, see the InfoTypes reference.

Step 3: Execute the request to start the inspection job

After you configure the job by following the preceding steps, click Execute to send the request. If the request is successful, a response appears below the request with a success code and a JSON object that indicates the status of the Sensitive Data Protection job that you created.

Important: Keep the page open or copy the response JSON so that you have it for the remainder of the tasks in this topic.

Check the status of the Sensitive Data Protection inspection scan

The response to your scan request includes the job ID of your inspection scan job as the "name" key, and the current state of the inspection job as the "state" key. Immediately after you submit the request, the job's state is "PENDING".
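For example, the response to a successful request looks similar to the following sketch. The project ID, job ID, and timestamp shown here are illustrative placeholders, and your response might include additional fields.

{
  "name": "projects/PROJECT_ID/dlpJobs/i-1234567890123456789",
  "type": "INSPECT_JOB",
  "state": "PENDING",
  "createTime": "2024-01-01T00:00:00.000Z"
}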

After you submit the scan request, the scan of your content begins immediately.

To check the status of the inspection job:

  1. Go to APIs Explorer on the reference page for the dlpJobs.get method by clicking the following button:

    Open APIs Explorer

  2. In the name box, type the name of the job from the JSON response to the scan request in the following form:
    projects/PROJECT_ID/dlpJobs/JOB_ID
    The job ID is in the form of i-1234567890123456789.
  3. To submit the request, click Execute.

If the response JSON object's "state" key indicates that the job is "DONE", then the inspection job has finished.
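You can also check the job state without APIs Explorer by calling the same dlpJobs.get method with curl. This is a minimal sketch; PROJECT_ID and JOB_ID are placeholders for your project ID and the job ID from the scan response.

# Retrieve the current status of the DLP inspection job.
curl \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://dlp.googleapis.com/v2/projects/PROJECT_ID/dlpJobs/JOB_ID"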

Note: You can instruct Sensitive Data Protection to send you a notification when a job is done. To learn more, see the Actions conceptual page.

To view the rest of the response JSON, scroll down the page. Under "result" > "infoTypeStats", each information type listed should have a corresponding "count". If not, make sure that you entered the JSON accurately, and that the path or location to your data is correct.
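As an illustration only (the info types and counts depend on your data), the relevant portion of a finished job's response might look similar to this:

{
  "name": "projects/PROJECT_ID/dlpJobs/i-1234567890123456789",
  "type": "INSPECT_JOB",
  "state": "DONE",
  "inspectDetails": {
    "result": {
      "processedBytes": "1094",
      "infoTypeStats": [
        {
          "infoType": { "name": "EMAIL_ADDRESS" },
          "count": "10"
        },
        {
          "infoType": { "name": "PERSON_NAME" },
          "count": "7"
        }
      ]
    }
  }
}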

After the inspection job is done, you can continue to the next section of this guideto view scan results in Security Command Center.

Code samples: inspect a Cloud Storage bucket

This example demonstrates how to use the DLP API to create an inspection job that inspects a Cloud Storage bucket and sends findings to Security Command Center.

C#

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

usingSystem.Collections.Generic;usingSystem.Linq;usingGoogle.Api.Gax.ResourceNames;usingGoogle.Cloud.Dlp.V2;usingstaticGoogle.Cloud.Dlp.V2.InspectConfig.Types;publicclassInspectStorageWithSCCIntegration{publicstaticDlpJobSendGcsData(stringprojectId,stringgcsPath,LikelihoodminLikelihood=Likelihood.Unlikely,IEnumerable<InfoType>infoTypes=null){// Instantiate the dlp client.vardlp=DlpServiceClient.Create();// Specify the GCS file to be inspected.varstorageConfig=newStorageConfig{CloudStorageOptions=newCloudStorageOptions{FileSet=newCloudStorageOptions.Types.FileSet{Url=gcsPath}}};// Specify the type of info to be inspected and construct the inspect config.varinspectConfig=newInspectConfig{InfoTypes={infoTypes??newInfoType[]{newInfoType{Name="EMAIL_ADDRESS"},newInfoType{Name="PERSON_NAME"},newInfoType{Name="LOCATION"},newInfoType{Name="PHONE_NUMBER"}}},IncludeQuote=true,MinLikelihood=minLikelihood,Limits=newFindingLimits{MaxFindingsPerRequest=100}};// Construct the SCC action which will be performed after inspecting the storage.varactions=newAction[]{newAction{PublishSummaryToCscc=newAction.Types.PublishSummaryToCscc()}};// Construct the inspect job config using storage config, inspect config and action.varinspectJob=newInspectJobConfig{StorageConfig=storageConfig,InspectConfig=inspectConfig,Actions={actions}};// Construct the request.varrequest=newCreateDlpJobRequest{ParentAsLocationName=newLocationName(projectId,"global"),InspectJob=inspectJob};// Call the API.DlpJobresponse=dlp.CreateDlpJob(request);returnresponse;}}

Go

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import("context""fmt""io"dlp"cloud.google.com/go/dlp/apiv2""cloud.google.com/go/dlp/apiv2/dlppb")// inspectGCSFileSendToScc inspects sensitive data in a Google Cloud Storage (GCS) file// and sends the inspection results to Google Cloud Security Command Center (SCC) for further analysis.funcinspectGCSFileSendToScc(wio.Writer,projectID,gcsPathstring)error{// projectID := "my-project-id"// gcsPath := "gs://" + "your-bucket-name" + "path/to/file.txt"ctx:=context.Background()// Initialize a client once and reuse it to send multiple requests. Clients// are safe to use across goroutines. When the client is no longer needed,// call the Close method to cleanup its resources.client,err:=dlp.NewClient(ctx)iferr!=nil{returnerr}// Closing the client safely cleans up background resources.deferclient.Close()// Specify the GCS file to be inspected.cloudStorageOptions:=&dlppb.CloudStorageOptions{FileSet:&dlppb.CloudStorageOptions_FileSet{Url:gcsPath,},}// storageCfg represents the configuration for data inspection in various storage types.storageConfig:=&dlppb.StorageConfig{Type:&dlppb.StorageConfig_CloudStorageOptions{CloudStorageOptions:cloudStorageOptions,},}// Specify the type of info the inspection will look for.// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info typesinfoTypes:=[]*dlppb.InfoType{{Name:"EMAIL_ADDRESS"},{Name:"PERSON_NAME"},{Name:"LOCATION"},{Name:"PHONE_NUMBER"},}// The minimum likelihood required before returning a match.minLikelihood:=dlppb.Likelihood_UNLIKELY// The maximum number of findings to report (0 = server maximum).findingLimits:=&dlppb.InspectConfig_FindingLimits{MaxFindingsPerItem:100,}inspectConfig:=&dlppb.InspectConfig{InfoTypes:infoTypes,MinLikelihood:minLikelihood,Limits:findingLimits,IncludeQuote:true,}// Specify the action that is triggered when the job completes.action:=&dlppb.Action{Action:&dlppb.Action_PublishSummaryToCscc_{PublishSummaryToCscc:&dlppb.Action_PublishSummaryToCscc{},},}// Configure the inspection job we want the service to perform.inspectJobConfig:=&dlppb.InspectJobConfig{StorageConfig:storageConfig,InspectConfig:inspectConfig,Actions:[]*dlppb.Action{action,},}// Create the request for the job configured above.req:=&dlppb.CreateDlpJobRequest{Parent:fmt.Sprintf("projects/%s/locations/global",projectID),Job:&dlppb.CreateDlpJobRequest_InspectJob{InspectJob:inspectJobConfig,},}// Send the request.resp,err:=client.CreateDlpJob(ctx,req)iferr!=nil{returnerr}// Print the result.fmt.Fprintf(w,"Job created successfully: %v",resp.Name)returnnil}

Java

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

importcom.google.cloud.dlp.v2.DlpServiceClient;importcom.google.privacy.dlp.v2.Action;importcom.google.privacy.dlp.v2.CloudStorageOptions;importcom.google.privacy.dlp.v2.CreateDlpJobRequest;importcom.google.privacy.dlp.v2.DlpJob;importcom.google.privacy.dlp.v2.InfoType;importcom.google.privacy.dlp.v2.InfoTypeStats;importcom.google.privacy.dlp.v2.InspectConfig;importcom.google.privacy.dlp.v2.InspectDataSourceDetails;importcom.google.privacy.dlp.v2.InspectJobConfig;importcom.google.privacy.dlp.v2.Likelihood;importcom.google.privacy.dlp.v2.LocationName;importcom.google.privacy.dlp.v2.StorageConfig;importjava.io.IOException;importjava.util.List;importjava.util.concurrent.TimeUnit;importjava.util.stream.Collectors;importjava.util.stream.Stream;publicclassInspectGcsFileSendToScc{privatestaticfinalintTIMEOUT_MINUTES=15;publicstaticvoidmain(String[]args)throwsException{// TODO(developer): Replace these variables before running the sample.// The Google Cloud project id to use as a parent resource.StringprojectId="your-project-id";// The name of the file in the Google Cloud Storage bucket.StringgcsPath="gs://"+"your-bucket-name"+"path/to/file.txt";createJobSendToScc(projectId,gcsPath);}// Creates a DLP Job to scan the sample data stored in a Cloud Storage and save its scan results// to Security Command Center.publicstaticvoidcreateJobSendToScc(StringprojectId,StringgcsPath)throwsIOException,InterruptedException{// Initialize client that will be used to send requests. This client only needs to be created// once, and can be reused for multiple requests. After completing all of your requests, call// the "close" method on the client to safely clean up any remaining background resources.try(DlpServiceClientdlpServiceClient=DlpServiceClient.create()){// Specify the GCS file to be inspected.CloudStorageOptionscloudStorageOptions=CloudStorageOptions.newBuilder().setFileSet(CloudStorageOptions.FileSet.newBuilder().setUrl(gcsPath)).build();StorageConfigstorageConfig=StorageConfig.newBuilder().setCloudStorageOptions(cloudStorageOptions).build();// Specify the type of info the inspection will look for.// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info typesList<InfoType>infoTypes=Stream.of("EMAIL_ADDRESS","PERSON_NAME","LOCATION","PHONE_NUMBER").map(it->InfoType.newBuilder().setName(it).build()).collect(Collectors.toList());// The minimum likelihood required before returning a match.// See: https://cloud.google.com/dlp/docs/likelihoodLikelihoodminLikelihood=Likelihood.UNLIKELY;// The maximum number of findings to report (0 = server maximum)InspectConfig.FindingLimitsfindingLimits=InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();InspectConfiginspectConfig=InspectConfig.newBuilder().addAllInfoTypes(infoTypes).setIncludeQuote(true).setMinLikelihood(minLikelihood).setLimits(findingLimits).build();// Specify the action that is triggered when the job completes.Action.PublishSummaryToCsccpublishSummaryToCscc=Action.PublishSummaryToCscc.getDefaultInstance();Actionaction=Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();// Configure the inspection job we want the service to perform.InspectJobConfiginspectJobConfig=InspectJobConfig.newBuilder().setInspectConfig(inspectConfig).setStorageConfig(storageConfig).addActions(action).build();// Construct the job creation request to be sent by the 
client.CreateDlpJobRequestcreateDlpJobRequest=CreateDlpJobRequest.newBuilder().setParent(LocationName.of(projectId,"global").toString()).setInspectJob(inspectJobConfig).build();// Send the job creation request and process the response.DlpJobresponse=dlpServiceClient.createDlpJob(createDlpJobRequest);// Get the current time.longstartTime=System.currentTimeMillis();// Check if the job state is DONE.while(response.getState()!=DlpJob.JobState.DONE){// Sleep for 30 second.Thread.sleep(30000);// Get the updated job status.response=dlpServiceClient.getDlpJob(response.getName());// Check if the timeout duration has exceeded.longelapsedTime=System.currentTimeMillis()-startTime;if(TimeUnit.MILLISECONDS.toMinutes(elapsedTime)>=TIMEOUT_MINUTES){System.out.printf("Job did not complete within %d minutes.%n",TIMEOUT_MINUTES);break;}}// Print the results.System.out.println("Job status: "+response.getState());System.out.println("Job name: "+response.getName());InspectDataSourceDetails.Resultresult=response.getInspectDetails().getResult();System.out.println("Findings: ");for(InfoTypeStatsinfoTypeStat:result.getInfoTypeStatsList()){System.out.print("\tInfo type: "+infoTypeStat.getInfoType().getName());System.out.println("\tCount: "+infoTypeStat.getCount());}}}}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention libraryconstDLP=require('@google-cloud/dlp');// Instantiates a clientconstdlpClient=newDLP.DlpServiceClient();// The project ID to run the API call under// const projectId = 'your-project-id';// The name of the file in the bucket// const gcsPath = 'gcs-file-path';asyncfunctioninspectGCSSendToScc(){// Specify the storage configuration object with GCS URL.conststorageConfig={cloudStorageOptions:{fileSet:{url:gcsPath,},},};// Construct the info types to look for in the GCS file.constinfoTypes=[{name:'EMAIL_ADDRESS'},{name:'PERSON_NAME'},{name:'LOCATION'},{name:'PHONE_NUMBER'},];// Construct the inspection configuration.constinspectConfig={infoTypes,minLikelihood:DLP.protos.google.privacy.dlp.v2.Likelihood.UNLIKELY,limits:{maxFindingsPerItem:100,},};// Specify the action that is triggered when the job completes.constaction={publishSummaryToCscc:{},};// Configure the inspection job we want the service to perform.constjobConfig={inspectConfig,storageConfig,actions:[action],};// Construct the job creation request to be sent by the client.constrequest={parent:`projects/${projectId}/locations/global`,inspectJob:jobConfig,};// Send the job creation request and process the response.const[jobsResponse]=awaitdlpClient.createDlpJob(request);constjobName=jobsResponse.name;// Waiting for a maximum of 15 minutes for the job to get complete.letjob;letnumOfAttempts=30;while(numOfAttempts >0){// Fetch DLP Job status[job]=awaitdlpClient.getDlpJob({name:jobName});// Check if the job has completed.if(job.state==='DONE'){break;}if(job.state==='FAILED'){console.log('Job Failed, Please check the configuration.');return;}// Sleep for a short duration before checking the job status again.awaitnewPromise(resolve=>{setTimeout(()=>resolve(),30000);});numOfAttempts-=1;}// Print out the results.constinfoTypeStats=job.inspectDetails.result.infoTypeStats;if(infoTypeStats.length >0){infoTypeStats.forEach(infoTypeStat=>{console.log(`Found${infoTypeStat.count} instance(s) of infoType${infoTypeStat.infoType.name}.`);});}else{console.log('No findings.');}}awaitinspectGCSSendToScc();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Action;use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;use Google\Cloud\Dlp\V2\Client\DlpServiceClient;use Google\Cloud\Dlp\V2\CloudStorageOptions;use Google\Cloud\Dlp\V2\CloudStorageOptions\FileSet;use Google\Cloud\Dlp\V2\CreateDlpJobRequest;use Google\Cloud\Dlp\V2\DlpJob\JobState;use Google\Cloud\Dlp\V2\GetDlpJobRequest;use Google\Cloud\Dlp\V2\InfoType;use Google\Cloud\Dlp\V2\InspectConfig;use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;use Google\Cloud\Dlp\V2\InspectJobConfig;use Google\Cloud\Dlp\V2\Likelihood;use Google\Cloud\Dlp\V2\StorageConfig;/** * (GCS) Send Cloud DLP scan results to Security Command Center. * Using Cloud Data Loss Prevention to scan specific Google Cloud resources and send data to Security Command Center. * * @param string $callingProjectId  The project ID to run the API call under. * @param string $gcsUri            GCS file to be inspected. */function inspect_gcs_send_to_scc(    // TODO(developer): Replace sample parameters before running the code.    string $callingProjectId,    string $gcsUri = 'gs://GOOGLE_STORAGE_BUCKET_NAME/dlp_sample.csv'): void {    // Instantiate a client.    $dlp = new DlpServiceClient();    // Construct the items to be inspected.    $cloudStorageOptions = (new CloudStorageOptions())        ->setFileSet((new FileSet())            ->setUrl($gcsUri));    $storageConfig = (new StorageConfig())        ->setCloudStorageOptions(($cloudStorageOptions));    // Specify the type of info the inspection will look for.    $infoTypes = [        (new InfoType())->setName('EMAIL_ADDRESS'),        (new InfoType())->setName('PERSON_NAME'),        (new InfoType())->setName('LOCATION'),        (new InfoType())->setName('PHONE_NUMBER')    ];    // Specify how the content should be inspected.    $inspectConfig = (new InspectConfig())        ->setMinLikelihood(likelihood::UNLIKELY)        ->setLimits((new FindingLimits())            ->setMaxFindingsPerRequest(100))        ->setInfoTypes($infoTypes)        ->setIncludeQuote(true);    // Specify the action that is triggered when the job completes.    $action = (new Action())        ->setPublishSummaryToCscc(new PublishSummaryToCscc());    // Construct inspect job config to run.    $inspectJobConfig = (new InspectJobConfig())        ->setInspectConfig($inspectConfig)        ->setStorageConfig($storageConfig)        ->setActions([$action]);    // Send the job creation request and process the response.    $parent = "projects/$callingProjectId/locations/global";    $createDlpJobRequest = (new CreateDlpJobRequest())        ->setParent($parent)        ->setInspectJob($inspectJobConfig);    $job = $dlp->createDlpJob($createDlpJobRequest);    $numOfAttempts = 10;    do {        printf('Waiting for job to complete' . PHP_EOL);        sleep(10);        $getDlpJobRequest = (new GetDlpJobRequest())            ->setName($job->getName());        $job = $dlp->getDlpJob($getDlpJobRequest);        if ($job->getState() == JobState::DONE) {            break;        }        $numOfAttempts--;    } while ($numOfAttempts > 0);    // Print finding counts.    printf('Job %s status: %s' . PHP_EOL, $job->getName(), JobState::name($job->getState()));    switch ($job->getState()) {        case JobState::DONE:            $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();            if (count($infoTypeStats) === 0) {                printf('No findings.' . 
PHP_EOL);            } else {                foreach ($infoTypeStats as $infoTypeStat) {                    printf(                        '  Found %s instance(s) of infoType %s' . PHP_EOL,                        $infoTypeStat->getCount(),                        $infoTypeStat->getInfoType()->getName()                    );                }            }            break;        case JobState::FAILED:            printf('Job %s had errors:' . PHP_EOL, $job->getName());            $errors = $job->getErrors();            foreach ($errors as $error) {                var_dump($error->getDetails());            }            break;        case JobState::PENDING:            printf('Job has not completed. Consider a longer timeout or an asynchronous execution model' . PHP_EOL);            break;        default:            printf('Unexpected job state. Most likely, the job is either running or has not yet started.');    }}

Python

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import time
from typing import List

import google.cloud.dlp


def inspect_gcs_send_to_scc(
    project: str,
    bucket: str,
    info_types: List[str],
    max_findings: int = 100,
) -> None:
    """Uses the Data Loss Prevention API to inspect Google Cloud Storage
    data and send the results to Google Security Command Center.

    Args:
        project: The Google Cloud project id to use as a parent resource.
        bucket: The name of the GCS bucket containing the file, as a string.
        info_types: A list of strings representing infoTypes to inspect for.
            A full list of infoType categories can be fetched from the API.
        max_findings: The maximum number of findings to report; 0 = no maximum.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = f"gs://{bucket}"
    storage_config = {"cloud_storage_options": {"file_set": {"url": url}}}

    # Tell the API where to send a notification when the job is complete.
    actions = [{"publish_summary_to_cscc": {}}]

    # Construct the job definition.
    job = {
        "inspect_config": inspect_config,
        "storage_config": storage_config,
        "actions": actions,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_dlp_job(
        request={
            "parent": parent,
            "inspect_job": job,
        }
    )
    print(f"Inspection Job started : {response.name}")

    job_name = response.name

    # Waiting for maximum 15 minutes for the job to get complete.
    no_of_attempts = 30
    while no_of_attempts > 0:
        # Get the DLP job status.
        job = dlp.get_dlp_job(request={"name": job_name})

        # Check if the job has completed.
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.DONE:
            break
        elif job.state == google.cloud.dlp_v2.DlpJob.JobState.FAILED:
            print("Job Failed, Please check the configuration.")
            return

        # Sleep for a short duration before checking the job status again.
        time.sleep(30)
        no_of_attempts -= 1

    # Print out the results.
    print(f"Job name: {job.name}")
    result = job.inspect_details.result
    print("Processed Bytes: ", result.processed_bytes)
    if result.info_type_stats:
        for stats in result.info_type_stats:
            print(f"Info type: {stats.info_type.name}")
            print(f"Count: {stats.count}")
    else:
        print("No findings.")

Code samples: inspect a BigQuery table

This example demonstrates how to use the DLP API to create an inspection job that inspects a BigQuery table and sends findings to Security Command Center.

C#

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

usingSystem.Collections.Generic;usingGoogle.Api.Gax.ResourceNames;usingGoogle.Cloud.Dlp.V2;usingstaticGoogle.Cloud.Dlp.V2.InspectConfig.Types;publicclassInspectBigQueryWithSCCIntegration{publicstaticDlpJobSendBigQueryData(stringprojectId,LikelihoodminLikelihood=Likelihood.Unlikely,IEnumerable<InfoType>infoTypes=null){// Instantiate the dlp client.vardlp=DlpServiceClient.Create();// Construct the storage config by providing the table to be inspected.varstorageConfig=newStorageConfig{BigQueryOptions=newBigQueryOptions{TableReference=newBigQueryTable{ProjectId="bigquery-public-data",DatasetId="usa_names",TableId="usa_1910_current",}}};// Construct the inspect config by specifying the type of info to be inspected.varinspectConfig=newInspectConfig{InfoTypes={infoTypes??newInfoType[]{newInfoType{Name="EMAIL_ADDRESS"},newInfoType{Name="PERSON_NAME"}}},IncludeQuote=true,MinLikelihood=minLikelihood,Limits=newFindingLimits{MaxFindingsPerRequest=100}};// Construct the SCC action which will be performed after inspecting the source.varactions=newAction[]{newAction{PublishSummaryToCscc=newAction.Types.PublishSummaryToCscc()}};// Construct the inspect job config using storage config, inspect config and action.varinspectJob=newInspectJobConfig{StorageConfig=storageConfig,InspectConfig=inspectConfig,Actions={actions}};// Construct the request.varrequest=newCreateDlpJobRequest{ParentAsLocationName=newLocationName(projectId,"global"),InspectJob=inspectJob};// Call the API.DlpJobresponse=dlp.CreateDlpJob(request);System.Console.WriteLine($"Job created successfully. Job name: {response.Name}");returnresponse;}}

Go

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import("context""fmt""io"dlp"cloud.google.com/go/dlp/apiv2""cloud.google.com/go/dlp/apiv2/dlppb")// inspectBigQuerySendToScc configures the inspection job that instructs Cloud DLP to scan data stored in BigQuery,// and also instructs Cloud DLP to save its scan results to Security Command Center.funcinspectBigQuerySendToScc(wio.Writer,projectID,bigQueryDatasetId,bigQueryTableIdstring)error{// projectID := "my-project-id"// bigQueryDatasetId := "your-project-bigquery-dataset"// bigQueryTableId := "your-project-bigquery_table"ctx:=context.Background()// Initialize a client once and reuse it to send multiple requests. Clients// are safe to use across goroutines. When the client is no longer needed,// call the Close method to cleanup its resources.client,err:=dlp.NewClient(ctx)iferr!=nil{returnerr}// Closing the client safely cleans up background resources.deferclient.Close()// Specify the BigQuery table to be inspected.tableReference:=&dlppb.BigQueryTable{ProjectId:projectID,DatasetId:bigQueryDatasetId,TableId:bigQueryTableId,}bigQueryOptions:=&dlppb.BigQueryOptions{TableReference:tableReference,}// Specify the type of storage that you have configured.storageConfig:=&dlppb.StorageConfig{Type:&dlppb.StorageConfig_BigQueryOptions{BigQueryOptions:bigQueryOptions,},}// Specify the type of info the inspection will look for.// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types.infoTypes:=[]*dlppb.InfoType{{Name:"EMAIL_ADDRESS"},{Name:"PERSON_NAME"},{Name:"LOCATION"},{Name:"PHONE_NUMBER"},}// The minimum likelihood required before returning a match.minLikelihood:=dlppb.Likelihood_UNLIKELY// The maximum number of findings to report (0 = server maximum).findingLimits:=&dlppb.InspectConfig_FindingLimits{MaxFindingsPerItem:100,}// Specify how the content should be inspected.inspectConfig:=&dlppb.InspectConfig{InfoTypes:infoTypes,MinLikelihood:minLikelihood,Limits:findingLimits,IncludeQuote:true,}// Specify the action that is triggered when the job completes.action:=&dlppb.Action{Action:&dlppb.Action_PublishSummaryToCscc_{PublishSummaryToCscc:&dlppb.Action_PublishSummaryToCscc{},},}// Configure the inspection job we want the service to perform.inspectJobConfig:=&dlppb.InspectJobConfig{StorageConfig:storageConfig,InspectConfig:inspectConfig,Actions:[]*dlppb.Action{action,},}// Create the request for the job configured above.req:=&dlppb.CreateDlpJobRequest{Parent:fmt.Sprintf("projects/%s/locations/global",projectID),Job:&dlppb.CreateDlpJobRequest_InspectJob{InspectJob:inspectJobConfig,},}// Send the request.resp,err:=client.CreateDlpJob(ctx,req)iferr!=nil{returnerr}// Print the resultfmt.Fprintf(w,"Job created successfully: %v",resp.Name)returnnil}

Java

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

importcom.google.cloud.dlp.v2.DlpServiceClient;importcom.google.privacy.dlp.v2.Action;importcom.google.privacy.dlp.v2.BigQueryOptions;importcom.google.privacy.dlp.v2.BigQueryTable;importcom.google.privacy.dlp.v2.CreateDlpJobRequest;importcom.google.privacy.dlp.v2.DlpJob;importcom.google.privacy.dlp.v2.InfoType;importcom.google.privacy.dlp.v2.InfoTypeStats;importcom.google.privacy.dlp.v2.InspectConfig;importcom.google.privacy.dlp.v2.InspectDataSourceDetails;importcom.google.privacy.dlp.v2.InspectJobConfig;importcom.google.privacy.dlp.v2.Likelihood;importcom.google.privacy.dlp.v2.LocationName;importcom.google.privacy.dlp.v2.StorageConfig;importjava.util.List;importjava.util.concurrent.TimeUnit;importjava.util.stream.Collectors;importjava.util.stream.Stream;publicclassInspectBigQuerySendToScc{privatestaticfinalintTIMEOUT_MINUTES=15;publicstaticvoidmain(String[]args)throwsException{// TODO(developer): Replace these variables before running the sample.// The Google Cloud project id to use as a parent resource.StringprojectId="your-project-id";// The BigQuery dataset id to be used and the reference table name to be inspected.StringbigQueryDatasetId="your-project-bigquery-dataset";StringbigQueryTableId="your-project-bigquery_table";inspectBigQuerySendToScc(projectId,bigQueryDatasetId,bigQueryTableId);}// Inspects a BigQuery Table to send data to Security Command Center.publicstaticvoidinspectBigQuerySendToScc(StringprojectId,StringbigQueryDatasetId,StringbigQueryTableId)throwsException{// Initialize client that will be used to send requests. This client only needs to be created// once, and can be reused for multiple requests. After completing all of your requests, call// the "close" method on the client to safely clean up any remaining background resources.try(DlpServiceClientdlpServiceClient=DlpServiceClient.create()){// Specify the BigQuery table to be inspected.BigQueryTabletableReference=BigQueryTable.newBuilder().setProjectId(projectId).setDatasetId(bigQueryDatasetId).setTableId(bigQueryTableId).build();BigQueryOptionsbigQueryOptions=BigQueryOptions.newBuilder().setTableReference(tableReference).build();StorageConfigstorageConfig=StorageConfig.newBuilder().setBigQueryOptions(bigQueryOptions).build();// Specify the type of info the inspection will look for.List<InfoType>infoTypes=Stream.of("EMAIL_ADDRESS","PERSON_NAME","LOCATION","PHONE_NUMBER").map(it->InfoType.newBuilder().setName(it).build()).collect(Collectors.toList());// The minimum likelihood required before returning a match.LikelihoodminLikelihood=Likelihood.UNLIKELY;// The maximum number of findings to report (0 = server maximum)InspectConfig.FindingLimitsfindingLimits=InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();// Specify how the content should be inspected.InspectConfiginspectConfig=InspectConfig.newBuilder().addAllInfoTypes(infoTypes).setIncludeQuote(true).setMinLikelihood(minLikelihood).setLimits(findingLimits).build();// Specify the action that is triggered when the job completes.Action.PublishSummaryToCsccpublishSummaryToCscc=Action.PublishSummaryToCscc.getDefaultInstance();Actionaction=Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();// Configure the inspection job we want the service to perform.InspectJobConfiginspectJobConfig=InspectJobConfig.newBuilder().setInspectConfig(inspectConfig).setStorageConfig(storageConfig).addActions(action).build();// Construct the job creation request to be sent by the 
client.CreateDlpJobRequestcreateDlpJobRequest=CreateDlpJobRequest.newBuilder().setParent(LocationName.of(projectId,"global").toString()).setInspectJob(inspectJobConfig).build();// Send the job creation request and process the response.DlpJobresponse=dlpServiceClient.createDlpJob(createDlpJobRequest);// Get the current time.longstartTime=System.currentTimeMillis();// Check if the job state is DONE.while(response.getState()!=DlpJob.JobState.DONE){// Sleep for 30 second.Thread.sleep(30000);// Get the updated job status.response=dlpServiceClient.getDlpJob(response.getName());// Check if the timeout duration has exceeded.longelapsedTime=System.currentTimeMillis()-startTime;if(TimeUnit.MILLISECONDS.toMinutes(elapsedTime)>=TIMEOUT_MINUTES){System.out.printf("Job did not complete within %d minutes.%n",TIMEOUT_MINUTES);break;}}// Print the results.System.out.println("Job status: "+response.getState());System.out.println("Job name: "+response.getName());InspectDataSourceDetails.Resultresult=response.getInspectDetails().getResult();System.out.println("Findings: ");for(InfoTypeStatsinfoTypeStat:result.getInfoTypeStatsList()){System.out.print("\tInfo type: "+infoTypeStat.getInfoType().getName());System.out.println("\tCount: "+infoTypeStat.getCount());}}}}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention libraryconstDLP=require('@google-cloud/dlp');// Instantiates a clientconstdlp=newDLP.DlpServiceClient();// The project ID to run the API call under.// const projectId = "your-project-id";// The project ID the table is stored under// This may or (for public datasets) may not equal the calling project ID// const dataProjectId = 'my-project';// The ID of the dataset to inspect, e.g. 'my_dataset'// const datasetId = 'my_dataset';// The ID of the table to inspect, e.g. 'my_table'// const tableId = 'my_table';asyncfunctioninspectBigQuerySendToScc(){// Specify the storage configuration object with big query table.conststorageItem={bigQueryOptions:{tableReference:{projectId:dataProjectId,datasetId:datasetId,tableId:tableId,},},};// Specify the type of info the inspection will look for.constinfoTypes=[{name:'EMAIL_ADDRESS'},{name:'PERSON_NAME'},{name:'LOCATION'},{name:'PHONE_NUMBER'},];// Construct inspect configuration.constinspectConfig={infoTypes:infoTypes,includeQuote:true,minLikelihood:DLP.protos.google.privacy.dlp.v2.Likelihood.UNLIKELY,limits:{maxFindingsPerItem:100,},};// Specify the action that is triggered when the job completes.constaction={publishSummaryToCscc:{enable:true,},};// Configure the inspection job we want the service to perform.constinspectJobConfig={inspectConfig:inspectConfig,storageConfig:storageItem,actions:[action],};// Construct the job creation request to be sent by the client.constrequest={parent:`projects/${projectId}/locations/global`,inspectJob:inspectJobConfig,};// Send the job creation request and process the response.const[jobsResponse]=awaitdlp.createDlpJob(request);constjobName=jobsResponse.name;// Waiting for a maximum of 15 minutes for the job to get complete.letjob;letnumOfAttempts=30;while(numOfAttempts >0){// Fetch DLP Job status[job]=awaitdlp.getDlpJob({name:jobName});// Check if the job has completed.if(job.state==='DONE'){break;}if(job.state==='FAILED'){console.log('Job Failed, Please check the configuration.');return;}// Sleep for a short duration before checking the job status again.awaitnewPromise(resolve=>{setTimeout(()=>resolve(),30000);});numOfAttempts-=1;}// Print out the results.constinfoTypeStats=job.inspectDetails.result.infoTypeStats;if(infoTypeStats.length >0){infoTypeStats.forEach(infoTypeStat=>{console.log(`  Found${infoTypeStat.count} instance(s) of infoType${infoTypeStat.infoType.name}.`);});}else{console.log('No findings.');}}awaitinspectBigQuerySendToScc();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Action;use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;use Google\Cloud\Dlp\V2\BigQueryOptions;use Google\Cloud\Dlp\V2\BigQueryTable;use Google\Cloud\Dlp\V2\Client\DlpServiceClient;use Google\Cloud\Dlp\V2\CreateDlpJobRequest;use Google\Cloud\Dlp\V2\DlpJob\JobState;use Google\Cloud\Dlp\V2\GetDlpJobRequest;use Google\Cloud\Dlp\V2\InfoType;use Google\Cloud\Dlp\V2\InspectConfig;use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;use Google\Cloud\Dlp\V2\InspectJobConfig;use Google\Cloud\Dlp\V2\Likelihood;use Google\Cloud\Dlp\V2\StorageConfig;/** * (BIGQUERY) Send Cloud DLP scan results to Security Command Center. * Using Cloud Data Loss Prevention to scan specific Google Cloud resources and send data to Security Command Center. * * @param string $callingProjectId  The project ID to run the API call under. * @param string $projectId         The ID of the Project. * @param string $datasetId         The ID of the BigQuery Dataset. * @param string $tableId           The ID of the BigQuery Table to be inspected. */function inspect_bigquery_send_to_scc(    // TODO(developer): Replace sample parameters before running the code.    string $callingProjectId,    string $projectId,    string $datasetId,    string $tableId): void {    // Instantiate a client.    $dlp = new DlpServiceClient();    // Construct the items to be inspected.    $bigqueryTable = (new BigQueryTable())        ->setProjectId($projectId)        ->setDatasetId($datasetId)        ->setTableId($tableId);    $bigQueryOptions = (new BigQueryOptions())        ->setTableReference($bigqueryTable);    $storageConfig = (new StorageConfig())        ->setBigQueryOptions(($bigQueryOptions));    // Specify the type of info the inspection will look for.    $infoTypes = [        (new InfoType())->setName('EMAIL_ADDRESS'),        (new InfoType())->setName('PERSON_NAME'),        (new InfoType())->setName('LOCATION'),        (new InfoType())->setName('PHONE_NUMBER')    ];    // Specify how the content should be inspected.    $inspectConfig = (new InspectConfig())        ->setMinLikelihood(likelihood::UNLIKELY)        ->setLimits((new FindingLimits())            ->setMaxFindingsPerRequest(100))        ->setInfoTypes($infoTypes)        ->setIncludeQuote(true);    // Specify the action that is triggered when the job completes.    $action = (new Action())        ->setPublishSummaryToCscc(new PublishSummaryToCscc());    // Configure the inspection job we want the service to perform.    $inspectJobConfig = (new InspectJobConfig())        ->setInspectConfig($inspectConfig)        ->setStorageConfig($storageConfig)        ->setActions([$action]);    // Send the job creation request and process the response.    $parent = "projects/$callingProjectId/locations/global";    $createDlpJobRequest = (new CreateDlpJobRequest())        ->setParent($parent)        ->setInspectJob($inspectJobConfig);    $job = $dlp->createDlpJob($createDlpJobRequest);    $numOfAttempts = 10;    do {        printf('Waiting for job to complete' . PHP_EOL);        sleep(10);        $getDlpJobRequest = (new GetDlpJobRequest())            ->setName($job->getName());        $job = $dlp->getDlpJob($getDlpJobRequest);        if ($job->getState() == JobState::DONE) {            break;        }        $numOfAttempts--;    } while ($numOfAttempts > 0);    // Print finding counts.    printf('Job %s status: %s' . 
PHP_EOL, $job->getName(), JobState::name($job->getState()));    switch ($job->getState()) {        case JobState::DONE:            $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();            if (count($infoTypeStats) === 0) {                printf('No findings.' . PHP_EOL);            } else {                foreach ($infoTypeStats as $infoTypeStat) {                    printf(                        '  Found %s instance(s) of infoType %s' . PHP_EOL,                        $infoTypeStat->getCount(),                        $infoTypeStat->getInfoType()->getName()                    );                }            }            break;        case JobState::FAILED:            printf('Job %s had errors:' . PHP_EOL, $job->getName());            $errors = $job->getErrors();            foreach ($errors as $error) {                var_dump($error->getDetails());            }            break;        case JobState::PENDING:            printf('Job has not completed. Consider a longer timeout or an asynchronous execution model' . PHP_EOL);            break;        default:            printf('Unexpected job state. Most likely, the job is either running or has not yet started.');    }}

Python

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import time
from typing import List

import google.cloud.dlp


def inspect_bigquery_send_to_scc(
    project: str,
    info_types: List[str],
    max_findings: int = 100,
) -> None:
    """Uses the Data Loss Prevention API to inspect a public BigQuery dataset
    and send the results to Google Security Command Center.

    Args:
        project: The Google Cloud project id to use as a parent resource.
        info_types: A list of strings representing infoTypes to inspect for.
            A full list of infoType categories can be fetched from the API.
        max_findings: The maximum number of findings to report; 0 = no maximum.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a storage configuration dictionary with the BigQuery options.
    storage_config = {
        "big_query_options": {
            "table_reference": {
                "project_id": "bigquery-public-data",
                "dataset_id": "usa_names",
                "table_id": "usa_1910_current",
            }
        }
    }

    # Tell the API where to send a notification when the job is complete.
    actions = [{"publish_summary_to_cscc": {}}]

    # Construct the job definition.
    job = {
        "inspect_config": inspect_config,
        "storage_config": storage_config,
        "actions": actions,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_dlp_job(
        request={
            "parent": parent,
            "inspect_job": job,
        }
    )
    print(f"Inspection Job started : {response.name}")

    job_name = response.name

    # Waiting for a maximum of 15 minutes for the job to get complete.
    no_of_attempts = 30
    while no_of_attempts > 0:
        # Get the DLP job status.
        job = dlp.get_dlp_job(request={"name": job_name})

        # Check if the job has completed.
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.DONE:
            break
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.FAILED:
            print("Job Failed, Please check the configuration.")
            return

        # Sleep for a short duration before checking the job status again.
        time.sleep(30)
        no_of_attempts -= 1

    # Print out the results.
    print(f"Job name: {job.name}")
    result = job.inspect_details.result
    if result.info_type_stats:
        for stats in result.info_type_stats:
            print(f"Info type: {stats.info_type.name}")
            print(f"Count: {stats.count}")
    else:
        print("No findings.")

Code samples: inspect a Datastore kind

This example demonstrates how to use the DLP API to create an inspection job that inspects a Datastore kind and sends findings to Security Command Center.

C#

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

using System.Collections.Generic;
using System.Linq;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using static Google.Cloud.Dlp.V2.InspectConfig.Types;

public class InspectDataStoreJobWithSCCIntegration
{
    public static DlpJob SendInspectDatastoreToSCC(
        string projectId,
        string kindName,
        string namespaceId,
        Likelihood minLikelihood = Likelihood.Unlikely,
        IEnumerable<InfoType> infoTypes = null)
    {
        // Instantiate the dlp client.
        var dlp = DlpServiceClient.Create();

        // Specify the Datastore entity to be inspected and construct the storage
        // config. The NamespaceId identifies the partition entity, and the Datastore
        // kind defines the data set.
        var storageConfig = new StorageConfig
        {
            DatastoreOptions = new DatastoreOptions
            {
                Kind = new KindExpression { Name = kindName },
                PartitionId = new PartitionId
                {
                    NamespaceId = namespaceId,
                    ProjectId = projectId
                }
            }
        };

        // Specify the type of info to be inspected and construct the inspect config.
        var inspectConfig = new InspectConfig
        {
            InfoTypes =
            {
                infoTypes ?? new InfoType[]
                {
                    new InfoType { Name = "EMAIL_ADDRESS" },
                    new InfoType { Name = "PERSON_NAME" },
                    new InfoType { Name = "LOCATION" },
                    new InfoType { Name = "PHONE_NUMBER" }
                }
            },
            IncludeQuote = true,
            MinLikelihood = minLikelihood,
            Limits = new FindingLimits { MaxFindingsPerRequest = 100 }
        };

        // Construct the SCC action which will be performed after inspecting the datastore.
        var actions = new Action[]
        {
            new Action { PublishSummaryToCscc = new Action.Types.PublishSummaryToCscc() }
        };

        // Construct the inspect job config using storage config, inspect config and action.
        var inspectJob = new InspectJobConfig
        {
            StorageConfig = storageConfig,
            InspectConfig = inspectConfig,
            Actions = { actions }
        };

        // Construct the request.
        var request = new CreateDlpJobRequest
        {
            ParentAsLocationName = new LocationName(projectId, "global"),
            InspectJob = inspectJob
        };

        // Call the API.
        DlpJob response = dlp.CreateDlpJob(request);
        return response;
    }
}

Go

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// inspectDataStoreSendToScc inspects sensitive data in a Datastore kind
// and sends the results to Google Cloud Security Command Center (SCC).
func inspectDataStoreSendToScc(w io.Writer, projectID, datastoreNamespace, datastoreKind string) error {
	// projectID := "my-project-id"
	// datastoreNamespace := "your-datastore-namespace"
	// datastoreKind := "your-datastore-kind"

	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}
	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the Datastore entity to be inspected.
	partitionId := &dlppb.PartitionId{
		ProjectId:   projectID,
		NamespaceId: datastoreNamespace,
	}

	// kindExpression specifies the kind (or range of kinds) of data to inspect.
	kindExpression := &dlppb.KindExpression{
		Name: datastoreKind,
	}

	// datastoreOptions holds the configuration options for inspecting data in
	// Google Cloud Datastore.
	datastoreOptions := &dlppb.DatastoreOptions{
		PartitionId: partitionId,
		Kind:        kindExpression,
	}

	// storageConfig represents the configuration settings for inspecting data
	// in different storage types, such as BigQuery and Cloud Storage.
	storageConfig := &dlppb.StorageConfig{
		Type: &dlppb.StorageConfig_DatastoreOptions{
			DatastoreOptions: datastoreOptions,
		},
	}

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for a complete list of info types.
	infoTypes := []*dlppb.InfoType{
		{Name: "EMAIL_ADDRESS"},
		{Name: "PERSON_NAME"},
		{Name: "LOCATION"},
		{Name: "PHONE_NUMBER"},
	}

	// The minimum likelihood required before returning a match.
	minLikelihood := dlppb.Likelihood_UNLIKELY

	// The maximum number of findings to report (0 = server maximum).
	findingLimits := &dlppb.InspectConfig_FindingLimits{
		MaxFindingsPerItem: 100,
	}

	inspectConfig := &dlppb.InspectConfig{
		InfoTypes:     infoTypes,
		MinLikelihood: minLikelihood,
		Limits:        findingLimits,
		IncludeQuote:  true,
	}

	// Specify the action that is triggered when the job completes.
	action := &dlppb.Action{
		Action: &dlppb.Action_PublishSummaryToCscc_{
			PublishSummaryToCscc: &dlppb.Action_PublishSummaryToCscc{},
		},
	}

	// Configure the inspection job we want the service to perform.
	inspectJobConfig := &dlppb.InspectJobConfig{
		StorageConfig: storageConfig,
		InspectConfig: inspectConfig,
		Actions: []*dlppb.Action{
			action,
		},
	}

	// Create the request for the job configured above.
	req := &dlppb.CreateDlpJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Job: &dlppb.CreateDlpJobRequest_InspectJob{
			InspectJob: inspectJobConfig,
		},
	}

	// Send the request.
	resp, err := client.CreateDlpJob(ctx, req)
	if err != nil {
		return err
	}

	// Print the result.
	fmt.Fprintf(w, "Job created successfully: %v", resp.Name)
	return nil
}

Java

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.Action;
import com.google.privacy.dlp.v2.CreateDlpJobRequest;
import com.google.privacy.dlp.v2.DatastoreOptions;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InfoTypeStats;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectDataSourceDetails;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.KindExpression;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.PartitionId;
import com.google.privacy.dlp.v2.StorageConfig;
import java.io.IOException;
import java.util.List;
import java.util.concurrent.TimeUnit;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class InspectDatastoreSendToScc {

  private static final int TIMEOUT_MINUTES = 15;

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    // The Google Cloud project id to use as a parent resource.
    String projectId = "your-project-id";
    // The namespace specifier to be used for the partition entity.
    String datastoreNamespace = "your-datastore-namespace";
    // The datastore kind defining a data set.
    String datastoreKind = "your-datastore-kind";
    inspectDatastoreSendToScc(projectId, datastoreNamespace, datastoreKind);
  }

  // Creates a DLP job to scan the sample data stored in a Datastore kind and save its scan results
  // to Security Command Center.
  public static void inspectDatastoreSendToScc(
      String projectId, String datastoreNamespace, String datastoreKind)
      throws IOException, InterruptedException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Specify the Datastore entity to be inspected.
      PartitionId partitionId =
          PartitionId.newBuilder()
              .setProjectId(projectId)
              .setNamespaceId(datastoreNamespace)
              .build();

      KindExpression kindExpression = KindExpression.newBuilder().setName(datastoreKind).build();

      DatastoreOptions datastoreOptions =
          DatastoreOptions.newBuilder().setKind(kindExpression).setPartitionId(partitionId).build();

      StorageConfig storageConfig =
          StorageConfig.newBuilder().setDatastoreOptions(datastoreOptions).build();

      // Specify the type of info the inspection will look for.
      List<InfoType> infoTypes =
          Stream.of("EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());

      // The minimum likelihood required before returning a match.
      Likelihood minLikelihood = Likelihood.UNLIKELY;

      // The maximum number of findings to report (0 = server maximum).
      InspectConfig.FindingLimits findingLimits =
          InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();

      // Specify how the content should be inspected.
      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addAllInfoTypes(infoTypes)
              .setIncludeQuote(true)
              .setMinLikelihood(minLikelihood)
              .setLimits(findingLimits)
              .build();

      // Specify the action that is triggered when the job completes.
      Action.PublishSummaryToCscc publishSummaryToCscc =
          Action.PublishSummaryToCscc.getDefaultInstance();
      Action action = Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .addActions(action)
              .build();

      // Construct the job creation request to be sent by the client.
      CreateDlpJobRequest createDlpJobRequest =
          CreateDlpJobRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setInspectJob(inspectJobConfig)
              .build();

      // Send the job creation request and process the response.
      DlpJob response = dlpServiceClient.createDlpJob(createDlpJobRequest);

      // Get the current time.
      long startTime = System.currentTimeMillis();

      // Check if the job state is DONE.
      while (response.getState() != DlpJob.JobState.DONE) {
        // Sleep for 30 seconds.
        Thread.sleep(30000);

        // Get the updated job status.
        response = dlpServiceClient.getDlpJob(response.getName());

        // Check if the timeout duration has been exceeded.
        long elapsedTime = System.currentTimeMillis() - startTime;
        if (TimeUnit.MILLISECONDS.toMinutes(elapsedTime) >= TIMEOUT_MINUTES) {
          System.out.printf("Job did not complete within %d minutes.%n", TIMEOUT_MINUTES);
          break;
        }
      }

      // Print the results.
      System.out.println("Job status: " + response.getState());
      System.out.println("Job name: " + response.getName());
      InspectDataSourceDetails.Result result = response.getInspectDetails().getResult();
      System.out.println("Findings: ");
      for (InfoTypeStats infoTypeStat : result.getInfoTypeStatsList()) {
        System.out.print("\tInfo type: " + infoTypeStat.getInfoType().getName());
        System.out.println("\tCount: " + infoTypeStat.getCount());
      }
    }
  }
}

Node.js

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under.
// const projectId = "your-project-id";

// Datastore namespace
// const datastoreNamespace = 'datastore-namespace';

// Datastore kind
// const datastoreKind = 'datastore-kind';

async function inspectDatastoreSendToScc() {
  // Specify the storage configuration object with datastore.
  const storageConfig = {
    datastoreOptions: {
      kind: {
        name: datastoreKind,
      },
      partitionId: {
        projectId: projectId,
        namespaceId: datastoreNamespace,
      },
    },
  };

  // Construct the info types to look for in the datastore.
  const infoTypes = [
    {name: 'EMAIL_ADDRESS'},
    {name: 'PERSON_NAME'},
    {name: 'LOCATION'},
    {name: 'PHONE_NUMBER'},
  ];

  // Construct the inspection configuration.
  const inspectConfig = {
    infoTypes: infoTypes,
    minLikelihood: DLP.protos.google.privacy.dlp.v2.Likelihood.UNLIKELY,
    limits: {
      maxFindingsPerItem: 100,
    },
    includeQuote: true,
  };

  // Specify the action that is triggered when the job completes.
  const action = {
    publishSummaryToCscc: {enable: true},
  };

  // Configure the inspection job we want the service to perform.
  const inspectJobConfig = {
    inspectConfig: inspectConfig,
    storageConfig: storageConfig,
    actions: [action],
  };

  // Construct the job creation request to be sent by the client.
  const request = {
    parent: `projects/${projectId}/locations/global`,
    inspectJob: inspectJobConfig,
  };

  // Send the job creation request and process the response.
  const [jobsResponse] = await dlp.createDlpJob(request);
  const jobName = jobsResponse.name;

  // Wait for a maximum of 15 minutes for the job to complete.
  let job;
  let numOfAttempts = 30;
  while (numOfAttempts > 0) {
    // Fetch DLP job status.
    [job] = await dlp.getDlpJob({name: jobName});

    // Check if the job has completed.
    if (job.state === 'DONE') {
      break;
    }
    if (job.state === 'FAILED') {
      console.log('Job Failed, Please check the configuration.');
      return;
    }

    // Sleep for a short duration before checking the job status again.
    await new Promise(resolve => {
      setTimeout(() => resolve(), 30000);
    });
    numOfAttempts -= 1;
  }

  // Print out the results.
  const infoTypeStats = job.inspectDetails.result.infoTypeStats;
  if (infoTypeStats.length > 0) {
    infoTypeStats.forEach(infoTypeStat => {
      console.log(
        `Found ${infoTypeStat.count} instance(s) of infoType ${infoTypeStat.infoType.name}.`
      );
    });
  } else {
    console.log('No findings.');
  }
}

await inspectDatastoreSendToScc();

PHP

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

use Google\Cloud\Dlp\V2\Action;
use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CreateDlpJobRequest;
use Google\Cloud\Dlp\V2\DatastoreOptions;
use Google\Cloud\Dlp\V2\DlpJob\JobState;
use Google\Cloud\Dlp\V2\GetDlpJobRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\KindExpression;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\PartitionId;
use Google\Cloud\Dlp\V2\StorageConfig;

/**
 * (DATASTORE) Send Cloud DLP scan results to Security Command Center.
 * Uses Cloud Data Loss Prevention to scan specific Google Cloud resources and send data to Security Command Center.
 *
 * @param string $callingProjectId  The project ID to run the API call under.
 * @param string $kindName          Datastore kind name to be inspected.
 * @param string $namespaceId       Namespace ID to be inspected.
 */
function inspect_datastore_send_to_scc(
    string $callingProjectId,
    string $kindName,
    string $namespaceId
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Construct the items to be inspected.
    $datastoreOptions = (new DatastoreOptions())
        ->setKind((new KindExpression())
            ->setName($kindName))
        ->setPartitionId((new PartitionId())
            ->setNamespaceId($namespaceId)
            ->setProjectId($callingProjectId));

    $storageConfig = (new StorageConfig())
        ->setDatastoreOptions(($datastoreOptions));

    // Specify the type of info the inspection will look for.
    $infoTypes = [
        (new InfoType())->setName('EMAIL_ADDRESS'),
        (new InfoType())->setName('PERSON_NAME'),
        (new InfoType())->setName('LOCATION'),
        (new InfoType())->setName('PHONE_NUMBER')
    ];

    // Specify how the content should be inspected.
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood(Likelihood::UNLIKELY)
        ->setLimits((new FindingLimits())
            ->setMaxFindingsPerRequest(100))
        ->setInfoTypes($infoTypes)
        ->setIncludeQuote(true);

    // Specify the action that is triggered when the job completes.
    $action = (new Action())
        ->setPublishSummaryToCscc(new PublishSummaryToCscc());

    // Construct inspect job config to run.
    $inspectJobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig)
        ->setActions([$action]);

    // Send the job creation request and process the response.
    $parent = "projects/$callingProjectId/locations/global";
    $createDlpJobRequest = (new CreateDlpJobRequest())
        ->setParent($parent)
        ->setInspectJob($inspectJobConfig);
    $job = $dlp->createDlpJob($createDlpJobRequest);

    $numOfAttempts = 10;
    do {
        printf('Waiting for job to complete' . PHP_EOL);
        sleep(10);
        $getDlpJobRequest = (new GetDlpJobRequest())
            ->setName($job->getName());
        $job = $dlp->getDlpJob($getDlpJobRequest);
        if ($job->getState() == JobState::DONE) {
            break;
        }
        $numOfAttempts--;
    } while ($numOfAttempts > 0);

    // Print finding counts.
    printf(
        'Job %s status: %s' . PHP_EOL,
        $job->getName(),
        JobState::name($job->getState())
    );
    switch ($job->getState()) {
        case JobState::DONE:
            $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();
            if (count($infoTypeStats) === 0) {
                printf('No findings.' . PHP_EOL);
            } else {
                foreach ($infoTypeStats as $infoTypeStat) {
                    printf(
                        '  Found %s instance(s) of infoType %s' . PHP_EOL,
                        $infoTypeStat->getCount(),
                        $infoTypeStat->getInfoType()->getName()
                    );
                }
            }
            break;
        case JobState::FAILED:
            printf('Job %s had errors:' . PHP_EOL, $job->getName());
            $errors = $job->getErrors();
            foreach ($errors as $error) {
                var_dump($error->getDetails());
            }
            break;
        case JobState::PENDING:
            printf('Job has not completed. Consider a longer timeout or an asynchronous execution model' . PHP_EOL);
            break;
        default:
            printf('Unexpected job state. Most likely, the job is either running or has not yet started.');
    }
}

Python

To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.

To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import time
from typing import List, Optional

import google.cloud.dlp


def inspect_datastore_send_to_scc(
    project: str,
    datastore_project: str,
    kind: str,
    info_types: List[str],
    namespace_id: Optional[str] = None,
    max_findings: int = 100,
) -> None:
    """
    Uses the Data Loss Prevention API to inspect Datastore data and
    send the results to Google Security Command Center.

    Args:
        project: The Google Cloud project id to use as a parent resource.
        datastore_project: The Google Cloud project id of the target Datastore.
        kind: The kind of the Datastore entity to inspect, e.g. 'Person'.
        info_types: A list of strings representing infoTypes to inspect for.
            A full list of infoType categories can be fetched from the API.
        namespace_id: The namespace of the Datastore document, if applicable.
        max_findings: The maximum number of findings to report; 0 = no maximum.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a storage config dictionary with the Datastore options.
    storage_config = {
        "datastore_options": {
            "partition_id": {
                "project_id": datastore_project,
                "namespace_id": namespace_id,
            },
            "kind": {"name": kind},
        }
    }

    # Tell the API to publish a summary to Security Command Center when the
    # job is complete.
    actions = [{"publish_summary_to_cscc": {}}]

    # Construct the job definition.
    job = {
        "inspect_config": inspect_config,
        "storage_config": storage_config,
        "actions": actions,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_dlp_job(
        request={
            "parent": parent,
            "inspect_job": job,
        }
    )
    print(f"Inspection Job started : {response.name}")

    job_name = response.name

    # Wait for a maximum of 15 minutes for the job to complete.
    no_of_attempts = 30
    while no_of_attempts > 0:
        # Get the DLP job status.
        job = dlp.get_dlp_job(request={"name": job_name})

        # Check if the job has completed.
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.DONE:
            break
        if job.state == google.cloud.dlp_v2.DlpJob.JobState.FAILED:
            print("Job Failed, Please check the configuration.")
            return

        # Sleep for a short duration before checking the job status again.
        time.sleep(30)
        no_of_attempts -= 1

    # Print out the results.
    print(f"Job name: {job.name}")
    result = job.inspect_details.result
    if result.info_type_stats:
        for stats in result.info_type_stats:
            print(f"Info type: {stats.info_type.name}")
            print(f"Count: {stats.count}")
    else:
        print("No findings.")

View Sensitive Data Protection scan results in Security Command Center

Because you instructed Sensitive Data Protection to send its inspection job results to Security Command Center, you can now view the results of the inspection job in Security Command Center:

  1. In the Google Cloud console, go to the Security Command Center Findings page.

    Go to Findings

  2. Select the organization for which you enabled Security Command Center.
  3. In the Query editor field, enter the following to query for findings from Sensitive Data Protection:

    state="ACTIVE" AND NOT mute="MUTED" AND (parent_display_name="Sensitive Data Protection" OR parent_display_name="Cloud Data Loss Prevention")

    For more information about the query editor, see Edit a findings query in the Google Cloud console.

    If any findings were sent from Sensitive Data Protection, the findings appear in the findings list. The list includes all findings from Sensitive Data Protection, which can include findings from inspection jobs and discovery (data profiling) operations. If you prefer to query the findings programmatically, see the sketch after this list.
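The following Python sketch shows one possible programmatic equivalent of the console query above, using the Security Command Center client library (google-cloud-securitycenter). The organization ID is a placeholder, and it assumes that the filter fields shown in the query editor are also accepted by the list_findings API.

# A minimal sketch, assuming a hypothetical organization ID and that the
# console query fields above are valid API filter fields.
from google.cloud import securitycenter

ORGANIZATION_ID = "123456789012"  # placeholder; replace with your organization ID

client = securitycenter.SecurityCenterClient()

# "sources/-" queries findings across all sources in the organization.
all_sources = f"organizations/{ORGANIZATION_ID}/sources/-"

filter_expression = (
    'state="ACTIVE" AND NOT mute="MUTED" AND '
    '(parent_display_name="Sensitive Data Protection" '
    'OR parent_display_name="Cloud Data Loss Prevention")'
)

# Print the category and affected resource of each matching finding.
for result in client.list_findings(
    request={"parent": all_sources, "filter": filter_expression}
):
    finding = result.finding
    print(finding.category, finding.resource_name)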

The instructions provided in this guide enable only some of Sensitive Data Protection's built-in detectors.

Clean up

To avoid incurring charges to your Google Cloud account for the resources used in this topic:

Delete the project

The easiest way to eliminate billing is to delete the project you created while following the instructions provided in this topic. You can do this in the Google Cloud console, as described in the following steps, or with the client-library sketch after them.

    Caution: Deleting a project has the following effects:
    • Everything in the project is deleted. If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.
    • Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project.

    If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects can help you avoid exceeding project quota limits.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.
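If you'd rather script this cleanup, the following sketch shows one way to request project deletion with the Resource Manager client library (google-cloud-resource-manager). The project ID is a placeholder, and the same cautions about permanent deletion apply.

# A minimal sketch, assuming the google-cloud-resource-manager library and a
# placeholder project ID.
from google.cloud import resourcemanager_v3

client = resourcemanager_v3.ProjectsClient()

# Requests deletion; the project is scheduled for deletion and is fully
# removed after the standard 30-day recovery window.
operation = client.delete_project(name="projects/PROJECT_ID")  # placeholder
operation.result()
print("Project scheduled for deletion.")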

If you delete your project using this method, the Sensitive Data Protection job and Cloud Storage bucket that you created are also deleted, so you don't need to follow the instructions in the following sections.

Delete the Sensitive Data Protection job

If you scanned your own data, you need to delete only the inspection job that you created:

  1. Go to APIs Explorer on the reference page for the dlpJobs.delete method by clicking the following button:

    Open APIs Explorer

  2. In the name box, type the name of the job from the JSON response to the scan request, which has the following form:
    projects/PROJECT_ID/dlpJobs/JOB_ID
    The job ID is in the form of i-1234567890123456789. A client-library alternative is sketched after these steps.
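If you prefer the client library over APIs Explorer, a minimal Python sketch follows. It assumes a placeholder job name in the form shown above; dlpJobs.delete corresponds to DlpServiceClient.delete_dlp_job in the Python client.

# A minimal sketch, assuming placeholder PROJECT_ID and job ID values.
import google.cloud.dlp

dlp = google.cloud.dlp_v2.DlpServiceClient()

# Use the full job resource name returned when you created the job.
job_name = "projects/PROJECT_ID/dlpJobs/i-1234567890123456789"  # placeholder
dlp.delete_dlp_job(request={"name": job_name})
print(f"Deleted job: {job_name}")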

If you created additional inspection jobs or if you want to make sure you've deleted the job successfully, you can list all existing jobs:

  1. Go to APIs Explorer on the reference page for the dlpJobs.list method by clicking the following button:

    Open APIs Explorer

  2. In the parent box, type the project identifier in the following form:
    projects/PROJECT_ID
  3. Click Execute.

If there are no jobs listed in the response, you've deleted all jobs. If jobs are listed in the response, repeat the deletion procedure for those jobs.
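The same check can be scripted. The following Python sketch, assuming a placeholder project ID, lists any remaining jobs with DlpServiceClient.list_dlp_jobs, the client-library counterpart of dlpJobs.list.

# A minimal sketch, assuming a placeholder PROJECT_ID.
import google.cloud.dlp

dlp = google.cloud.dlp_v2.DlpServiceClient()

parent = "projects/PROJECT_ID"  # placeholder
jobs = list(dlp.list_dlp_jobs(request={"parent": parent}))

if not jobs:
    print("All jobs have been deleted.")
else:
    for job in jobs:
        print(f"Remaining job: {job.name} (state: {job.state.name})")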

Delete the Cloud Storage bucket

If you created a new Cloud Storage bucket to hold sample data, delete the bucket. You can use the Cloud Storage browser, as described in the following steps, or the client-library sketch after them:

  1. Open the Cloud Storage browser.

    Open Cloud Storage

  2. In the Cloud Storage browser, select the checkbox next to the name of the bucket you created, and then click Delete.
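As an alternative to the Cloud Storage browser, the following Python sketch deletes the bucket with the google-cloud-storage client library. The bucket name is a placeholder; force=True removes the remaining sample objects before deleting the bucket and works for buckets containing up to a few hundred objects.

# A minimal sketch, assuming the google-cloud-storage library and a
# placeholder bucket name.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("your-sample-bucket-name")  # placeholder

# force=True first deletes the objects inside the bucket, then the bucket.
bucket.delete(force=True)
print(f"Deleted bucket: {bucket.name}")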

What's next
