Creating and scheduling Sensitive Data Protection inspection jobs
This topic describes in detail how to create a Sensitive Data Protection inspection job, and how to schedule recurring inspection jobs by creating a job trigger. For a quick walkthrough of how to create a new job trigger using the Sensitive Data Protection UI, see Quickstart: Creating a Sensitive Data Protection job trigger.
About inspection jobs and job triggers
When Sensitive Data Protection performs an inspection scan to identify sensitive data, each scan runs as a job. Sensitive Data Protection creates and runs a job resource whenever you tell it to inspect your Google Cloud storage repositories, including Cloud Storage buckets, BigQuery tables, Datastore kinds, and external data.
You schedule Sensitive Data Protection inspection scan jobs by creating job triggers. A job trigger automates the creation of Sensitive Data Protection jobs on a periodic basis, and can also be run on demand.
To learn more about jobs and job triggers in Sensitive Data Protection, see the Jobs and job triggers conceptual page.
Note: Canceling a job midway through still incurs costs for the portion of the job that was completed. For more information about billing, see Sensitive Data Protection pricing.
Create a new inspection job
To create a new Sensitive Data Protection inspection job:
Console
In the Sensitive Data Protection section of the Google Cloud console, go to the Create job or job trigger page.
Go to Create job or job trigger
The Create job or job trigger page contains the following sections:
Choose input data
Name
Enter a name for the job. You can use letters, numbers, and hyphens. Naming your job is optional. If you don't enter a name, Sensitive Data Protection gives the job a unique number identifier.
Location
From the Storage type menu, choose the kind of repository that stores the data you want to scan:
- Cloud Storage: Either enter the URL of the bucket you want to scan, or choose Include/exclude from the Location type menu, and then click Browse to navigate to the bucket or subfolder you want to scan. Select the Scan folder recursively checkbox to scan the specified directory and all contained directories. Leave it unselected to scan only the specified directory and no deeper.
- BigQuery: Enter the identifiers for the project, dataset, and table that you want to scan.
- Datastore: Enter the identifiers for the project, namespace (optional), and kind that you want to scan.
- Hybrid: You can add required labels, optional labels, and options for handling tabular data. For more information, see Types of metadata you can provide.
Sampling
Sampling is an optional way to save resources if you have a very large amount of data.

Note: Sampling isn't supported on jobs and job triggers that are configured to de-identify findings.

Under Sampling, you can choose whether to scan all the selected data or to sample the data by scanning a certain percentage. Sampling works differently depending on the type of storage repository you're scanning:
- For BigQuery, you can sample a subset of the total selected rows, corresponding to the percentage of rows you specify to include in the scan.
- For Cloud Storage, if any file exceeds the size specified in the Max byte size to scan per file field, Sensitive Data Protection scans it up to that maximum file size and then moves on to the next file.
To turn on sampling, choose one of the following options from the first menu:
- Start sampling from top: Sensitive Data Protection starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Sensitive Data Protection has scanned up to any specified maximum file size.
- Start sampling from random start: Sensitive Data Protection starts the partial scan at a random location within the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting only applies to files that exceed any specified maximum size. Sensitive Data Protection scans files under the maximum file size in their entirety, and scans files above the maximum file size up to the maximum.
To perform a partial scan, you must also choose what percentage of the data you want to scan. Use the slider to set the percentage.
You can also narrow the files or records to scan by date. To learn how, see Schedule, later in this topic.
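The sampling options above map to fields on the job's storage configuration in the DLP API. The following is a minimal sketch of that mapping; the bucket URL, project, dataset, and table names are placeholders, and the 10% and 1 MB values are illustrative only:

```python
# Hedged sketch of sampling settings; all resource names are placeholders.
# These dictionaries can be passed as the "storage_config" of an inspect
# job (the client libraries also accept the equivalent protos).

# Cloud Storage: cap the bytes scanned per file, sample 10% of files,
# and start each partial scan at a random location.
gcs_storage_config = {
    "cloud_storage_options": {
        "file_set": {"url": "gs://example-bucket/**"},
        "bytes_limit_per_file": 1048576,  # "Max byte size to scan per file"
        "files_limit_percent": 10,
        "sample_method": "RANDOM_START",  # or "TOP" to sample from the top
    }
}

# BigQuery: scan 10% of rows, starting at the first row.
bq_storage_config = {
    "big_query_options": {
        "table_reference": {
            "project_id": "example-project",
            "dataset_id": "example_dataset",
            "table_id": "example_table",
        },
        "rows_limit_percent": 10,
        "sample_method": "TOP",
    }
}
```

Either dictionary can be supplied as the `storage_config` key of the `inspect_job` dictionary shown in the Python sample later in this topic.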
Note: Clicking Create here creates a job that runs once, immediately. If you want to create a job trigger that runs on a periodic schedule, click Continue. You must set a schedule to create a job trigger.

Advanced configuration

When you create a job for a scan of Cloud Storage buckets or BigQuery tables, you can narrow your search by specifying an advanced configuration. Specifically, you can configure:
- Files (Cloud Storage only): The file types to scan for, which include text, binary, and image files.
- Identifying fields (BigQuery only): Unique row identifiers within the table.
Files
For files stored in Cloud Storage, you can specify the types to include in your scan under Files.

You can choose from binary, text, image, CSV, TSV, Microsoft Word, Microsoft Excel, Microsoft PowerPoint, PDF, and Apache Avro files. For an exhaustive list of file extensions that Sensitive Data Protection can scan in Cloud Storage buckets, see FileType. Choosing Binary causes Sensitive Data Protection to scan files of types that are unrecognized.
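In the API, the same restriction is expressed through the `file_types` field of `CloudStorageOptions`. A minimal sketch, with a placeholder bucket URL:

```python
# Hedged sketch; the bucket URL is a placeholder. FileType enum names
# such as TEXT_FILE, CSV, and PDF restrict which files are scanned;
# BINARY_FILE covers files of otherwise unrecognized types.
cloud_storage_options = {
    "file_set": {"url": "gs://example-bucket/**"},
    "file_types": ["TEXT_FILE", "CSV", "PDF"],
}

storage_config = {"cloud_storage_options": cloud_storage_options}
```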
Identifying fields
For tables in BigQuery, in the Identifying fields field, you can direct Sensitive Data Protection to include the values of the table's primary key columns in the results. Doing so lets you link the findings back to the table rows that contain them.

Enter the names of the columns that uniquely identify each row within the table. If necessary, use dot notation to specify nested fields. You can add as many fields as you want.

You must also turn on the Save to BigQuery action to export the findings to BigQuery. When the findings are exported to BigQuery, each finding contains the respective values of the identifying fields. For more information, see identifyingFields.
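Programmatically, identifying fields are set on `BigQueryOptions`, and the Save to BigQuery action goes under the job's `actions`. A minimal sketch with placeholder project, dataset, table, and column names:

```python
# Hedged sketch; all names are placeholders. "identifying_fields" lists
# columns that uniquely identify a row; nested fields use dot notation.
inspect_job = {
    "storage_config": {
        "big_query_options": {
            "table_reference": {
                "project_id": "example-project",
                "dataset_id": "example_dataset",
                "table_id": "example_table",
            },
            "identifying_fields": [{"name": "id"}, {"name": "user.email"}],
        }
    },
    # The Save to BigQuery action is required for identifying-field
    # values to appear alongside the findings.
    "actions": [
        {
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": "example-project",
                        "dataset_id": "findings_dataset",
                    }
                }
            }
        }
    ],
}
```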
Configure detection
The Configure detection section is where you specify the types of sensitive data you want to scan for. Completing this section is optional. If you skip this section, Sensitive Data Protection scans your data for a default set of infoTypes.
Template
You can optionally use a Sensitive Data Protection template to reuse configuration information you've specified previously.

If you have already created a template that you want to use, click in the Template name field to see a list of existing inspection templates. Choose or type the name of the template you want to use.

For more information about creating templates, see Creating Sensitive Data Protection inspection templates.
InfoTypes
InfoType detectors find sensitive data of a certain type. For example, the Sensitive Data Protection US_SOCIAL_SECURITY_NUMBER built-in infoType detector finds US Social Security numbers. In addition to the built-in infoType detectors, you can create your own custom infoType detectors.

Under InfoTypes, choose the infoType detector that corresponds to a data type you want to scan for. We don't recommend leaving this section blank. Doing so causes Sensitive Data Protection to scan your data with a default set of infoTypes, which might include infoTypes that you don't need. For more information about each detector, see InfoType detector reference.

For more information about how to manage built-in and custom infoTypes in this section, see Manage infoTypes through the Google Cloud console.
Inspection rulesets
Inspection rulesets allow you to customize both built-in and custom infoType detectors using context rules. The two types of inspection rules are:
- Exclusion rules, which help exclude false or unwanted findings.
- Hotword rules, which help detect additional findings.
To add a new ruleset, first specify one or more built-in or custom infoType detectors in the InfoTypes section. These are the infoType detectors that your rulesets will be modifying. Then, do the following:
- Click in the Choose infoTypes field. The infoType or infoTypes you specified previously appear below the field in a menu.
- Choose an infoType from the menu, and then click Add rule. A menu appears with the two options Hotword rule and Exclusion rule.
For hotword rules, choose Hotword rule. Then, do the following:
- In the Hotword field, enter a regular expression that Sensitive Data Protection should look for.
- From the Hotword proximity menu, choose whether the hotword you entered is found before or after the chosen infoType.
- In Hotword distance from infoType, enter the approximate number of characters between the hotword and the chosen infoType.
- In Confidence level adjustment, choose whether to assign matches a fixed likelihood level, or to increase or decrease the default likelihood level by a certain amount.
For exclusion rules, choose Exclusion rule. Then, do the following:
- In the Exclude field, enter a regular expression (regex) that Sensitive Data Protection should look for.
- From the Matching type menu, choose one of the following:
- Full match: The finding must completely match the regex.
- Partial match: A substring of the finding can match the regex.
- Inverse match: The finding doesn't match the regex.
You can add additional hotword or exclusion rules and rulesets to further refine your scan results.
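As a sketch of how a ruleset like this looks in an API request, the following `inspect_config` attaches one hotword rule and one exclusion rule to the built-in PERSON_NAME detector. The regex patterns and the 50-character proximity window are illustrative only:

```python
# Hedged sketch; patterns and the proximity window are illustrative.
inspect_config = {
    "info_types": [{"name": "PERSON_NAME"}],
    "rule_set": [
        {
            # The infoType detectors that this ruleset modifies.
            "info_types": [{"name": "PERSON_NAME"}],
            "rules": [
                {
                    # Raise the likelihood of findings that appear within
                    # 50 characters after the hotword "patient".
                    "hotword_rule": {
                        "hotword_regex": {"pattern": "patient"},
                        "proximity": {"window_before": 50},
                        "likelihood_adjustment": {
                            "fixed_likelihood": "VERY_LIKELY"
                        },
                    }
                },
                {
                    # Drop findings that exactly match a known test value.
                    "exclusion_rule": {
                        "regex": {"pattern": "Jane Doe"},
                        "matching_type": "MATCHING_TYPE_FULL_MATCH",
                    }
                },
            ],
        }
    ],
}
```

This dictionary can be supplied as the `inspect_config` of a job, in place of the simpler one shown in the Python sample later in this topic.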
Confidence threshold
Every time Sensitive Data Protection detects a potential match for sensitive data, it assigns it a likelihood value on a scale from "Very unlikely" to "Very likely." When you set a likelihood value here, you are instructing Sensitive Data Protection to only match on data that corresponds to that likelihood value or higher.

The default value of "Possible" is sufficient for most purposes. If you routinely get matches that are too broad, move the slider up. If you get too few matches, move the slider down.

When you're done, click Continue.
Add actions
For Add actions, select one or more actions for Sensitive Data Protection to take after the job completes. For more information, see Enable inspection or risk analysis actions.

After you select actions, click Continue.
Review
The Review section contains a JSON-formatted summary of the job settings you just specified.

Click Create to create the job (if you didn't specify a schedule) and to run the job once. The job's information page appears, which contains status and other information. If the job is currently running, you can click the Cancel button to stop it. You can also delete the job by clicking Delete.

To return to the main Sensitive Data Protection page, click the Back arrow in the Google Cloud console.
C#
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
```csharp
using System;
using System.Linq;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using static Google.Cloud.Dlp.V2.StorageConfig.Types;

public class JobsCreate
{
    public static DlpJob CreateJob(string projectId, string gcsPath)
    {
        var dlp = DlpServiceClient.Create();

        var storageConfig = new StorageConfig
        {
            CloudStorageOptions = new CloudStorageOptions
            {
                FileSet = new CloudStorageOptions.Types.FileSet() { Url = gcsPath }
            },
            TimespanConfig = new TimespanConfig
            {
                EnableAutoPopulationOfTimespanConfig = true
            }
        };

        var inspectConfig = new InspectConfig
        {
            InfoTypes =
            {
                new[] { "EMAIL_ADDRESS", "CREDIT_CARD_NUMBER" }
                    .Select(it => new InfoType() { Name = it })
            },
            IncludeQuote = true,
            MinLikelihood = Likelihood.Unlikely,
            Limits = new InspectConfig.Types.FindingLimits() { MaxFindingsPerItem = 100 }
        };

        var response = dlp.CreateDlpJob(new CreateDlpJobRequest
        {
            Parent = new LocationName(projectId, "global").ToString(),
            InspectJob = new InspectJobConfig
            {
                InspectConfig = inspectConfig,
                StorageConfig = storageConfig,
            }
        });

        Console.WriteLine($"Job: {response.Name} status: {response.State}");
        return response;
    }
}
```
Go
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
```go
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// createJob creates an inspection job
func createJob(w io.Writer, projectID, gcsPath string, infoTypeNames []string) error {
	// projectID := "my-project-id"
	// gcsPath := "gs://" + "your-bucket-name" + "path/to/file.txt"
	// infoTypeNames := []string{"EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER"}
	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}
	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the GCS file to be inspected.
	storageConfig := &dlppb.StorageConfig{
		Type: &dlppb.StorageConfig_CloudStorageOptions{
			CloudStorageOptions: &dlppb.CloudStorageOptions{
				FileSet: &dlppb.CloudStorageOptions_FileSet{
					Url: gcsPath,
				},
			},
		},
		// Set autoPopulateTimespan to true to scan only new content.
		TimespanConfig: &dlppb.StorageConfig_TimespanConfig{
			EnableAutoPopulationOfTimespanConfig: true,
		},
	}

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types.
	var infoTypes []*dlppb.InfoType
	for _, c := range infoTypeNames {
		infoTypes = append(infoTypes, &dlppb.InfoType{Name: c})
	}
	inspectConfig := &dlppb.InspectConfig{
		InfoTypes:    infoTypes,
		IncludeQuote: true,
		// The minimum likelihood required before returning a match:
		// See: https://cloud.google.com/dlp/docs/likelihood
		MinLikelihood: dlppb.Likelihood_UNLIKELY,
		// The maximum number of findings to report (0 = server maximum)
		Limits: &dlppb.InspectConfig_FindingLimits{
			MaxFindingsPerItem: 100,
		},
	}

	// Create the request.
	req := dlppb.CreateDlpJobRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Job: &dlppb.CreateDlpJobRequest_InspectJob{
			InspectJob: &dlppb.InspectJobConfig{
				InspectConfig: inspectConfig,
				StorageConfig: storageConfig,
			},
		},
	}

	// Send the request.
	response, err := client.CreateDlpJob(ctx, &req)
	if err != nil {
		return err
	}

	// Print the results.
	fmt.Fprintf(w, "Created a Dlp Job %v and Status is: %v", response.Name, response.State)
	return nil
}
```
Java
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
```java
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.Action;
import com.google.privacy.dlp.v2.CloudStorageOptions;
import com.google.privacy.dlp.v2.CreateDlpJobRequest;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.StorageConfig;
import com.google.privacy.dlp.v2.StorageConfig.TimespanConfig;
import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JobsCreate {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String gcsPath = "gs://" + "your-bucket-name" + "path/to/file.txt";
    createJobs(projectId, gcsPath);
  }

  // Creates a DLP Job
  public static void createJobs(String projectId, String gcsPath) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {

      // Set autoPopulateTimespan to true to scan only new content
      boolean autoPopulateTimespan = true;
      TimespanConfig timespanConfig =
          TimespanConfig.newBuilder()
              .setEnableAutoPopulationOfTimespanConfig(autoPopulateTimespan)
              .build();

      // Specify the GCS file to be inspected.
      CloudStorageOptions cloudStorageOptions =
          CloudStorageOptions.newBuilder()
              .setFileSet(CloudStorageOptions.FileSet.newBuilder().setUrl(gcsPath))
              .build();
      StorageConfig storageConfig =
          StorageConfig.newBuilder()
              .setCloudStorageOptions(cloudStorageOptions)
              .setTimespanConfig(timespanConfig)
              .build();

      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      List<InfoType> infoTypes =
          Stream.of("EMAIL_ADDRESS", "PERSON_NAME", "LOCATION", "PHONE_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());

      // The minimum likelihood required before returning a match:
      // See: https://cloud.google.com/dlp/docs/likelihood
      Likelihood minLikelihood = Likelihood.UNLIKELY;

      // The maximum number of findings to report (0 = server maximum)
      InspectConfig.FindingLimits findingLimits =
          InspectConfig.FindingLimits.newBuilder().setMaxFindingsPerItem(100).build();

      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addAllInfoTypes(infoTypes)
              .setIncludeQuote(true)
              .setMinLikelihood(minLikelihood)
              .setLimits(findingLimits)
              .build();

      // Specify the action that is triggered when the job completes.
      Action.PublishSummaryToCscc publishSummaryToCscc =
          Action.PublishSummaryToCscc.getDefaultInstance();
      Action action = Action.newBuilder().setPublishSummaryToCscc(publishSummaryToCscc).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .addActions(action)
              .build();

      // Construct the job creation request to be sent by the client.
      CreateDlpJobRequest createDlpJobRequest =
          CreateDlpJobRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setInspectJob(inspectJobConfig)
              .build();

      // Send the job creation request and process the response.
      DlpJob createdDlpJob = dlpServiceClient.createDlpJob(createDlpJobRequest);
      System.out.println("Job created successfully: " + createdDlpJob.getName());
    }
  }
}
```
Node.js
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
```javascript
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Initialize google DLP Client
const dlp = new DLP.DlpServiceClient();

// TODO(developer): Replace these variables before running the sample.
// const projectId = 'your-project-id';
// const cloudFileUrl = 'gs://your-bucket-name/path/to/file.txt';

async function jobsCreate() {
  // Construct cloud storage configuration
  const cloudStorageConfig = {
    cloudStorageOptions: {
      fileSet: {
        url: cloudFileUrl,
      },
    },
    timespanConfig: {
      enableAutoPopulationOfTimespanConfig: true,
    },
  };

  // Construct inspect configuration
  const inspectConfig = {
    infoTypes: [
      {name: 'EMAIL_ADDRESS'},
      {name: 'PERSON_NAME'},
      {name: 'LOCATION'},
      {name: 'PHONE_NUMBER'},
    ],
    includeQuote: true,
    minLikelihood: DLP.protos.google.privacy.dlp.v2.Likelihood.LIKELY,
    excludeInfoTypes: false,
  };

  // Construct inspect job configuration
  const inspectJob = {
    storageConfig: cloudStorageConfig,
    inspectConfig: inspectConfig,
  };

  // Combine configurations into a request for the service.
  const request = {
    parent: `projects/${projectId}/locations/global`,
    inspectJob: inspectJob,
  };

  // Send the request and receive response from the service
  const [response] = await dlp.createDlpJob(request);

  // Print the results
  console.log(`Job created successfully: ${response.name}`);
}

jobsCreate();
```
PHP
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
```php
use Google\Cloud\Dlp\V2\Action;
use Google\Cloud\Dlp\V2\Action\PublishSummaryToCscc;
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions\FileSet;
use Google\Cloud\Dlp\V2\CreateDlpJobRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\StorageConfig;
use Google\Cloud\Dlp\V2\StorageConfig\TimespanConfig;

/**
 * Creates an inspection job with the Cloud Data Loss Prevention API.
 *
 * @param string $callingProjectId The project ID to run the API call under.
 * @param string $gcsPath GCS file to be inspected. Example : gs://GOOGLE_STORAGE_BUCKET_NAME/dlp_sample.csv
 */
function create_job(
    string $callingProjectId,
    string $gcsPath
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Set autoPopulateTimespan to true to scan only new content.
    $timespanConfig = (new TimespanConfig())
        ->setEnableAutoPopulationOfTimespanConfig(true);

    // Specify the GCS file to be inspected.
    $cloudStorageOptions = (new CloudStorageOptions())
        ->setFileSet((new FileSet())
            ->setUrl($gcsPath));

    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions($cloudStorageOptions)
        ->setTimespanConfig($timespanConfig);

    // ----- Construct inspection config -----
    $emailAddressInfoType = (new InfoType())
        ->setName('EMAIL_ADDRESS');
    $personNameInfoType = (new InfoType())
        ->setName('PERSON_NAME');
    $locationInfoType = (new InfoType())
        ->setName('LOCATION');
    $phoneNumberInfoType = (new InfoType())
        ->setName('PHONE_NUMBER');
    $infoTypes = [$emailAddressInfoType, $personNameInfoType, $locationInfoType, $phoneNumberInfoType];

    // Whether to include the matching string in the response.
    $includeQuote = true;

    // The minimum likelihood required before returning a match.
    $minLikelihood = Likelihood::LIKELIHOOD_UNSPECIFIED;

    // The maximum number of findings to report (0 = server maximum).
    $limits = (new FindingLimits())
        ->setMaxFindingsPerRequest(100);

    // Create the Inspect configuration object.
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood($minLikelihood)
        ->setLimits($limits)
        ->setInfoTypes($infoTypes)
        ->setIncludeQuote($includeQuote);

    // Specify the action that is triggered when the job completes.
    $action = (new Action())
        ->setPublishSummaryToCscc(new PublishSummaryToCscc());

    // Configure the inspection job we want the service to perform.
    $inspectJobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig)
        ->setActions([$action]);

    // Send the job creation request and process the response.
    $parent = "projects/$callingProjectId/locations/global";
    $createDlpJobRequest = (new CreateDlpJobRequest())
        ->setParent($parent)
        ->setInspectJob($inspectJobConfig);
    $job = $dlp->createDlpJob($createDlpJobRequest);

    // Print results.
    printf($job->getName());
}
```
Python
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
```python
import google.cloud.dlp


def create_dlp_job(
    project: str,
    bucket: str,
    info_types: list[str],
    job_id: str = None,
    max_findings: int = 100,
    auto_populate_timespan: bool = True,
) -> None:
    """Uses the Data Loss Prevention API to create a DLP job.

    Args:
        project: The project id to use as a parent resource.
        bucket: The name of the GCS bucket to scan. This sample scans all
            files in the bucket.
        info_types: A list of strings representing info types to look for.
            A full list of info type categories can be fetched from the API.
        job_id: The id of the job. If omitted, an id will be randomly generated.
        max_findings: The maximum number of findings to report; 0 = no maximum.
        auto_populate_timespan: Automatically populates time span config start
            and end times in order to scan new content only.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries (protos are also accepted).
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary. Keys which are None may
    # optionally be omitted entirely.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": google.cloud.dlp_v2.Likelihood.UNLIKELY,
        "limits": {"max_findings_per_request": max_findings},
        "include_quote": True,
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = f"gs://{bucket}/*"
    storage_config = {
        "cloud_storage_options": {"file_set": {"url": url}},
        # Time-based configuration for each storage object.
        "timespan_config": {
            # Auto-populate start and end times in order to scan new objects
            # only.
            "enable_auto_population_of_timespan_config": auto_populate_timespan
        },
    }

    # Construct the job definition.
    job = {"inspect_config": inspect_config, "storage_config": storage_config}

    # Call the API.
    response = dlp.create_dlp_job(
        request={"parent": parent, "inspect_job": job, "job_id": job_id}
    )

    # Print out the result.
    print(f"Job : {response.name} status: {response.state}")
```
REST
A job is represented in the DLP API by the DlpJob resource. You can create a new job by using the DlpJob resource's projects.dlpJobs.create method.
This sample JSON can be sent in a POST request to the specified Sensitive Data Protection REST endpoint. This example JSON demonstrates how to create a job in Sensitive Data Protection. The job is a BigQuery inspection scan.
To quickly try this out, you can use the API Explorer that's embedded below. Keep in mind that a successful request, even one created in API Explorer, will create a job. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
JSON input:
```json
{
  "inspectJob": {
    "storageConfig": {
      "bigQueryOptions": {
        "tableReference": {
          "projectId": "bigquery-public-data",
          "datasetId": "san_francisco_sfpd_incidents",
          "tableId": "sfpd_incidents"
        }
      },
      "timespanConfig": {
        "startTime": "2020-01-01T00:00:01Z",
        "endTime": "2020-01-31T23:59:59Z",
        "timestampField": {
          "name": "timestamp"
        }
      }
    },
    "inspectConfig": {
      "infoTypes": [
        { "name": "PERSON_NAME" },
        { "name": "STREET_ADDRESS" }
      ],
      "excludeInfoTypes": false,
      "includeQuote": true,
      "minLikelihood": "LIKELY"
    },
    "actions": [
      {
        "saveFindings": {
          "outputConfig": {
            "table": {
              "projectId": "[PROJECT-ID]",
              "datasetId": "[DATASET-ID]"
            }
          }
        }
      }
    ]
  }
}
```
JSON output:
The following output indicates that the job was successfully created.
```json
{
  "name": "projects/[PROJECT-ID]/dlpJobs/[JOB-ID]",
  "type": "INSPECT_JOB",
  "state": "PENDING",
  "inspectDetails": {
    "requestedOptions": {
      "snapshotInspectTemplate": {},
      "jobConfig": {
        "storageConfig": {
          "bigQueryOptions": {
            "tableReference": {
              "projectId": "bigquery-public-data",
              "datasetId": "san_francisco_sfpd_incidents",
              "tableId": "sfpd_incidents"
            }
          },
          "timespanConfig": {
            "startTime": "2020-01-01T00:00:01Z",
            "endTime": "2020-01-31T23:59:59Z",
            "timestampField": {
              "name": "timestamp"
            }
          }
        },
        "inspectConfig": {
          "infoTypes": [
            { "name": "PERSON_NAME" },
            { "name": "STREET_ADDRESS" }
          ],
          "minLikelihood": "LIKELY",
          "limits": {},
          "includeQuote": true
        },
        "actions": [
          {
            "saveFindings": {
              "outputConfig": {
                "table": {
                  "projectId": "[PROJECT-ID]",
                  "datasetId": "[DATASET-ID]",
                  "tableId": "[TABLE-ID]"
                }
              }
            }
          }
        ]
      }
    },
    "result": {}
  },
  "createTime": "2020-07-10T07:26:33.643Z"
}
```
Create a new job trigger
To create a new Sensitive Data Protection job trigger:
Console
In the Sensitive Data Protection section of the Google Cloud console, go to the Create job or job trigger page.
Go to Create job or job trigger
The Create job or job trigger page contains the following sections:
Choose input data
Name
Enter a name for the job trigger. You can use letters, numbers, and hyphens. Naming your job trigger is optional. If you don't enter a name, Sensitive Data Protection gives the job trigger a unique number identifier.
Location
From the Storage type menu, choose the kind of repository that stores the data you want to scan:
- Cloud Storage: Either enter the URL of the bucket you want to scan, or choose Include/exclude from the Location type menu, and then click Browse to navigate to the bucket or subfolder you want to scan. Select the Scan folder recursively checkbox to scan the specified directory and all contained directories. Leave it unselected to scan only the specified directory and no deeper.
- BigQuery: Enter the identifiers for the project, dataset, and table that you want to scan.
- Datastore: Enter the identifiers for the project, namespace (optional), and kind that you want to scan.
Sampling
Sampling is an optional way to save resources if you have a very large amount of data.

Under Sampling, you can choose whether to scan all the selected data or to sample the data by scanning a certain percentage. Sampling works differently depending on the type of storage repository you're scanning:
- For BigQuery, you can sample a subset of the total selected rows, corresponding to the percentage of rows you specify to include in the scan.
- For Cloud Storage, if any file exceeds the size specified in the Max byte size to scan per file field, Sensitive Data Protection scans it up to that maximum file size and then moves on to the next file.
To turn on sampling, choose one of the following options from the first menu:
- Start sampling from top: Sensitive Data Protection starts the partial scan at the beginning of the data. For BigQuery, this starts the scan at the first row. For Cloud Storage, this starts the scan at the beginning of each file, and stops scanning once Sensitive Data Protection has scanned up to any specified maximum file size.
- Start sampling from random start: Sensitive Data Protection starts the partial scan at a random location within the data. For BigQuery, this starts the scan at a random row. For Cloud Storage, this setting only applies to files that exceed any specified maximum size. Sensitive Data Protection scans files under the maximum file size in their entirety, and scans files above the maximum file size up to the maximum.
To perform a partial scan, you must also choose what percentage of the data you want to scan. Use the slider to set the percentage.
Note: You can also narrow the files or records to scan by date. To learn how, see Schedule, later in this topic.

Note: Clicking Create here creates a job that runs once, immediately. If you want to create a job trigger that runs on a periodic schedule, click Continue. You must set a schedule to create a job trigger.

Advanced configuration

When you create a job trigger for a scan of Cloud Storage buckets or BigQuery tables, you can narrow your search by specifying an advanced configuration. Specifically, you can configure:
- Files (Cloud Storage only): The file types to scan for, which include text, binary, and image files.
- Identifying fields (BigQuery only): Unique row identifiers within the table.
Files
For files stored in Cloud Storage, you can specify the types to include in your scan under Files.
You can choose from binary, text, image, Microsoft Word, Microsoft Excel, Microsoft PowerPoint, PDF, and Apache Avro files. For an exhaustive list of file extensions that Sensitive Data Protection can scan in Cloud Storage buckets, see FileType. Choosing Binary causes Sensitive Data Protection to scan files of types that are unrecognized.
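In the API, this filter corresponds to the file_types field of CloudStorageOptions. A minimal sketch in the dict form accepted by the google-cloud-dlp client (the bucket name is a placeholder):

```python
# Restrict a Cloud Storage scan to text and image files only.
# Values come from the DLP FileType enum; the bucket is a placeholder.
storage_config = {
    "cloud_storage_options": {
        "file_set": {"url": "gs://example-bucket/**"},
        "file_types": ["TEXT_FILE", "IMAGE"],
    }
}
```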
Identifying fields
For tables in BigQuery, in the Identifying fields field, you can direct Sensitive Data Protection to include the values of the table's primary key columns in the results. Doing so lets you link the findings back to the table rows that contain them.
Enter the names of the columns that uniquely identify each row within the table. If necessary, use dot notation to specify nested fields. You can add as many fields as you want.
You must also turn on the Save to BigQuery action to export the findings to BigQuery. When the findings are exported to BigQuery, each finding contains the respective values of the identifying fields. For more information, see identifyingFields.
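In API terms, identifying fields are set on BigQueryOptions, and the Save to BigQuery action is the saveFindings action on the job. A sketch of the two pieces together, with placeholder project, dataset, table, and column names:

```python
# An InspectJobConfig fragment: identify each finding's source row by the
# "user_id" column (placeholder name), and export findings to BigQuery so
# that the identifying values are included with each finding.
inspect_job = {
    "storage_config": {
        "big_query_options": {
            "table_reference": {
                "project_id": "example-project",
                "dataset_id": "example_dataset",
                "table_id": "users",
            },
            # Columns that uniquely identify a row; dot notation works for
            # nested fields, e.g. {"name": "record.id"}.
            "identifying_fields": [{"name": "user_id"}],
        }
    },
    "actions": [
        {
            "save_findings": {
                "output_config": {
                    "table": {
                        "project_id": "example-project",
                        "dataset_id": "example_dataset",
                        "table_id": "dlp_findings",
                    }
                }
            }
        }
    ],
}
```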
Configure detection
The Configure detection section is where you specify the types of sensitive data you want to scan for. Completing this section is optional. If you skip this section, Sensitive Data Protection will scan your data for a default set of infoTypes.
Template
You can optionally use a Sensitive Data Protection template to reuse configuration information you've specified previously.
If you have already created a template that you want to use, click in the Template name field to see a list of existing inspection templates. Choose or type the name of the template you want to use.
For more information about creating templates, see Creating Sensitive Data Protection inspection templates.
InfoTypes
InfoType detectors find sensitive data of a certain type. For example, the Sensitive Data Protection US_SOCIAL_SECURITY_NUMBER built-in infoType detector finds US Social Security numbers. In addition to the built-in infoType detectors, you can create your own custom infoType detectors.
Under InfoTypes, choose the infoType detector that corresponds to a data type you want to scan for. You can also leave this field blank to scan for all default infoTypes. More information about each detector is provided in the InfoType detector reference.
You can also add custom infoType detectors in the Custom infoTypes section, and customize both built-in and custom infoType detectors in the Inspection rulesets section.
Custom infoTypes
To add custom infoTypes, see Add custom infoTypes.
Inspection rulesets
Inspection rulesets allow you to customize both built-in and custom infoType detectors using context rules. The two types of inspection rules are:
- Exclusion rules, which help exclude false or unwanted findings.
- Hotword rules, which help detect additional findings.
To add a new ruleset, first specify one or more built-in or custom infoType detectors in the InfoTypes section. These are the infoType detectors that your rulesets will modify. Then, do the following:
- Click in the Choose infoTypes field. The infoType or infoTypes you specified previously appear below the field in a menu.
- Choose an infoType from the menu, and then click Add rule. A menu appears with the two options Hotword rule and Exclusion rule.
For hotword rules, choose Hotword rules. Then, do the following:
- In the Hotword field, enter a regular expression that Sensitive Data Protection should look for.
- From the Hotword proximity menu, choose whether the hotword you entered is found before or after the chosen infoType.
- In Hotword distance from infoType, enter the approximate number of characters between the hotword and the chosen infoType.
- In Confidence level adjustment, choose whether to assign matches a fixed likelihood level, or to increase or decrease the default likelihood level by a certain amount.
For exclusion rules, choose Exclusion rules. Then, do the following:
- In the Exclude field, enter a regular expression (regex) that Sensitive Data Protection should look for.
- From the Matching type menu, choose one of the following:
- Full match: The finding must completely match the regex.
- Partial match: A substring of the finding can match the regex.
- Inverse match: The finding doesn't match the regex.
You can add additional hotword or exclusion rules and rulesets to further refine your scan results.
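Programmatically, this UI section corresponds to the rule_set field of InspectConfig. The sketch below attaches one hotword rule and one exclusion rule to the PERSON_NAME detector, in the dict form accepted by the google-cloud-dlp client; the regex patterns and distances are illustrative only:

```python
# An InspectConfig fragment with one ruleset modifying PERSON_NAME findings.
inspect_config = {
    "info_types": [{"name": "PERSON_NAME"}],
    "rule_set": [
        {
            "info_types": [{"name": "PERSON_NAME"}],
            "rules": [
                {
                    # Hotword rule: raise the likelihood when "patient"
                    # appears within 50 characters before the finding.
                    "hotword_rule": {
                        "hotword_regex": {"pattern": "patient"},
                        "proximity": {"window_before": 50},
                        "likelihood_adjustment": {
                            "fixed_likelihood": "VERY_LIKELY"
                        },
                    }
                },
                {
                    # Exclusion rule: drop findings that fully match a
                    # known placeholder name.
                    "exclusion_rule": {
                        "regex": {"pattern": "Jane Doe"},
                        "matching_type": "MATCHING_TYPE_FULL_MATCH",
                    }
                },
            ],
        }
    ],
}
```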
Confidence threshold
Every time Sensitive Data Protection detects a potential match for sensitive data, it assigns it a likelihood value on a scale from "Very unlikely" to "Very likely." When you set a likelihood value here, you are instructing Sensitive Data Protection to only match on data that corresponds to that likelihood value or higher.
The default value of "Possible" is sufficient for most purposes. If you routinely get matches that are too broad, move the slider up. If you get too few matches, move the slider down.
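In the API, this slider corresponds to the min_likelihood field of InspectConfig. A minimal sketch:

```python
# Only report findings rated "Possible" or higher (the console default).
inspect_config = {
    "info_types": [{"name": "EMAIL_ADDRESS"}],
    "min_likelihood": "POSSIBLE",
}
```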
When you're done, click Continue.
Add actions
For Add actions, select one or more actions for Sensitive Data Protection to take after the job completes. For more information, see Enable inspection or risk analysis actions.
After you select actions, click Continue.
Schedule
In the Schedule section, you can do two things:
- Specify time span: This option limits the files or rows to scan by date. Click Start time to specify the earliest file timestamp to include. Leave this value blank to specify all files. Click End time to specify the latest file timestamp to include. Leave this value blank to specify no upper timestamp limit.
- Create a trigger to run the job on a periodic schedule: This option turns the job into a job trigger that runs on a periodic schedule. If you don't specify a schedule, you effectively create a single job that starts immediately and runs once. To create a job trigger that runs regularly, you must set this option.
The default value is also the minimum value: 24 hours. The maximum value is 60 days.
If you want Sensitive Data Protection to scan only new files or rows, select Limit scans only to new content. For BigQuery inspection, only rows that are at least three hours old are included in the scan. See the known issue related to this operation.
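In the API, these options map to the trigger's Schedule and to the TimespanConfig on the job's storage configuration. A sketch of both, using a 24-hour period (the minimum; the maximum is 60 days):

```python
# Recurrence period: minimum 24 hours (86400s), maximum 60 days.
scan_period_days = 1
schedule = {
    "recurrence_period_duration": {"seconds": scan_period_days * 24 * 60 * 60}
}

# Limit each run to content added since the previous run, the API
# equivalent of "Limit scans only to new content".
timespan_config = {
    "enable_auto_population_of_timespan_config": True
}
```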
Review
The Review section contains a JSON-formatted summary of the job settings you just specified.
Click Create to create the job trigger (if you specified a schedule). The job trigger's information page appears, which contains status and other information. If the job is currently running, you can click the Cancel button to stop it. You can also delete the job trigger by clicking Delete.
To return to the main Sensitive Data Protection page, click the Back arrow in the Google Cloud console.
C#
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using System;
using System.Collections.Generic;
using static Google.Cloud.Dlp.V2.CloudStorageOptions.Types;
using static Google.Cloud.Dlp.V2.InspectConfig.Types;
using static Google.Cloud.Dlp.V2.JobTrigger.Types;
using static Google.Cloud.Dlp.V2.StorageConfig.Types;

public class TriggersCreate
{
    public static JobTrigger Create(
        string projectId,
        string bucketName,
        Likelihood minLikelihood,
        int maxFindings,
        bool autoPopulateTimespan,
        int scanPeriod,
        IEnumerable<InfoType> infoTypes,
        string triggerId,
        string displayName,
        string description)
    {
        var dlp = DlpServiceClient.Create();

        var jobConfig = new InspectJobConfig
        {
            InspectConfig = new InspectConfig
            {
                MinLikelihood = minLikelihood,
                Limits = new FindingLimits
                {
                    MaxFindingsPerRequest = maxFindings
                },
                InfoTypes = { infoTypes }
            },
            StorageConfig = new StorageConfig
            {
                CloudStorageOptions = new CloudStorageOptions
                {
                    FileSet = new FileSet { Url = $"gs://{bucketName}/*" }
                },
                TimespanConfig = new TimespanConfig
                {
                    EnableAutoPopulationOfTimespanConfig = autoPopulateTimespan
                }
            }
        };

        var jobTrigger = new JobTrigger
        {
            Triggers =
            {
                new Trigger
                {
                    Schedule = new Schedule
                    {
                        RecurrencePeriodDuration = new Google.Protobuf.WellKnownTypes.Duration
                        {
                            Seconds = scanPeriod * 60 * 60 * 24
                        }
                    }
                }
            },
            InspectJob = jobConfig,
            Status = Status.Healthy,
            DisplayName = displayName,
            Description = description
        };

        var response = dlp.CreateJobTrigger(new CreateJobTriggerRequest
        {
            Parent = new LocationName(projectId, "global").ToString(),
            JobTrigger = jobTrigger,
            TriggerId = triggerId
        });

        Console.WriteLine($"Successfully created trigger {response.Name}");
        return response;
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"github.com/golang/protobuf/ptypes/duration"
)

// createTrigger creates a trigger with the given configuration.
func createTrigger(w io.Writer, projectID string, triggerID, displayName, description, bucketName string, infoTypeNames []string) error {
	// projectID := "my-project-id"
	// triggerID := "my-trigger"
	// displayName := "My Trigger"
	// description := "My trigger description"
	// bucketName := "my-bucket"
	// infoTypeNames := []string{"US_SOCIAL_SECURITY_NUMBER"}
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	// Convert the info type strings to a list of InfoTypes.
	var infoTypes []*dlppb.InfoType
	for _, it := range infoTypeNames {
		infoTypes = append(infoTypes, &dlppb.InfoType{Name: it})
	}

	// Create a configured request.
	req := &dlppb.CreateJobTriggerRequest{
		Parent:    fmt.Sprintf("projects/%s/locations/global", projectID),
		TriggerId: triggerID,
		JobTrigger: &dlppb.JobTrigger{
			DisplayName: displayName,
			Description: description,
			Status:      dlppb.JobTrigger_HEALTHY,
			// Triggers control when the job will start.
			Triggers: []*dlppb.JobTrigger_Trigger{
				{
					Trigger: &dlppb.JobTrigger_Trigger_Schedule{
						Schedule: &dlppb.Schedule{
							Option: &dlppb.Schedule_RecurrencePeriodDuration{
								RecurrencePeriodDuration: &duration.Duration{
									Seconds: 10 * 60 * 60 * 24, // 10 days in seconds.
								},
							},
						},
					},
				},
			},
			// Job configures the job to run when the trigger runs.
			Job: &dlppb.JobTrigger_InspectJob{
				InspectJob: &dlppb.InspectJobConfig{
					InspectConfig: &dlppb.InspectConfig{
						InfoTypes:     infoTypes,
						MinLikelihood: dlppb.Likelihood_POSSIBLE,
						Limits: &dlppb.InspectConfig_FindingLimits{
							MaxFindingsPerRequest: 10,
						},
					},
					StorageConfig: &dlppb.StorageConfig{
						Type: &dlppb.StorageConfig_CloudStorageOptions{
							CloudStorageOptions: &dlppb.CloudStorageOptions{
								FileSet: &dlppb.CloudStorageOptions_FileSet{
									Url: "gs://" + bucketName + "/*",
								},
							},
						},
						// Time-based configuration for each storage object. See more at
						// https://cloud.google.com/dlp/docs/reference/rest/v2/InspectJobConfig#TimespanConfig
						TimespanConfig: &dlppb.StorageConfig_TimespanConfig{
							// Auto-populate start and end times in order to scan new objects only.
							EnableAutoPopulationOfTimespanConfig: true,
						},
					},
				},
			},
		},
	}

	// Send the request.
	resp, err := client.CreateJobTrigger(ctx, req)
	if err != nil {
		return fmt.Errorf("CreateJobTrigger: %w", err)
	}
	fmt.Fprintf(w, "Successfully created trigger: %v", resp.GetName())
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.CloudStorageOptions;
import com.google.privacy.dlp.v2.CreateJobTriggerRequest;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.JobTrigger;
import com.google.privacy.dlp.v2.LocationName;
import com.google.privacy.dlp.v2.Schedule;
import com.google.privacy.dlp.v2.StorageConfig;
import com.google.privacy.dlp.v2.StorageConfig.TimespanConfig;
import com.google.protobuf.Duration;
import java.io.IOException;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class TriggersCreate {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String gcsPath = "gs://" + "your-bucket-name" + "path/to/file.txt";
    createTrigger(projectId, gcsPath);
  }

  public static void createTrigger(String projectId, String gcsPath) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Set autoPopulateTimespan to true to scan only new content
      boolean autoPopulateTimespan = true;
      TimespanConfig timespanConfig =
          TimespanConfig.newBuilder()
              .setEnableAutoPopulationOfTimespanConfig(autoPopulateTimespan)
              .build();

      // Specify the GCS file to be inspected.
      CloudStorageOptions cloudStorageOptions =
          CloudStorageOptions.newBuilder()
              .setFileSet(CloudStorageOptions.FileSet.newBuilder().setUrl(gcsPath))
              .build();
      StorageConfig storageConfig =
          StorageConfig.newBuilder()
              .setCloudStorageOptions(cloudStorageOptions)
              .setTimespanConfig(timespanConfig)
              .build();

      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      List<InfoType> infoTypes =
          Stream.of("PHONE_NUMBER", "EMAIL_ADDRESS", "CREDIT_CARD_NUMBER")
              .map(it -> InfoType.newBuilder().setName(it).build())
              .collect(Collectors.toList());
      InspectConfig inspectConfig = InspectConfig.newBuilder().addAllInfoTypes(infoTypes).build();

      // Configure the inspection job we want the service to perform.
      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder()
              .setInspectConfig(inspectConfig)
              .setStorageConfig(storageConfig)
              .build();

      // Set scanPeriod to the number of days between scans (minimum: 1 day)
      int scanPeriod = 1;

      // Optionally set a display name of max 100 chars and a description of max 250 chars
      String displayName = "Daily Scan";
      String description = "A daily inspection for personally identifiable information.";

      // Schedule scan of GCS bucket every scanPeriod number of days (minimum = 1 day)
      Duration duration = Duration.newBuilder().setSeconds(scanPeriod * 24 * 3600).build();
      Schedule schedule = Schedule.newBuilder().setRecurrencePeriodDuration(duration).build();
      JobTrigger.Trigger trigger = JobTrigger.Trigger.newBuilder().setSchedule(schedule).build();
      JobTrigger jobTrigger =
          JobTrigger.newBuilder()
              .setInspectJob(inspectJobConfig)
              .setDisplayName(displayName)
              .setDescription(description)
              .setStatus(JobTrigger.Status.HEALTHY)
              .addTriggers(trigger)
              .build();

      // Create scan request to be sent by client
      CreateJobTriggerRequest createJobTriggerRequest =
          CreateJobTriggerRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setJobTrigger(jobTrigger)
              .build();

      // Send the scan request and process the response
      JobTrigger createdJobTrigger = dlpServiceClient.createJobTrigger(createJobTriggerRequest);

      System.out.println("Created Trigger: " + createdJobTrigger.getName());
      System.out.println("Display Name: " + createdJobTrigger.getDisplayName());
      System.out.println("Description: " + createdJobTrigger.getDescription());
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// (Optional) The name of the trigger to be created.
// const triggerId = 'my-trigger';

// (Optional) A display name for the trigger to be created
// const displayName = 'My Trigger';

// (Optional) A description for the trigger to be created
// const description = "This is a sample trigger.";

// The name of the bucket to scan.
// const bucketName = 'YOUR-BUCKET';

// Limit scan to new content only.
// const autoPopulateTimespan = true;

// How often to wait between scans, in days (minimum = 1 day)
// const scanPeriod = 1;

// The infoTypes of information to match
// const infoTypes = [{ name: 'PHONE_NUMBER' }, { name: 'EMAIL_ADDRESS' }, { name: 'CREDIT_CARD_NUMBER' }];

// The minimum likelihood required before returning a match
// const minLikelihood = 'LIKELIHOOD_UNSPECIFIED';

// The maximum number of findings to report per request (0 = server maximum)
// const maxFindings = 0;

async function createTrigger() {
  // Get reference to the bucket to be inspected
  const storageItem = {
    cloudStorageOptions: {
      fileSet: {url: `gs://${bucketName}/*`},
    },
    timeSpanConfig: {
      enableAutoPopulationOfTimespanConfig: autoPopulateTimespan,
    },
  };

  // Construct job to be triggered
  const job = {
    inspectConfig: {
      infoTypes: infoTypes,
      minLikelihood: minLikelihood,
      limits: {
        maxFindingsPerRequest: maxFindings,
      },
    },
    storageConfig: storageItem,
  };

  // Construct trigger creation request
  const request = {
    parent: `projects/${projectId}/locations/global`,
    jobTrigger: {
      inspectJob: job,
      displayName: displayName,
      description: description,
      triggers: [
        {
          schedule: {
            recurrencePeriodDuration: {
              seconds: scanPeriod * 60 * 60 * 24, // Trigger the scan daily
            },
          },
        },
      ],
      status: 'HEALTHY',
    },
    triggerId: triggerId,
  };

  // Run trigger creation request
  const [trigger] = await dlp.createJobTrigger(request);
  console.log(`Successfully created trigger ${trigger.name}.`);
}

createTrigger();
PHP
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\CloudStorageOptions;
use Google\Cloud\Dlp\V2\CloudStorageOptions\FileSet;
use Google\Cloud\Dlp\V2\CreateJobTriggerRequest;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectConfig\FindingLimits;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\JobTrigger;
use Google\Cloud\Dlp\V2\JobTrigger\Status;
use Google\Cloud\Dlp\V2\JobTrigger\Trigger;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\Schedule;
use Google\Cloud\Dlp\V2\StorageConfig;
use Google\Cloud\Dlp\V2\StorageConfig\TimespanConfig;
use Google\Protobuf\Duration;

/**
 * Create a Data Loss Prevention API job trigger.
 *
 * @param string $callingProjectId The project ID to run the API call under
 * @param string $bucketName The name of the bucket to scan
 * @param string $triggerId (Optional) The name of the trigger to be created
 * @param string $displayName (Optional) The human-readable name to give the trigger
 * @param string $description (Optional) A description for the trigger to be created
 * @param int $scanPeriod (Optional) How often to wait between scans, in days (minimum = 1 day)
 * @param bool $autoPopulateTimespan (Optional) Automatically limit scan to new content only
 * @param int $maxFindings (Optional) The maximum number of findings to report per request (0 = server maximum)
 */
function create_trigger(
    string $callingProjectId,
    string $bucketName,
    string $triggerId,
    string $displayName,
    string $description,
    int $scanPeriod,
    bool $autoPopulateTimespan,
    int $maxFindings
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // ----- Construct job config -----
    // The infoTypes of information to match
    $personNameInfoType = (new InfoType())
        ->setName('PERSON_NAME');
    $phoneNumberInfoType = (new InfoType())
        ->setName('PHONE_NUMBER');
    $infoTypes = [$personNameInfoType, $phoneNumberInfoType];

    // The minimum likelihood required before returning a match
    $minLikelihood = Likelihood::LIKELIHOOD_UNSPECIFIED;

    // Specify finding limits
    $limits = (new FindingLimits())
        ->setMaxFindingsPerRequest($maxFindings);

    // Create the inspectConfig object
    $inspectConfig = (new InspectConfig())
        ->setMinLikelihood($minLikelihood)
        ->setLimits($limits)
        ->setInfoTypes($infoTypes);

    // Create triggers
    $duration = (new Duration())
        ->setSeconds($scanPeriod * 60 * 60 * 24);
    $schedule = (new Schedule())
        ->setRecurrencePeriodDuration($duration);
    $triggerObject = (new Trigger())
        ->setSchedule($schedule);

    // Create the storageConfig object
    $fileSet = (new FileSet())
        ->setUrl('gs://' . $bucketName . '/*');
    $storageOptions = (new CloudStorageOptions())
        ->setFileSet($fileSet);

    // Auto-populate start and end times in order to scan new objects only.
    $timespanConfig = (new TimespanConfig())
        ->setEnableAutoPopulationOfTimespanConfig($autoPopulateTimespan);
    $storageConfig = (new StorageConfig())
        ->setCloudStorageOptions($storageOptions)
        ->setTimespanConfig($timespanConfig);

    // Construct the jobConfig object
    $jobConfig = (new InspectJobConfig())
        ->setInspectConfig($inspectConfig)
        ->setStorageConfig($storageConfig);

    // ----- Construct trigger object -----
    $jobTriggerObject = (new JobTrigger())
        ->setTriggers([$triggerObject])
        ->setInspectJob($jobConfig)
        ->setStatus(Status::HEALTHY)
        ->setDisplayName($displayName)
        ->setDescription($description);

    // Run trigger creation request
    $parent = $dlp->locationName($callingProjectId, 'global');
    $createJobTriggerRequest = (new CreateJobTriggerRequest())
        ->setParent($parent)
        ->setJobTrigger($jobTriggerObject)
        ->setTriggerId($triggerId);
    $trigger = $dlp->createJobTrigger($createJobTriggerRequest);

    // Print results
    printf('Successfully created trigger %s' . PHP_EOL, $trigger->getName());
}
Python
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
from typing import List, Optional

import google.cloud.dlp


def create_trigger(
    project: str,
    bucket: str,
    scan_period_days: int,
    info_types: List[str],
    trigger_id: Optional[str] = None,
    display_name: Optional[str] = None,
    description: Optional[str] = None,
    min_likelihood: Optional[int] = None,
    max_findings: Optional[int] = None,
    auto_populate_timespan: Optional[bool] = False,
) -> None:
    """Creates a scheduled Data Loss Prevention API inspect_content trigger.

    Args:
        project: The Google Cloud project id to use as a parent resource.
        bucket: The name of the GCS bucket to scan. This sample scans all
            files in the bucket using a wildcard.
        scan_period_days: How often to repeat the scan, in days. The minimum
            is 1 day.
        info_types: A list of strings representing info types to look for. A
            full list of info type categories can be fetched from the API.
        trigger_id: The id of the trigger. If omitted, an id will be randomly
            generated.
        display_name: The optional display name of the trigger.
        description: The optional description of the trigger.
        min_likelihood: A string representing the minimum likelihood threshold
            that constitutes a match. One of: 'LIKELIHOOD_UNSPECIFIED',
            'VERY_UNLIKELY', 'UNLIKELY', 'POSSIBLE', 'LIKELY', 'VERY_LIKELY'.
        max_findings: The maximum number of findings to report; 0 = no maximum.
        auto_populate_timespan: Automatically populates time span config start
            and end times in order to scan new content only.

    Returns:
        None; the response from the API is printed to the terminal.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries (protos are also accepted).
    info_types = [{"name": info_type} for info_type in info_types]

    # Construct the configuration dictionary. Keys which are None may
    # optionally be omitted entirely.
    inspect_config = {
        "info_types": info_types,
        "min_likelihood": min_likelihood,
        "limits": {"max_findings_per_request": max_findings},
    }

    # Construct a cloud_storage_options dictionary with the bucket's URL.
    url = f"gs://{bucket}/*"
    storage_config = {
        "cloud_storage_options": {"file_set": {"url": url}},
        # Time-based configuration for each storage object.
        "timespan_config": {
            # Auto-populate start and end times in order to scan new objects
            # only.
            "enable_auto_population_of_timespan_config": auto_populate_timespan
        },
    }

    # Construct the job definition.
    job = {"inspect_config": inspect_config, "storage_config": storage_config}

    # Construct the schedule definition:
    schedule = {
        "recurrence_period_duration": {"seconds": scan_period_days * 60 * 60 * 24}
    }

    # Construct the trigger definition.
    job_trigger = {
        "inspect_job": job,
        "display_name": display_name,
        "description": description,
        "triggers": [{"schedule": schedule}],
        "status": google.cloud.dlp_v2.JobTrigger.Status.HEALTHY,
    }

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.create_job_trigger(
        request={"parent": parent, "job_trigger": job_trigger, "trigger_id": trigger_id}
    )

    print(f"Successfully created trigger {response.name}")
REST
A job trigger is represented in the DLP API by the JobTrigger resource. You can create a new job trigger by using the JobTrigger resource's projects.jobTriggers.create method.
This sample JSON can be sent in a POST request to the specified Sensitive Data Protection REST endpoint. This example JSON demonstrates how to create a job trigger in Sensitive Data Protection. The job that this trigger will kick off is a Datastore inspection scan. The job trigger that is created runs every 86,400 seconds (or 24 hours).
To quickly try this out, you can use the API Explorer that's embedded below. Keep in mind that a successful request, even one created in API Explorer, will create a new scheduled job trigger. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
JSON input:
{
  "jobTrigger": {
    "displayName": "JobTrigger1",
    "description": "Starts an inspection of a Datastore kind",
    "triggers": [
      {
        "schedule": {
          "recurrencePeriodDuration": "86400s"
        }
      }
    ],
    "status": "HEALTHY",
    "inspectJob": {
      "storageConfig": {
        "datastoreOptions": {
          "kind": {
            "name": "Example-Kind"
          },
          "partitionId": {
            "projectId": "[PROJECT_ID]",
            "namespaceId": "[NAMESPACE_ID]"
          }
        }
      },
      "inspectConfig": {
        "infoTypes": [
          {
            "name": "PHONE_NUMBER"
          }
        ],
        "excludeInfoTypes": false,
        "includeQuote": true,
        "minLikelihood": "LIKELY"
      },
      "actions": [
        {
          "saveFindings": {
            "outputConfig": {
              "table": {
                "projectId": "[PROJECT_ID]",
                "datasetId": "[BIGQUERY_DATASET_NAME]",
                "tableId": "[BIGQUERY_TABLE_NAME]"
              }
            }
          }
        }
      ]
    }
  }
}
JSON output:
The following output indicates that the job trigger was successfully created.
{
  "name": "projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "displayName": "JobTrigger1",
  "description": "Starts an inspection of a Datastore kind",
  "inspectJob": {
    "storageConfig": {
      "datastoreOptions": {
        "partitionId": {
          "projectId": "[PROJECT_ID]",
          "namespaceId": "[NAMESPACE_ID]"
        },
        "kind": {
          "name": "Example-Kind"
        }
      }
    },
    "inspectConfig": {
      "infoTypes": [
        {
          "name": "PHONE_NUMBER"
        }
      ],
      "minLikelihood": "LIKELY",
      "limits": {},
      "includeQuote": true
    },
    "actions": [
      {
        "saveFindings": {
          "outputConfig": {
            "table": {
              "projectId": "[PROJECT_ID]",
              "datasetId": "[BIGQUERY_DATASET_NAME]",
              "tableId": "[BIGQUERY_TABLE_NAME]"
            }
          }
        }
      }
    ]
  },
  "triggers": [
    {
      "schedule": {
        "recurrencePeriodDuration": "86400s"
      }
    }
  ],
  "createTime": "2018-11-30T01:52:41.171857Z",
  "updateTime": "2018-11-30T01:52:41.171857Z",
  "status": "HEALTHY"
}
List all jobs
To list all jobs for the current project:
Console
In the Google Cloud console, go to the Sensitive Data Protection page.
Click the Inspection tab, and then click the Inspect jobs subtab.
The console displays a list of all jobs for the current project, including their job identifiers, state, creation time, and end time. You can get more information about any job, including a summary of its results, by clicking its identifier.
C#
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
using Google.Api.Gax;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;

public class JobsList
{
    public static PagedEnumerable<ListDlpJobsResponse, DlpJob> ListDlpJobs(string projectId, string filter, DlpJobType jobType)
    {
        var dlp = DlpServiceClient.Create();

        var response = dlp.ListDlpJobs(new ListDlpJobsRequest
        {
            Parent = new LocationName(projectId, "global").ToString(),
            Filter = filter,
            Type = jobType
        });

        // Uncomment to print jobs
        // foreach (var job in response)
        // {
        //     Console.WriteLine($"Job: {job.Name} status: {job.State}");
        // }

        return response;
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"google.golang.org/api/iterator"
)

// listJobs lists jobs matching the given optional filter and optional jobType.
func listJobs(w io.Writer, projectID, filter, jobType string) error {
	// projectID := "my-project-id"
	// filter := "`state` = FINISHED"
	// jobType := "RISK_ANALYSIS_JOB"
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	// Create a configured request.
	req := &dlppb.ListDlpJobsRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
		Filter: filter,
		Type:   dlppb.DlpJobType(dlppb.DlpJobType_value[jobType]),
	}

	// Send the request and iterate over the results.
	it := client.ListDlpJobs(ctx, req)
	for {
		j, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return fmt.Errorf("Next: %w", err)
		}
		fmt.Fprintf(w, "Job %v status: %v\n", j.GetName(), j.GetState())
	}
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DlpJob;
import com.google.privacy.dlp.v2.DlpJobType;
import com.google.privacy.dlp.v2.ListDlpJobsRequest;
import com.google.privacy.dlp.v2.LocationName;
import java.io.IOException;

public class JobsList {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    listJobs(projectId);
  }

  // Lists DLP jobs
  public static void listJobs(String projectId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Construct the request to be sent by the client.
      // For more info on filters and job types,
      // see https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs/list
      ListDlpJobsRequest listDlpJobsRequest =
          ListDlpJobsRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .setFilter("state=DONE")
              .setType(DlpJobType.valueOf("INSPECT_JOB"))
              .build();

      // Send the request to list jobs and process the response
      DlpServiceClient.ListDlpJobsPagedResponse response =
          dlpServiceClient.listDlpJobs(listDlpJobsRequest);

      System.out.println("DLP jobs found:");
      for (DlpJob dlpJob : response.getPage().getValues()) {
        System.out.println(dlpJob.getName() + " -- " + dlpJob.getState());
      }
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The filter expression to use
// For more information and filter syntax, see https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs/list
// const filter = `state=DONE`;

// The type of job to list (either 'INSPECT_JOB' or 'RISK_ANALYSIS_JOB')
// const jobType = 'INSPECT_JOB';

async function listJobs() {
  // Construct request for listing DLP scan jobs
  const request = {
    parent: `projects/${projectId}/locations/global`,
    filter: filter,
    type: jobType,
  };

  // Run job-listing request
  const [jobs] = await dlp.listDlpJobs(request);
  jobs.forEach(job => {
    console.log(`Job ${job.name} status: ${job.state}`);
  });
}

listJobs();
PHP
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\DlpJob\JobState;
use Google\Cloud\Dlp\V2\DlpJobType;
use Google\Cloud\Dlp\V2\ListJobsRequest;

/**
 * List Data Loss Prevention API jobs corresponding to a given filter.
 *
 * @param string $callingProjectId  The project ID to run the API call under
 * @param string $filter            The filter expression to use
 */
function list_jobs(string $callingProjectId, string $filter): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // The type of job to list (either 'INSPECT_JOB' or 'REDACT_JOB')
    $jobType = DlpJobType::INSPECT_JOB;

    // Run job-listing request
    // For more information and filter syntax,
    // @see https://cloud.google.com/dlp/docs/reference/rest/v2/projects.dlpJobs/list
    $parent = "projects/$callingProjectId/locations/global";
    $listDlpJobsRequest = (new ListDlpJobsRequest())
        ->setParent($parent)
        ->setFilter($filter)
        ->setType($jobType);
    $response = $dlp->listDlpJobs($listDlpJobsRequest);

    // Print job list
    $jobs = $response->iterateAllElements();
    foreach ($jobs as $job) {
        printf('Job %s status: %s' . PHP_EOL, $job->getName(), $job->getState());
        $infoTypeStats = $job->getInspectDetails()->getResult()->getInfoTypeStats();

        if ($job->getState() == JobState::DONE) {
            if (count($infoTypeStats) > 0) {
                foreach ($infoTypeStats as $infoTypeStat) {
                    printf(
                        '  Found %s instance(s) of type %s' . PHP_EOL,
                        $infoTypeStat->getCount(),
                        $infoTypeStat->getInfoType()->getName()
                    );
                }
            } else {
                print('  No findings.' . PHP_EOL);
            }
        }
    }
}
Python
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
from typing import Optional

import google.cloud.dlp


def list_dlp_jobs(
    project: str, filter_string: Optional[str] = None, job_type: Optional[str] = None
) -> None:
    """Uses the Data Loss Prevention API to list DLP jobs that match the
    specified filter in the request.
    Args:
        project: The project id to use as a parent resource.
        filter: (Optional) Allows filtering.
            Supported syntax:
            * Filter expressions are made up of one or more restrictions.
            * Restrictions can be combined by 'AND' or 'OR' logical operators.
              A sequence of restrictions implicitly uses 'AND'.
            * A restriction has the form of '<field> <operator> <value>'.
            * Supported fields/values for inspect jobs:
                - `state` - PENDING|RUNNING|CANCELED|FINISHED|FAILED
                - `inspected_storage` - DATASTORE|CLOUD_STORAGE|BIGQUERY
                - `trigger_name` - The resource name of the trigger that
                                   created the job.
            * Supported fields for risk analysis jobs:
                - `state` - RUNNING|CANCELED|FINISHED|FAILED
            * The operator must be '=' or '!='.
            Examples:
            * inspected_storage = cloud_storage AND state = done
            * inspected_storage = cloud_storage OR inspected_storage = bigquery
            * inspected_storage = cloud_storage AND
              (state = done OR state = canceled)
        type: (Optional) The type of job. Defaults to 'INSPECT'.
            Choices:
            DLP_JOB_TYPE_UNSPECIFIED
            INSPECT_JOB: The job inspected content for sensitive data.
            RISK_ANALYSIS_JOB: The job executed a Risk Analysis computation.
    Returns:
        None; the response from the API is printed to the terminal.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Job type dictionary
    job_type_to_int = {
        "DLP_JOB_TYPE_UNSPECIFIED": google.cloud.dlp.DlpJobType.DLP_JOB_TYPE_UNSPECIFIED,
        "INSPECT_JOB": google.cloud.dlp.DlpJobType.INSPECT_JOB,
        "RISK_ANALYSIS_JOB": google.cloud.dlp.DlpJobType.RISK_ANALYSIS_JOB,
    }
    # If job type is specified, convert job type to number through enums.
    if job_type:
        job_type = job_type_to_int[job_type]

    # Call the API to get a list of jobs.
    response = dlp.list_dlp_jobs(
        request={"parent": parent, "filter": filter_string, "type_": job_type}
    )

    # Iterate over results.
    for job in response:
        print(f"Job: {job.name}; status: {job.state.name}")
REST
The DlpJob resource has a projects.dlpJobs.list method, which you can use to list all jobs.
To list all jobs currently defined in your project, send a GET request to the dlpJobs endpoint, as shown here:
URL:
GET https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/dlpJobs?key={YOUR_API_KEY}

The following JSON output lists one of the jobs returned. Note that the structure of the job mirrors that of the DlpJob resource.
JSON output:
{
  "jobs": [
    {
      "name": "projects/[PROJECT-ID]/dlpJobs/i-5270277269264714623",
      "type": "INSPECT_JOB",
      "state": "DONE",
      "inspectDetails": {
        "requestedOptions": {
          "snapshotInspectTemplate": {},
          "jobConfig": {
            "storageConfig": {
              "cloudStorageOptions": {
                "fileSet": {
                  "url": "[CLOUD-STORAGE-URL]"
                },
                "fileTypes": [
                  "FILE_TYPE_UNSPECIFIED"
                ],
                "filesLimitPercent": 100
              },
              "timespanConfig": {
                "startTime": "2019-09-08T22:43:16.623Z",
                "enableAutoPopulationOfTimespanConfig": true
              }
            },
            "inspectConfig": {
              "infoTypes": [
                {
                  "name": "US_SOCIAL_SECURITY_NUMBER"
                },
                {
                  "name": "CANADA_SOCIAL_INSURANCE_NUMBER"
                }
              ],
              "minLikelihood": "LIKELY",
              "limits": {},
              "includeQuote": true
            },
            "actions": [
              {
                "saveFindings": {
                  "outputConfig": {
                    "table": {
                      "projectId": "[PROJECT-ID]",
                      "datasetId": "[DATASET-ID]",
                      "tableId": "[TABLE-ID]"
                    }
                  }
                }
              }
            ]
          }
        },
        "result": {
          ...
        }
      },
      "createTime": "2019-09-09T22:43:16.918Z",
      "startTime": "2019-09-09T22:43:16.918Z",
      "endTime": "2019-09-09T22:43:53.091Z",
      "jobTriggerName": "projects/[PROJECT-ID]/jobTriggers/sample-trigger2"
    },
    ...

To quickly try this out, you can use the API Explorer that's embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
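Once decoded, a dlpJobs.list response like the one above is straightforward to post-process. As a minimal sketch (using a trimmed, illustrative response rather than real API output), the snippet below counts the returned jobs by state:

```python
import json
from collections import Counter

# A trimmed, illustrative version of the dlpJobs.list response shown above.
response_json = """
{
  "jobs": [
    {"name": "projects/p/dlpJobs/i-1", "type": "INSPECT_JOB", "state": "DONE"},
    {"name": "projects/p/dlpJobs/i-2", "type": "INSPECT_JOB", "state": "RUNNING"},
    {"name": "projects/p/dlpJobs/i-3", "type": "INSPECT_JOB", "state": "DONE"}
  ]
}
"""

# Decode the response and tally jobs by their "state" field.
jobs = json.loads(response_json)["jobs"]
by_state = Counter(job["state"] for job in jobs)
print(by_state["DONE"])     # 2
print(by_state["RUNNING"])  # 1
```

The same pattern works on the full response, since each entry in "jobs" always carries name, type, and state.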
List all job triggers
To list all job triggers for the current project:
Console
In the Google Cloud console, go to the Sensitive Data Protection page.
Go to Sensitive Data Protection
On the Inspection tab, on the Job triggers subtab, the console displays a list of all job triggers for the current project.
C#
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
using Google.Api.Gax;
using Google.Api.Gax.ResourceNames;
using Google.Cloud.Dlp.V2;
using System;

public class TriggersList
{
    public static PagedEnumerable<ListJobTriggersResponse, JobTrigger> List(string projectId)
    {
        var dlp = DlpServiceClient.Create();

        var response = dlp.ListJobTriggers(new ListJobTriggersRequest
        {
            Parent = new LocationName(projectId, "global").ToString(),
        });

        foreach (var trigger in response)
        {
            Console.WriteLine($"Name: {trigger.Name}");
            Console.WriteLine($"  Created: {trigger.CreateTime}");
            Console.WriteLine($"  Updated: {trigger.UpdateTime}");
            Console.WriteLine($"  Display Name: {trigger.DisplayName}");
            Console.WriteLine($"  Description: {trigger.Description}");
            Console.WriteLine($"  Status: {trigger.Status}");
            Console.WriteLine($"  Error count: {trigger.Errors.Count}");
        }

        return response;
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"
	"time"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"github.com/golang/protobuf/ptypes"
	"google.golang.org/api/iterator"
)

// listTriggers lists the triggers for the given project.
func listTriggers(w io.Writer, projectID string) error {
	// projectID := "my-project-id"
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	// Create a configured request.
	req := &dlppb.ListJobTriggersRequest{
		Parent: fmt.Sprintf("projects/%s/locations/global", projectID),
	}

	// Send the request and iterate over the results.
	it := client.ListJobTriggers(ctx, req)
	for {
		t, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return fmt.Errorf("Next: %w", err)
		}
		fmt.Fprintf(w, "Trigger %v\n", t.GetName())
		c, err := ptypes.Timestamp(t.GetCreateTime())
		if err != nil {
			return fmt.Errorf("CreateTime Timestamp: %w", err)
		}
		fmt.Fprintf(w, "  Created: %v\n", c.Format(time.RFC1123))
		u, err := ptypes.Timestamp(t.GetUpdateTime())
		if err != nil {
			return fmt.Errorf("UpdateTime Timestamp: %w", err)
		}
		fmt.Fprintf(w, "  Updated: %v\n", u.Format(time.RFC1123))
		fmt.Fprintf(w, "  Display Name: %q\n", t.GetDisplayName())
		fmt.Fprintf(w, "  Description: %q\n", t.GetDescription())
		fmt.Fprintf(w, "  Status: %v\n", t.GetStatus())
		fmt.Fprintf(w, "  Error Count: %v\n", len(t.GetErrors()))
	}
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.JobTrigger;
import com.google.privacy.dlp.v2.ListJobTriggersRequest;
import com.google.privacy.dlp.v2.LocationName;
import java.io.IOException;

class TriggersList {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    listTriggers(projectId);
  }

  public static void listTriggers(String projectId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Build the request to be sent by the client
      ListJobTriggersRequest listJobTriggersRequest =
          ListJobTriggersRequest.newBuilder()
              .setParent(LocationName.of(projectId, "global").toString())
              .build();

      // Use the client to send the API request.
      DlpServiceClient.ListJobTriggersPagedResponse response =
          dlpServiceClient.listJobTriggers(listJobTriggersRequest);

      // Parse the response and process the results
      System.out.println("DLP triggers found:");
      for (JobTrigger trigger : response.getPage().getValues()) {
        System.out.println("Trigger: " + trigger.getName());
        System.out.println("\tCreated: " + trigger.getCreateTime());
        System.out.println("\tUpdated: " + trigger.getUpdateTime());
        if (trigger.getDisplayName() != null) {
          System.out.println("\tDisplay name: " + trigger.getDisplayName());
        }
        if (trigger.getDescription() != null) {
          System.out.println("\tDescription: " + trigger.getDescription());
        }
        System.out.println("\tStatus: " + trigger.getStatus());
        System.out.println("\tError count: " + trigger.getErrorsCount());
      }
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project'

async function listTriggers() {
  // Construct trigger listing request
  const request = {
    parent: `projects/${projectId}/locations/global`,
  };

  // Helper function to pretty-print dates
  const formatDate = date => {
    const msSinceEpoch = parseInt(date.seconds, 10) * 1000;
    return new Date(msSinceEpoch).toLocaleString('en-US');
  };

  // Run trigger listing request
  const [triggers] = await dlp.listJobTriggers(request);
  triggers.forEach(trigger => {
    // Log trigger details
    console.log(`Trigger ${trigger.name}:`);
    console.log(`  Created: ${formatDate(trigger.createTime)}`);
    console.log(`  Updated: ${formatDate(trigger.updateTime)}`);
    if (trigger.displayName) {
      console.log(`  Display Name: ${trigger.displayName}`);
    }
    if (trigger.description) {
      console.log(`  Description: ${trigger.description}`);
    }
    console.log(`  Status: ${trigger.status}`);
    console.log(`  Error count: ${trigger.errors.length}`);
  });
}

listTriggers();
PHP
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\ListJobTriggersRequest;

/**
 * List Data Loss Prevention API job triggers.
 *
 * @param string $callingProjectId  The project ID to run the API call under
 */
function list_triggers(string $callingProjectId): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    $parent = "projects/$callingProjectId/locations/global";

    // Run request
    $listJobTriggersRequest = (new ListJobTriggersRequest())
        ->setParent($parent);
    $response = $dlp->listJobTriggers($listJobTriggersRequest);

    // Print results
    $triggers = $response->iterateAllElements();
    foreach ($triggers as $trigger) {
        printf('Trigger %s' . PHP_EOL, $trigger->getName());
        printf('  Created: %s' . PHP_EOL, $trigger->getCreateTime()->getSeconds());
        printf('  Updated: %s' . PHP_EOL, $trigger->getUpdateTime()->getSeconds());
        printf('  Display Name: %s' . PHP_EOL, $trigger->getDisplayName());
        printf('  Description: %s' . PHP_EOL, $trigger->getDescription());
        printf('  Status: %s' . PHP_EOL, $trigger->getStatus());
        printf('  Error count: %s' . PHP_EOL, count($trigger->getErrors()));
        $timespanConfig = $trigger->getInspectJob()->getStorageConfig()->getTimespanConfig();
        printf(
            '  Auto-populates timespan config: %s' . PHP_EOL,
            ($timespanConfig && $timespanConfig->getEnableAutoPopulationOfTimespanConfig() ? 'yes' : 'no')
        );
    }
}
Python
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import google.cloud.dlp


def list_triggers(project: str) -> None:
    """Lists all Data Loss Prevention API triggers.
    Args:
        project: The Google Cloud project id to use as a parent resource.
    Returns:
        None; the response from the API is printed to the terminal.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Call the API.
    response = dlp.list_job_triggers(request={"parent": parent})

    for trigger in response:
        print(f"Trigger {trigger.name}:")
        print(f"  Created: {trigger.create_time}")
        print(f"  Updated: {trigger.update_time}")
        if trigger.display_name:
            print(f"  Display Name: {trigger.display_name}")
        if trigger.description:
            print(f"  Description: {trigger.description}")
        print(f"  Status: {trigger.status}")
        print(f"  Error count: {len(trigger.errors)}")
REST
The JobTrigger resource has a projects.jobTriggers.list method, which you can use to list all job triggers.
To list all job triggers currently defined in your project, send a GET request to the jobTriggers endpoint, as shown here:
URL:
GET https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/jobTriggers?key={YOUR_API_KEY}

The following JSON output lists the job trigger we created in the previous section. Note that the structure of the job trigger mirrors that of the JobTrigger resource.
JSON output:
{
  "jobTriggers": [
    {
      "name": "projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
      "displayName": "JobTrigger1",
      "description": "Starts an inspection of a Datastore kind",
      "inspectJob": {
        "storageConfig": {
          "datastoreOptions": {
            "partitionId": {
              "projectId": "[PROJECT_ID]",
              "namespaceId": "[NAMESPACE_ID]"
            },
            "kind": {
              "name": "Example-Kind"
            }
          }
        },
        "inspectConfig": {
          "infoTypes": [
            {
              "name": "PHONE_NUMBER"
            }
          ],
          "minLikelihood": "LIKELY",
          "limits": {},
          "includeQuote": true
        },
        "actions": [
          {
            "saveFindings": {
              "outputConfig": {
                "table": {
                  "projectId": "[PROJECT_ID]",
                  "datasetId": "[BIGQUERY_DATASET_NAME]",
                  "tableId": "[BIGQUERY_TABLE_NAME]"
                }
              }
            }
          }
        ]
      },
      "triggers": [
        {
          "schedule": {
            "recurrencePeriodDuration": "86400s"
          }
        }
      ],
      "createTime": "2018-11-30T01:52:41.171857Z",
      "updateTime": "2018-11-30T01:52:41.171857Z",
      "status": "HEALTHY"
    },
    ...
  ],
  "nextPageToken": "KkwKCQjivJ2UpPreAgo_Kj1wcm9qZWN0cy92ZWx2ZXR5LXN0dWR5LTE5NjEwMS9qb2JUcmlnZ2Vycy8xNTA5NzEyOTczMDI0MDc1NzY0"
}

To quickly try this out, you can use the API Explorer that's embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
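The schedule in the output above is expressed as a recurrencePeriodDuration in seconds with a trailing "s" ("86400s" is a daily schedule). As a minimal sketch, the hypothetical helper below (not part of the DLP client library) converts such a duration string into hours:

```python
def recurrence_to_hours(duration: str) -> float:
    """Convert a recurrencePeriodDuration string such as '86400s' to hours.

    The API expresses the trigger schedule as a number of seconds followed
    by the letter 's'; '86400s' corresponds to a daily schedule.
    """
    if not duration.endswith("s"):
        raise ValueError(f"unexpected duration format: {duration!r}")
    return int(duration[:-1]) / 3600


print(recurrence_to_hours("86400s"))  # 24.0
```

A weekly trigger ("604800s") similarly comes out as 168.0 hours.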
Delete a job
To delete a job from your project, which also deletes its results, do the following. Any results saved externally (such as to BigQuery) are untouched by this operation.
Console
In the Google Cloud console, go to the Sensitive Data Protection page.
Click the Inspection tab, and then click the Inspect jobs subtab. The Google Cloud console displays a list of all jobs for the current project.
In the Actions column for the job you want to delete, click the more actions menu (displayed as three dots arranged vertically), and then click Delete.
Alternatively, from the list of jobs, click the identifier of the job you want to delete. On the job's detail page, click Delete.
C#
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
using System;
using Google.Cloud.Dlp.V2;

public class JobsDelete
{
    public static void DeleteJob(string jobName)
    {
        var dlp = DlpServiceClient.Create();

        dlp.DeleteDlpJob(new DeleteDlpJobRequest
        {
            Name = jobName
        });

        Console.WriteLine($"Successfully deleted job {jobName}.");
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// deleteJob deletes the job with the given name.
func deleteJob(w io.Writer, jobName string) error {
	// jobName := "job-example"
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	req := &dlppb.DeleteDlpJobRequest{
		Name: jobName,
	}
	if err = client.DeleteDlpJob(ctx, req); err != nil {
		return fmt.Errorf("DeleteDlpJob: %w", err)
	}
	fmt.Fprintf(w, "Successfully deleted job")
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DeleteDlpJobRequest;
import com.google.privacy.dlp.v2.DlpJobName;
import java.io.IOException;

public class JobsDelete {
  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String jobId = "your-job-id";
    deleteJobs(projectId, jobId);
  }

  // Deletes a DLP Job with the given jobId
  public static void deleteJobs(String projectId, String jobId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Construct the complete job name from the projectId and jobId
      DlpJobName jobName = DlpJobName.of(projectId, jobId);

      // Construct the job deletion request to be sent by the client.
      DeleteDlpJobRequest deleteDlpJobRequest =
          DeleteDlpJobRequest.newBuilder().setName(jobName.toString()).build();

      // Send the job deletion request
      dlpServiceClient.deleteDlpJob(deleteDlpJobRequest);
      System.out.println("Job deleted successfully.");
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The name of the job whose results should be deleted
// Parent project ID is automatically extracted from this parameter
// const jobName = 'projects/my-project/dlpJobs/X-#####'

function deleteJob() {
  // Construct job deletion request
  const request = {
    name: jobName,
  };

  // Run job deletion request
  dlp
    .deleteDlpJob(request)
    .then(() => {
      console.log(`Successfully deleted job ${jobName}.`);
    })
    .catch(err => {
      console.log(`Error in deleteJob: ${err.message || err}`);
    });
}

deleteJob();
PHP
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\DeleteDlpJobRequest;

/**
 * Delete results of a Data Loss Prevention API job
 *
 * @param string $jobId The name of the job whose results should be deleted
 */
function delete_job(string $jobId): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Run job-deletion request
    // The Parent project ID is automatically extracted from this parameter
    $deleteDlpJobRequest = (new DeleteDlpJobRequest())
        ->setName($jobId);
    $dlp->deleteDlpJob($deleteDlpJobRequest);

    // Print status
    printf('Successfully deleted job %s' . PHP_EOL, $jobId);
}
Python
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import google.cloud.dlp


def delete_dlp_job(project: str, job_name: str) -> None:
    """Uses the Data Loss Prevention API to delete a long-running DLP job.
    Args:
        project: The project id to use as a parent resource.
        job_name: The name of the DlpJob resource to be deleted.
    Returns:
        None; the response from the API is printed to the terminal.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id and job name into a full resource id.
    name = f"projects/{project}/dlpJobs/{job_name}"

    # Call the API to delete job.
    dlp.delete_dlp_job(request={"name": name})

    print(f"Successfully deleted {job_name}")
REST
To delete a job from the current project, send a DELETE request to the dlpJobs endpoint, as shown here. Replace the [JOB-IDENTIFIER] field with the identifier of the job, which starts with i-.
URL:
DELETE https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/dlpJobs/[JOB-IDENTIFIER]?key={YOUR_API_KEY}

If the request was successful, the DLP API returns a success response. To verify that the job was successfully deleted, list all jobs.
To quickly try this out, you can use the API Explorer that's embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
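Because inspect-job identifiers always start with i-, a caller can sanity-check the identifier before issuing the DELETE. The helper below is a hypothetical illustration of building the endpoint path (it omits the API key query parameter and does not send any request):

```python
def delete_job_url(project_id: str, job_identifier: str) -> str:
    """Build the dlpJobs DELETE endpoint path for a given job identifier.

    Inspect-job identifiers start with 'i-', as noted above; anything else
    is rejected before a request would be sent.
    """
    if not job_identifier.startswith("i-"):
        raise ValueError("inspect job identifiers start with 'i-'")
    return (
        "https://dlp.googleapis.com/v2/"
        f"projects/{project_id}/dlpJobs/{job_identifier}"
    )


print(delete_job_url("my-project", "i-5270277269264714623"))
```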
Delete a job trigger
Console
In the Google Cloud console, go to the Sensitive Data Protection page.
Go to Sensitive Data Protection
On the Inspection tab, on the Job triggers subtab, the console displays a list of all job triggers for the current project.
In the Actions column for the job trigger you want to delete, click the more actions menu (displayed as three dots arranged vertically), and then click Delete.
Alternatively, from the list of job triggers, click the name of the job trigger you want to delete. On the job trigger's detail page, click Delete.
C#
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
using Google.Cloud.Dlp.V2;
using System;

public class TriggersDelete
{
    public static void Delete(string triggerName)
    {
        var dlp = DlpServiceClient.Create();

        dlp.DeleteJobTrigger(new DeleteJobTriggerRequest
        {
            Name = triggerName
        });

        Console.WriteLine($"Successfully deleted trigger {triggerName}.");
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// deleteTrigger deletes the given trigger.
func deleteTrigger(w io.Writer, triggerID string) error {
	// triggerID := "my-trigger"
	ctx := context.Background()
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return fmt.Errorf("dlp.NewClient: %w", err)
	}
	defer client.Close()

	req := &dlppb.DeleteJobTriggerRequest{
		Name: triggerID,
	}
	if err := client.DeleteJobTrigger(ctx, req); err != nil {
		return fmt.Errorf("DeleteJobTrigger: %w", err)
	}
	fmt.Fprintf(w, "Successfully deleted trigger %v", triggerID)
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DeleteJobTriggerRequest;
import com.google.privacy.dlp.v2.ProjectJobTriggerName;
import java.io.IOException;

class TriggersDelete {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String triggerId = "your-trigger-id";
    deleteTrigger(projectId, triggerId);
  }

  public static void deleteTrigger(String projectId, String triggerId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Get the full trigger name from the given triggerId and ProjectId
      ProjectJobTriggerName triggerName = ProjectJobTriggerName.of(projectId, triggerId);

      // Construct the trigger deletion request to be sent by the client
      DeleteJobTriggerRequest deleteJobTriggerRequest =
          DeleteJobTriggerRequest.newBuilder().setName(triggerName.toString()).build();

      // Send the trigger deletion request
      dlpServiceClient.deleteJobTrigger(deleteJobTriggerRequest);
      System.out.println("Trigger deleted: " + triggerName.toString());
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project'

// The name of the trigger to be deleted
// Parent project ID is automatically extracted from this parameter
// const triggerId = 'projects/my-project/triggers/my-trigger';

async function deleteTrigger() {
  // Construct trigger deletion request
  const request = {
    name: triggerId,
  };

  // Run trigger deletion request
  await dlp.deleteJobTrigger(request);
  console.log(`Successfully deleted trigger ${triggerId}.`);
}

deleteTrigger();
PHP
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\DeleteJobTriggerRequest;

/**
 * Delete a Data Loss Prevention API job trigger.
 *
 * @param string $callingProjectId  The project ID to run the API call under
 * @param string $triggerId         The name of the trigger to be deleted.
 */
function delete_trigger(string $callingProjectId, string $triggerId): void
{
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Run request
    // The Parent project ID is automatically extracted from this parameter
    $triggerName = "projects/$callingProjectId/locations/global/jobTriggers/$triggerId";
    $deleteJobTriggerRequest = (new DeleteJobTriggerRequest())
        ->setName($triggerName);
    $dlp->deleteJobTrigger($deleteJobTriggerRequest);

    // Print the results
    printf('Successfully deleted trigger %s' . PHP_EOL, $triggerName);
}
Python
To learn how to install and use the client library for Sensitive Data Protection, see Sensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.
import google.cloud.dlp


def delete_trigger(project: str, trigger_id: str) -> None:
    """Deletes a Data Loss Prevention API trigger.
    Args:
        project: The id of the Google Cloud project which owns the trigger.
        trigger_id: The id of the trigger to delete.
    Returns:
        None; the response from the API is printed to the terminal.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id into a full resource id.
    parent = f"projects/{project}"

    # Combine the trigger id with the parent id.
    trigger_resource = f"{parent}/jobTriggers/{trigger_id}"

    # Call the API.
    dlp.delete_job_trigger(request={"name": trigger_resource})

    print(f"Trigger {trigger_resource} successfully deleted.")
REST
To delete a job trigger from the current project, send a DELETE request to the jobTriggers endpoint, as shown here. Replace the [JOB-TRIGGER-NAME] field with the name of the job trigger.
URL:
DELETE https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/jobTriggers/[JOB-TRIGGER-NAME]?key={YOUR_API_KEY}

If the request was successful, the DLP API returns a success response. To verify that the job trigger was successfully deleted, list all job triggers.
To quickly try this out, you can use the API Explorer that's embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
Get a job
To get a job from your project, which includes its results, do the following. Any results saved externally (such as to BigQuery) are untouched by this operation.
C#
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
using Google.Cloud.Dlp.V2;
using System;

public class JobsGet
{
    public static DlpJob GetDlpJob(string jobName)
    {
        var dlp = DlpServiceClient.Create();

        var response = dlp.GetDlpJob(jobName);
        Console.WriteLine($"Job: {response.Name} status: {response.State}");

        return response;
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
)

// jobsGet gets an inspection job using jobName
func jobsGet(w io.Writer, projectID string, jobName string) error {
	// projectId := "my-project-id"
	// jobName := "your-job-id"
	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}
	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Construct the request to be sent by the client.
	req := &dlppb.GetDlpJobRequest{
		Name: jobName,
	}

	// Send the request.
	resp, err := client.GetDlpJob(ctx, req)
	if err != nil {
		return err
	}

	// Print the results.
	fmt.Fprintf(w, "Job Name: %v Job Status: %v", resp.Name, resp.State)
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.DlpJobName;
import com.google.privacy.dlp.v2.GetDlpJobRequest;
import java.io.IOException;

public class JobsGet {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    String projectId = "your-project-id";
    String jobId = "your-job-id";
    getJobs(projectId, jobId);
  }

  // Gets a DLP Job with the given jobId
  public static void getJobs(String projectId, String jobId) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Construct the complete job name from the projectId and jobId
      DlpJobName jobName = DlpJobName.of(projectId, jobId);

      // Construct the get job request to be sent by the client.
      GetDlpJobRequest getDlpJobRequest =
          GetDlpJobRequest.newBuilder().setName(jobName.toString()).build();

      // Send the get job request
      dlpServiceClient.getDlpJob(getDlpJobRequest);
      System.out.println("Job got successfully.");
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlp = new DLP.DlpServiceClient();

// Job name to look for
// const jobName = 'your-job-name';

async function getJob() {
  // Construct request for finding job using job name.
  const request = {
    name: jobName,
  };

  // Send the request and receive response from the service
  const [job] = await dlp.getDlpJob(request);

  // Print results.
  console.log(`Job ${job.name} status: ${job.state}`);
}

getJob();
PHP
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\GetDlpJobRequest;

/**
 * Get DLP inspection job.
 * @param string $jobName  Dlp job name
 */
function get_job(
    string $jobName
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    try {
        // Send the get job request
        $getDlpJobRequest = (new GetDlpJobRequest())
            ->setName($jobName);
        $response = $dlp->getDlpJob($getDlpJobRequest);
        printf('Job %s status: %s' . PHP_EOL, $response->getName(), $response->getState());
    } finally {
        $dlp->close();
    }
}
Python
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
import google.cloud.dlp


def get_dlp_job(project: str, job_name: str) -> None:
    """Uses the Data Loss Prevention API to retrieve a DLP job.

    Args:
        project: The project id to use as a parent resource.
        job_name: The name of the DlpJob resource to be retrieved.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Convert the project id and job name into a full resource id.
    job_name = f"projects/{project}/locations/global/dlpJobs/{job_name}"

    # Call the API
    response = dlp.get_dlp_job(request={"name": job_name})

    print(f"Job: {response.name} Status: {response.state}")
REST
To get a job from the current project, send a GET request to the dlpJobs endpoint, as shown here. Replace the [JOB-IDENTIFIER] field with the identifier of the job, which starts with i-.
URL:
GET https://dlp.googleapis.com/v2/projects/[PROJECT-ID]/dlpJobs/[JOB-IDENTIFIER]?key={YOUR_API_KEY}

If the request was successful, the DLP API returns a success response.
To quickly try this out, you can use the API Explorer that's embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
Force an immediate run of a job trigger
After a job trigger is created, you can force an immediate run of the trigger for testing by activating it. To do so, run the following command:

curl --request POST \
  -H "Content-Type: application/json" \
  -H "Accept: application/json" \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "X-Goog-User-Project: PROJECT_ID" \
  'https://dlp.googleapis.com/v2/JOB_TRIGGER_NAME:activate'

Replace the following:
- PROJECT_ID: the ID of the Google Cloud project to bill for access charges associated with the request.
- JOB_TRIGGER_NAME: the full resource name of the job trigger, for example projects/my-project/locations/global/jobTriggers/123456789.
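If you prefer the Python client library over raw REST, the same activation can be sketched as follows. This is a minimal sketch, not part of the official samples: the helper names are ours, and it assumes a trigger in the global region and the google-cloud-dlp package installed.

```python
def trigger_resource_name(project_id: str, trigger_id: str) -> str:
    """Build the full resource name of a job trigger in the global region."""
    return f"projects/{project_id}/locations/global/jobTriggers/{trigger_id}"


def activate_trigger(project_id: str, trigger_id: str):
    """Force an immediate run of a job trigger; returns the created DlpJob."""
    import google.cloud.dlp_v2  # requires the google-cloud-dlp package

    dlp = google.cloud.dlp_v2.DlpServiceClient()
    # activate_job_trigger creates and returns a DlpJob for the trigger.
    return dlp.activate_job_trigger(
        request={"name": trigger_resource_name(project_id, trigger_id)}
    )
```

As with the curl command, the caller's credentials must have permission to activate triggers in the target project.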
Update an existing job trigger
In addition to creating, listing, and deleting job triggers, you can alsoupdate an existing job trigger. To change the configuration for an existing jobtrigger:
Console
In the Google Cloud console, go to the Sensitive Data Protection page.
Click the Inspection tab, and then click the Job triggers subtab.
The console displays a list of all job triggers for the current project.
In the Actions column for the job trigger you want to update, click More more_vert, and then click View details.
On the job trigger detail page, click Edit.
On the Edit trigger page, you can change the location of the input data; detection details such as templates, infoTypes, or likelihood; any post-scan actions; and the job trigger's schedule. When you're done making changes, click Save.
C#
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
using Google.Cloud.Dlp.V2;
using Google.Protobuf.WellKnownTypes;
using System;
using System.Collections.Generic;

public class TriggersUpdate
{
    public static JobTrigger UpdateJob(
        string projectId,
        string triggerId,
        IEnumerable<InfoType> infoTypes = null,
        Likelihood minLikelihood = Likelihood.Likely)
    {
        // Instantiate the client.
        var dlp = DlpServiceClient.Create();

        // Construct the update job trigger request object by providing the trigger name,
        // job trigger object which will specify the type of info to be inspected and
        // update mask object which specifies the field to be updated.
        // Refer to https://cloud.google.com/dlp/docs/reference/rest/v2/Container for specifying the paths in container object.
        var request = new UpdateJobTriggerRequest
        {
            JobTriggerName = new JobTriggerName(projectId, triggerId),
            JobTrigger = new JobTrigger
            {
                InspectJob = new InspectJobConfig
                {
                    InspectConfig = new InspectConfig
                    {
                        InfoTypes =
                        {
                            infoTypes ?? new InfoType[]
                            {
                                new InfoType { Name = "US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER" }
                            }
                        },
                        MinLikelihood = minLikelihood
                    }
                }
            },
            // Specify fields of the jobTrigger resource to be updated when the job trigger is modified.
            // Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask for constructing the field mask paths.
            UpdateMask = new FieldMask
            {
                Paths =
                {
                    "inspect_job.inspect_config.info_types",
                    "inspect_job.inspect_config.min_likelihood"
                }
            }
        };

        // Call the API.
        JobTrigger response = dlp.UpdateJobTrigger(request);

        // Inspect the result.
        Console.WriteLine($"Job Trigger Name: {response.Name}");
        Console.WriteLine($"InfoType updated: {response.InspectJob.InspectConfig.InfoTypes[0]}");
        Console.WriteLine($"Likelihood updated: {response.InspectJob.InspectConfig.MinLikelihood}");
        return response;
    }
}
Go
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
import (
	"context"
	"fmt"
	"io"

	dlp "cloud.google.com/go/dlp/apiv2"
	"cloud.google.com/go/dlp/apiv2/dlppb"
	"google.golang.org/protobuf/types/known/fieldmaskpb"
)

// updateTrigger updates an existing job trigger in Google Cloud Data Loss Prevention (DLP).
// It modifies the configuration of the specified job trigger with the provided updated settings.
func updateTrigger(w io.Writer, jobTriggerName string) error {
	// jobTriggerName := "your-job-trigger-name" (projects/<projectID>/locations/global/jobTriggers/my-trigger)
	ctx := context.Background()

	// Initialize a client once and reuse it to send multiple requests. Clients
	// are safe to use across goroutines. When the client is no longer needed,
	// call the Close method to cleanup its resources.
	client, err := dlp.NewClient(ctx)
	if err != nil {
		return err
	}
	// Closing the client safely cleans up background resources.
	defer client.Close()

	// Specify the type of info the inspection will look for.
	// See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
	infoType := &dlppb.InfoType{
		Name: "PERSON_NAME",
	}

	// Specify the inspectConfig that represents the configuration settings for inspecting
	// sensitive data in the DLP API. It includes detection types, custom info types,
	// inspection methods, and actions to be taken on detection.
	inspectConfig := &dlppb.InspectConfig{
		InfoTypes: []*dlppb.InfoType{
			infoType,
		},
		MinLikelihood: dlppb.Likelihood_LIKELY,
	}

	// Configure the inspection job we want the service to perform.
	inspectJobConfig := &dlppb.InspectJobConfig{
		InspectConfig: inspectConfig,
	}

	// Specify the jobTrigger that represents a DLP job trigger configuration.
	// It defines the conditions, actions, and schedule for executing inspections
	// on sensitive data in the specified data storage.
	jobTrigger := &dlppb.JobTrigger{
		Job: &dlppb.JobTrigger_InspectJob{
			InspectJob: inspectJobConfig,
		},
	}

	// fieldMask represents a set of fields to be included in an update operation.
	// It is used to specify which fields of a resource should be updated.
	updateMask := &fieldmaskpb.FieldMask{
		Paths: []string{
			"inspect_job.inspect_config.info_types",
			"inspect_job.inspect_config.min_likelihood",
		},
	}

	// Combine configurations into a request for the service.
	req := &dlppb.UpdateJobTriggerRequest{
		Name:       jobTriggerName,
		JobTrigger: jobTrigger,
		UpdateMask: updateMask,
	}

	// Send the update request and process the response.
	resp, err := client.UpdateJobTrigger(ctx, req)
	if err != nil {
		return err
	}

	// Print the result.
	fmt.Fprintf(w, "Successfully updated trigger: %v", resp)
	return nil
}
Java
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectJobConfig;
import com.google.privacy.dlp.v2.JobTrigger;
import com.google.privacy.dlp.v2.JobTriggerName;
import com.google.privacy.dlp.v2.Likelihood;
import com.google.privacy.dlp.v2.UpdateJobTriggerRequest;
import com.google.protobuf.FieldMask;
import java.io.IOException;

public class TriggersPatch {

  public static void main(String[] args) throws Exception {
    // TODO(developer): Replace these variables before running the sample.
    // The Google Cloud project id to use as a parent resource.
    String projectId = "your-project-id";
    // The name of the job trigger to be updated.
    String jobTriggerName = "your-job-trigger-name";
    patchTrigger(projectId, jobTriggerName);
  }

  // Uses the Data Loss Prevention API to update an existing job trigger.
  public static void patchTrigger(String projectId, String jobTriggerName) throws IOException {
    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (DlpServiceClient dlpServiceClient = DlpServiceClient.create()) {
      // Specify the type of info the inspection will look for.
      // See https://cloud.google.com/dlp/docs/infotypes-reference for complete list of info types
      InfoType infoType = InfoType.newBuilder().setName("PERSON_NAME").build();

      InspectConfig inspectConfig =
          InspectConfig.newBuilder()
              .addInfoTypes(infoType)
              .setMinLikelihood(Likelihood.LIKELY)
              .build();

      InspectJobConfig inspectJobConfig =
          InspectJobConfig.newBuilder().setInspectConfig(inspectConfig).build();

      JobTrigger jobTrigger = JobTrigger.newBuilder().setInspectJob(inspectJobConfig).build();

      // Specify fields of the jobTrigger resource to be updated when the job trigger is modified.
      // Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask for constructing the field mask paths.
      FieldMask fieldMask =
          FieldMask.newBuilder()
              .addPaths("inspect_job.inspect_config.info_types")
              .addPaths("inspect_job.inspect_config.min_likelihood")
              .build();

      // Update the job trigger with the new configuration.
      UpdateJobTriggerRequest updateJobTriggerRequest =
          UpdateJobTriggerRequest.newBuilder()
              .setName(JobTriggerName.of(projectId, jobTriggerName).toString())
              .setJobTrigger(jobTrigger)
              .setUpdateMask(fieldMask)
              .build();

      // Call the API to update the job trigger.
      JobTrigger updatedJobTrigger = dlpServiceClient.updateJobTrigger(updateJobTriggerRequest);

      System.out.println("Job Trigger Name: " + updatedJobTrigger.getName());
      System.out.println(
          "InfoType updated: "
              + updatedJobTrigger.getInspectJob().getInspectConfig().getInfoTypes(0).getName());
      System.out.println(
          "Likelihood updated: "
              + updatedJobTrigger.getInspectJob().getInspectConfig().getMinLikelihood());
    }
  }
}
Node.js
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
// Imports the Google Cloud Data Loss Prevention library
const DLP = require('@google-cloud/dlp');

// Instantiates a client
const dlpClient = new DLP.DlpServiceClient();

// The project ID to run the API call under
// const projectId = 'my-project';

// The job trigger ID to run the API call under
// const jobTriggerName = 'your-job-trigger-name';

async function updateTrigger() {
  // Construct inspect configuration to match PERSON_NAME infotype
  const inspectConfig = {
    infoTypes: [{name: 'PERSON_NAME'}],
    minLikelihood: 'LIKELY',
  };

  // Configure the job trigger we want to update.
  const jobTrigger = {inspectJob: {inspectConfig}};

  const updateMask = {
    paths: [
      'inspect_job.inspect_config.info_types',
      'inspect_job.inspect_config.min_likelihood',
    ],
  };

  // Combine configurations into a request for the service.
  const request = {
    name: `projects/${projectId}/jobTriggers/${jobTriggerName}`,
    jobTrigger,
    updateMask,
  };

  // Send the request and receive response from the service
  const [updatedJobTrigger] = await dlpClient.updateJobTrigger(request);

  // Print the results
  console.log(`Updated Trigger: ${JSON.stringify(updatedJobTrigger)}`);
}

updateTrigger();
PHP
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
use Google\Cloud\Dlp\V2\Client\DlpServiceClient;
use Google\Cloud\Dlp\V2\InfoType;
use Google\Cloud\Dlp\V2\InspectConfig;
use Google\Cloud\Dlp\V2\InspectJobConfig;
use Google\Cloud\Dlp\V2\JobTrigger;
use Google\Cloud\Dlp\V2\Likelihood;
use Google\Cloud\Dlp\V2\UpdateJobTriggerRequest;
use Google\Protobuf\FieldMask;

/**
 * Update an existing job trigger.
 *
 * @param string $callingProjectId  The Google Cloud Project ID to run the API call under.
 * @param string $jobTriggerName    The job trigger name to update.
 */
function update_trigger(
    string $callingProjectId,
    string $jobTriggerName
): void {
    // Instantiate a client.
    $dlp = new DlpServiceClient();

    // Configure the inspectConfig.
    $inspectConfig = (new InspectConfig())
        ->setInfoTypes([
            (new InfoType())
                ->setName('US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER')
        ])
        ->setMinLikelihood(Likelihood::LIKELY);

    // Configure the Job Trigger we want the service to perform.
    $jobTrigger = (new JobTrigger())
        ->setInspectJob((new InspectJobConfig())
            ->setInspectConfig($inspectConfig));

    // Specify fields of the jobTrigger resource to be updated when the job trigger is modified.
    // Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask for constructing the field mask paths.
    $fieldMask = (new FieldMask())
        ->setPaths([
            'inspect_job.inspect_config.info_types',
            'inspect_job.inspect_config.min_likelihood'
        ]);

    // Send the update job trigger request and process the response.
    $name = "projects/$callingProjectId/locations/global/jobTriggers/" . $jobTriggerName;
    $updateJobTriggerRequest = (new UpdateJobTriggerRequest())
        ->setName($name)
        ->setJobTrigger($jobTrigger)
        ->setUpdateMask($fieldMask);
    $response = $dlp->updateJobTrigger($updateJobTriggerRequest);

    // Print results.
    printf('Successfully updated trigger %s' . PHP_EOL, $response->getName());
}
Python
To learn how to install and use the client library for Sensitive Data Protection, seeSensitive Data Protection client libraries.
To authenticate to Sensitive Data Protection, set up Application Default Credentials. For more information, seeSet up authentication for a local development environment.
from typing import List

import google.cloud.dlp


def update_trigger(
    project: str,
    info_types: List[str],
    trigger_id: str,
) -> None:
    """Uses the Data Loss Prevention API to update an existing job trigger.

    Args:
        project: The Google Cloud project id to use as a parent resource.
        info_types: A list of strings representing infoTypes to update the
            trigger with. A full list of infoType categories can be fetched
            from the API.
        trigger_id: The id of the job trigger which needs to be updated.
    """
    # Instantiate a client.
    dlp = google.cloud.dlp_v2.DlpServiceClient()

    # Prepare info_types by converting the list of strings into a list of
    # dictionaries.
    info_types = [{"name": info_type} for info_type in info_types]

    # Specify fields of the jobTrigger resource to be updated when the
    # job trigger is modified.
    job_trigger = {
        "inspect_job": {
            "inspect_config": {
                "info_types": info_types,
                "min_likelihood": google.cloud.dlp_v2.Likelihood.LIKELY,
            }
        }
    }

    # Convert the project id into a full resource id.
    trigger_name = f"projects/{project}/jobTriggers/{trigger_id}"

    # Call the API.
    # Refer https://protobuf.dev/reference/protobuf/google.protobuf/#field-mask
    # for constructing the field mask paths.
    response = dlp.update_job_trigger(
        request={
            "name": trigger_name,
            "job_trigger": job_trigger,
            "update_mask": {
                "paths": [
                    "inspect_job.inspect_config.info_types",
                    "inspect_job.inspect_config.min_likelihood",
                ]
            },
        }
    )

    # Print out the result.
    print(f"Successfully updated trigger: {response.name}")
    print(
        f"Updated InfoType: {response.inspect_job.inspect_config.info_types[0].name}"
        f"\nUpdated Likelihood: {response.inspect_job.inspect_config.min_likelihood}\n"
    )
REST
Use the projects.jobTriggers.patch method to send new JobTrigger values to the DLP API to update those values within a specified job trigger.
Be aware that the values you send in a patch request replace the existing values for the fields you specify; they are not merged. Suppose a job trigger's inspection configuration contains only the US_SOCIAL_SECURITY_NUMBER infoType detector. If you patch the job trigger and specify the US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER infoType detector, the patched job trigger will only contain the US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER detector. To keep the original infoType detector in the job trigger, specify both in the patch request.
For example, consider the following simple job trigger. This JSON represents the job trigger, and was returned after sending a GET request to the current project's job trigger endpoint.
JSON output:
{
  "name": "projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "inspectJob": {
    "storageConfig": {
      "cloudStorageOptions": {
        "fileSet": {
          "url": "gs://dlptesting/*"
        },
        "fileTypes": [
          "FILE_TYPE_UNSPECIFIED"
        ],
        "filesLimitPercent": 100
      },
      "timespanConfig": {
        "enableAutoPopulationOfTimespanConfig": true
      }
    },
    "inspectConfig": {
      "infoTypes": [
        {
          "name": "US_SOCIAL_SECURITY_NUMBER"
        }
      ],
      "minLikelihood": "POSSIBLE",
      "limits": {}
    },
    "actions": [
      {
        "jobNotificationEmails": {}
      }
    ]
  },
  "triggers": [
    {
      "schedule": {
        "recurrencePeriodDuration": "86400s"
      }
    }
  ],
  "createTime": "2019-03-06T21:19:45.774841Z",
  "updateTime": "2019-03-06T21:19:45.774841Z",
  "status": "HEALTHY"
}

The following JSON, when sent with a PATCH request to the specified endpoint, updates the given job trigger with a new infoType to scan for, as well as a new minimum likelihood. Note that you must also specify the updateMask attribute, and that its value is in FieldMask format.
JSON input:
PATCH https://dlp.googleapis.com/v2/projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]?key={YOUR_API_KEY}

{
  "jobTrigger": {
    "inspectJob": {
      "inspectConfig": {
        "infoTypes": [
          {
            "name": "US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER"
          }
        ],
        "minLikelihood": "LIKELY"
      }
    }
  },
  "updateMask": "inspectJob(inspectConfig(infoTypes,minLikelihood))"
}

After you send this JSON to the specified URL, it returns the following, which represents the updated job trigger. Note that the original infoType and likelihood values have been replaced by the new values.
JSON output:
{
  "name": "projects/[PROJECT_ID]/jobTriggers/[JOB_TRIGGER_NAME]",
  "inspectJob": {
    "storageConfig": {
      "cloudStorageOptions": {
        "fileSet": {
          "url": "gs://dlptesting/*"
        },
        "fileTypes": [
          "FILE_TYPE_UNSPECIFIED"
        ],
        "filesLimitPercent": 100
      },
      "timespanConfig": {
        "enableAutoPopulationOfTimespanConfig": true
      }
    },
    "inspectConfig": {
      "infoTypes": [
        {
          "name": "US_INDIVIDUAL_TAXPAYER_IDENTIFICATION_NUMBER"
        }
      ],
      "minLikelihood": "LIKELY",
      "limits": {}
    },
    "actions": [
      {
        "jobNotificationEmails": {}
      }
    ]
  },
  "triggers": [
    {
      "schedule": {
        "recurrencePeriodDuration": "86400s"
      }
    }
  ],
  "createTime": "2019-03-06T21:19:45.774841Z",
  "updateTime": "2019-03-06T21:27:01.650183Z",
  "lastRunTime": "1970-01-01T00:00:00Z",
  "status": "HEALTHY"
}

To quickly try this out, you can use the API Explorer that's embedded below. For general information about using JSON to send requests to the DLP API, see the JSON quickstart.
Job latency
There are no service level objectives (SLOs) guaranteed for jobs and job triggers. Latency is affected by several factors, including the amount of data to scan, the storage repository being scanned, the type and number of infoTypes you are scanning for, the region where the job is processed, and the computing resources available in that region. Therefore, the latency of inspection jobs can't be determined in advance.
To help reduce job latency, you can try the following:
- If sampling is available for your job or job trigger, enable it.
- Avoid enabling infoTypes that you don't need. Although the following are useful in certain scenarios, these infoTypes can make requests run much more slowly than requests that don't include them:
  - PERSON_NAME
  - FEMALE_NAME
  - MALE_NAME
  - FIRST_NAME
  - LAST_NAME
  - DATE_OF_BIRTH
  - LOCATION
  - STREET_ADDRESS
  - ORGANIZATION_NAME
- Always specify infoTypes explicitly. Do not use an empty infoTypes list.
- If possible, use a different processing region.
If you're still having latency issues with jobs after trying these techniques, consider using content.inspect or content.deidentify requests instead of jobs. These methods are covered under the Service Level Agreement. For more information, see the Sensitive Data Protection Service Level Agreement.
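As a rough sketch of that alternative, the following shows a synchronous content.inspect call that names its infoTypes explicitly, in line with the tips above. The helper names are ours, and the block assumes the google-cloud-dlp Python package; it is illustrative, not one of the official samples.

```python
def build_inspect_request(project_id: str, text: str, info_types: list) -> dict:
    """Build a content.inspect request with an explicit infoTypes list.

    Naming only the infoTypes you need (rather than leaving the list empty)
    keeps inspection fast and predictable.
    """
    return {
        "parent": f"projects/{project_id}",
        "inspect_config": {
            "info_types": [{"name": name} for name in info_types],
            "min_likelihood": "LIKELY",
        },
        "item": {"value": text},
    }


def inspect_text(project_id: str, text: str, info_types: list):
    """Synchronously inspect a small piece of text instead of running a job."""
    import google.cloud.dlp_v2  # requires the google-cloud-dlp package

    dlp = google.cloud.dlp_v2.DlpServiceClient()
    response = dlp.inspect_content(
        request=build_inspect_request(project_id, text, info_types)
    )
    return response.result.findings
```

Because content.inspect works on request payloads rather than storage repositories, it suits latency-sensitive paths where the data is already in hand.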
Limit scans to only new content
You can configure your job trigger to automatically set the timespan date for files stored in Cloud Storage or BigQuery. When you set the TimespanConfig object to auto-populate, Sensitive Data Protection only scans data that was added or modified since the trigger last ran:

...
  timespan_config {
    enable_auto_population_of_timespan_config: true
  }
...

For BigQuery inspection, only rows that are at least three hours old are included in the scan. See the known issue related to this operation.
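In the Python client, the equivalent storageConfig can be expressed as a plain dictionary when creating a job trigger. This is an illustrative sketch under the assumption of a Cloud Storage source; the helper name is ours, not part of the official samples.

```python
def cloud_storage_config_new_content_only(bucket_url: str) -> dict:
    """Build a storageConfig that scans only content added or modified
    since the trigger's last run, by auto-populating the timespan.
    """
    return {
        "cloud_storage_options": {"file_set": {"url": bucket_url}},
        "timespan_config": {
            # Sensitive Data Protection fills in the start and end times
            # automatically based on the trigger's last run.
            "enable_auto_population_of_timespan_config": True,
        },
    }
```

The returned dictionary can be passed as the storage_config of an inspect_job when calling create_job_trigger.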
Trigger jobs at file upload
In addition to the support for job triggers, which is built into Sensitive Data Protection, Google Cloud also has a variety of other components that you can use to integrate or trigger Sensitive Data Protection jobs. For example, you can use Cloud Run functions to trigger a Sensitive Data Protection scan every time a file is uploaded to Cloud Storage.
For information about how to set up this operation, see Automating the classification of data uploaded to Cloud Storage.
Successful jobs with no data inspected
A job can complete successfully even if no data was scanned. The following example scenarios can cause this to happen:
- The job is configured to inspect a specific data asset, such as a file, that exists but is empty.
- The job is configured to inspect a data asset that doesn't exist or that no longer exists.
- The job is configured to inspect a Cloud Storage bucket that is empty.
- The job is configured to inspect a bucket, and recursive scanning is disabled. At the top level, the bucket contains only folders that, in turn, contain the files.
- The job is configured to inspect only a specific file type in a bucket, but the bucket doesn't have any files of that type.
- The job is configured to inspect only new content, but there were no updates after the last time the job was run.
In the Google Cloud console, on the Job details page, the Bytes scanned field specifies how much data was inspected by the job. In the DLP API, the processedBytes field specifies how much data was inspected.
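If you check this programmatically, a small helper along these lines can read the scanned byte count from a DlpJob returned by get_dlp_job. The helper name is ours; the field path assumes an inspection job, whose result is exposed under inspect_details.

```python
def job_scanned_data(job) -> bool:
    """Return True if a finished inspection job actually processed any bytes.

    Works with a DlpJob returned by get_dlp_job, which exposes the scanned
    byte count at inspect_details.result.processed_bytes.
    """
    return int(job.inspect_details.result.processed_bytes) > 0
```

A job for which this returns False likely matches one of the empty-scan scenarios listed above.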
What's next
- Learn more about creating a de-identified copy of data in storage.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-12-15 UTC.