Quotas and limits
This document describes the current restrictions and usage limits that apply when you use Sensitive Data Protection. For billing information, see the pricing page.
These limits apply to each Google Cloud console project and are shared across all applications and IP addresses using that project.
You can set lower quotas in the Google Cloud console.
Note: Quotas and limits specified in this topic are subject to change.

Rate quotas
This table highlights important quotas for Sensitive Data Protection requests in each project. For other quotas, see the Quotas & System Limits page in the Google Cloud console.
| Quota | Value | Description |
|---|---|---|
| Number of requests per minute | 10,000 | Total number of requests to all global and regional endpoints (REP) for Sensitive Data Protection across all locations |
| Number of requests to a regional endpoint per minute per region | 600 | Requests made to the global endpoint (dlp.googleapis.com) where a location is specified |
| Number of requests to a regional REP endpoint per minute per region | 100 | Requests made to a regional endpoint (dlp.REP_REGION.rep.googleapis.com) |
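When sustained traffic approaches these quotas, calls fail with RESOURCE_EXHAUSTED errors until usage drops. One common way to absorb short bursts is client-side retry with exponential backoff. Here is a minimal sketch using the google-cloud-dlp Python client, assuming a placeholder project ID and payload:

```python
from google.api_core import exceptions, retry
import google.cloud.dlp_v2 as dlp_v2

client = dlp_v2.DlpServiceClient()

# Retry only quota errors (RESOURCE_EXHAUSTED) with exponential
# backoff, giving up after two minutes overall.
quota_retry = retry.Retry(
    predicate=retry.if_exception_type(exceptions.ResourceExhausted),
    initial=1.0,     # first delay, in seconds
    maximum=30.0,    # cap on the delay between attempts
    multiplier=2.0,  # exponential growth factor
    timeout=120.0,   # total time budget across all attempts
)

response = client.inspect_content(
    request={
        "parent": "projects/my-project/locations/global",  # placeholder project
        "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        "item": {"value": "Contact me at alice@example.com"},
    },
    retry=quota_retry,
)

for finding in response.result.findings:
    print(finding.info_type.name, finding.likelihood)
```

Because the predicate matches only ResourceExhausted, other failures such as invalid arguments still surface immediately rather than being retried.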
Resource limits
Sensitive Data Protection enforces the following limits on stored resources per project.
| Resource | Limit |
|---|---|
| Maximum number of templates | 1,000 |
| Maximum number of job triggers | 1,000 |
| Maximum number of running jobs | 1,000 |
| Maximum number of stored infoTypes | 30 |
| Maximum number of discovery configurations | 100 |
Content inspection and de-identification limits
Sensitive Data Protection enforces the following usage limits for inspecting and de-identifying content sent directly to the DLP API as text or images:
| Type of limit | Usage limit |
|---|---|
| Maximum number of regular custom dictionaries per request | 10 |
| Maximum size of each quote (a contextual snippet, returned with findings, of the text that triggered a match) | 4 KB |
| Maximum number of table values | 50,000 |
| Maximum number of transformations per request | 100 |
| Maximum number of inspection rules per set | 10 |
| Maximum number of inspection rule sets per inspection configuration | 10 |
| Maximum number of findings per request | 3,000 |
| Maximum size of each request, except projects.image.redact | 0.5 MB |
| Maximum size of each projects.image.redact request | 4 MB |
If you need to inspect files that are larger than these limits, store those files on Cloud Storage and run an inspection job.
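As a minimal sketch of that approach (using the google-cloud-dlp Python client; the project ID and bucket path are placeholders):

```python
import google.cloud.dlp_v2 as dlp_v2

client = dlp_v2.DlpServiceClient()

# Inspect every file under a Cloud Storage prefix instead of sending
# the content inline, which avoids the 0.5 MB request-size limit.
job = client.create_dlp_job(
    request={
        "parent": "projects/my-project/locations/global",  # placeholder
        "inspect_job": {
            "storage_config": {
                "cloud_storage_options": {
                    "file_set": {"url": "gs://my-bucket/reports/**"}  # placeholder
                }
            },
            "inspect_config": {
                "info_types": [{"name": "EMAIL_ADDRESS"}, {"name": "PHONE_NUMBER"}],
            },
        },
    }
)
print("Started job:", job.name)
```

The job then runs asynchronously and is subject to the storage inspection limits described in the next section.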
Storage inspection limits
Sensitive Data Protection enforces the following usage limits for inspecting Google Cloud storage repositories:
| Type of limit | Usage limit |
|---|---|
| Maximum total scan size | 2 TB |
| Maximum size of each quote (a contextual snippet, returned with findings, of the text that triggered a match) | 4 KB |
Storage de-identification limits
Sensitive Data Protection enforces the following usage limit when you de-identify data in storage:
| Type of limit | Usage limit |
|---|---|
| Maximum file size | 60,000 KB |
Data profiling limits
Sensitive Data Protection enforces the following usage limits for profiling data.
These limits apply globally for all data configurations at both the organization and project levels.
| Type of limit | Usage limit |
|---|---|
| Maximum number of schedules per discovery scan configuration | 100 |
| Maximum number of filters per schedule | 100 |
| Subscription capacity used for profiling | See Subscription capacity used for profiling on this page. |
Subscription capacity used for profiling
This value depends on the number of subscription units that you purchased. For example, if you purchased 5 subscription units, you get 50,000 tokens for the whole month. Therefore, you can consume around 1,667 tokens per day (that is, 50,000 tokens divided by 30 days).
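As a rough sketch of that arithmetic (the 10,000-tokens-per-unit-per-month figure is inferred from the example above, not stated elsewhere on this page):

```python
# Inferred from the example above: 5 subscription units yield 50,000
# tokens per month, that is, 10,000 tokens per unit (an assumption).
TOKENS_PER_UNIT_PER_MONTH = 10_000

def daily_token_budget(units: int, days_in_month: int = 30) -> float:
    """Approximate profiling tokens available per day."""
    return units * TOKENS_PER_UNIT_PER_MONTH / days_in_month

print(round(daily_token_budget(5)))  # ~1667 tokens per day
```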
To review your subscription capacity usage, go to the Subscriptions page and click Review capacity usage.
For information about how to calculate how many profiles can be generated by your tokens, see Data resources profiled per subscription unit.
BigQuery profiling limits
| Type of limit | Usage limit |
|---|---|
| Maximum number of BigQuery tables | 200,000 |
| Maximum sum of columns in all BigQuery tables to be profiled | 20,000,000 |
Example: Organization-level limits
Suppose you have an organization that has two folders, and you create a scan configuration for each folder. The total number of tables to be profiled from both of the configurations must not exceed 200,000. The total number of columns in all those tables must not exceed 20,000,000.
Example: Project-level limits
If you configure data profiling at the project level, then the total number of tables in that project must not exceed 200,000. The total number of columns in all those tables must not exceed 20,000,000.
Cloud SQL profiling limits
| Type of limit | Usage limit |
|---|---|
| Maximum number of databases per instance | 1,000 |
| Maximum number of tables per database | 20,000 |
Custom infoType limits
Sensitive Data Protection enforces the following limits for custom infoTypes.
| Type of limit | Usage limit |
|---|---|
| Maximum size of word list passed directly in the request message per regular custom dictionary | 128 KB |
| Maximum size of word list specified as a file in Cloud Storage per regular custom dictionary | 512 KB |
| Maximum number of components (continuous sequences containing only letters, only digits, only non-letter characters, or only non-digit characters) per regular custom dictionary phrase | 40 |
| Maximum combined size of all stored custom dictionaries per request | 5 MB |
| Maximum number of built-in and custom infoTypes per request | 150 |
| Maximum number of detection rules per custom infoType | 5 |
| Maximum number of custom infoTypes per request | 30 |
| Maximum number of regular custom dictionaries per request | 10 |
| Maximum length of regular expressions | 1,000 |
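To see where these limits apply in practice, here is a hedged sketch of an inspect request that defines a single regex-based custom infoType with the google-cloud-dlp Python client; the infoType name, pattern, and project ID are placeholders:

```python
import google.cloud.dlp_v2 as dlp_v2

client = dlp_v2.DlpServiceClient()

# One custom infoType out of the 30 allowed per request; the regex is a
# placeholder and must stay within the 1,000-character length limit.
custom_info_types = [
    {
        "info_type": {"name": "EMPLOYEE_ID"},  # placeholder name
        "regex": {"pattern": r"E-\d{6}"},
        "likelihood": "POSSIBLE",
    }
]

response = client.inspect_content(
    request={
        "parent": "projects/my-project/locations/global",  # placeholder
        "inspect_config": {"custom_info_types": custom_info_types},
        "item": {"value": "Badge E-123456 was reported lost."},
    }
)
for finding in response.result.findings:
    print(finding.info_type.name)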
Stored infoType limits
Sensitive Data Protection enforces the following limits for creating stored infoTypes.
| Type of limit | Usage limit |
|---|---|
| Maximum size of a single input file stored in Cloud Storage | 200 MB |
| Maximum combined size of all input files stored in Cloud Storage | 1 GB |
| Maximum number of input files stored in Cloud Storage | 100 |
| Maximum size of an input column in BigQuery | 1 GB |
| Maximum number of input table rows in BigQuery | 5,000,000 |
| Maximum size of output files | 500 MB |
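For context, a stored infoType built from Cloud Storage term files is created roughly as follows. This is a sketch with placeholder bucket paths and IDs, using the google-cloud-dlp Python client:

```python
import google.cloud.dlp_v2 as dlp_v2

client = dlp_v2.DlpServiceClient()

# Build a large custom dictionary from term files in Cloud Storage.
# Input files are subject to the limits above (200 MB per file, 1 GB
# combined, at most 100 files); bucket paths are placeholders.
stored_info_type = client.create_stored_info_type(
    request={
        "parent": "projects/my-project/locations/global",  # placeholder
        "stored_info_type_id": "company-terms",            # placeholder
        "config": {
            "display_name": "Company terms",
            "large_custom_dictionary": {
                "cloud_storage_file_set": {"url": "gs://my-bucket/terms/*.txt"},
                "output_path": {"path": "gs://my-bucket/dictionary-output/"},
            },
        },
    }
)
print("Created:", stored_info_type.name)
```

The generated dictionary files written to the output path are the "output files" covered by the 500 MB limit in the table above.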
Avro scanning limits
Generally, Avro files are subject to the same limitations in Sensitive Data Protection as in BigQuery. If these limits are exceeded, binary scanning is used as a fallback. The following limits apply to inspect content requests and inspect storage jobs for Avro files:
| Type of limit | Usage limit |
|---|---|
| Maximum size for a single Avro block | 100 MB |
| Maximum size for a single Avro file | 1 TB |
| Maximum number of columns in an Avro file | 10,000 |
| Maximum level of nested fields | 15 |
Scanning limits for PDFs and Microsoft products
These limits apply when scanning the following types of files:
- PDF
- Microsoft Word
- Microsoft Excel
- Microsoft PowerPoint
If these limits are exceeded, binary scanning is used as a fallback.
| Type of limit | Usage limit |
|---|---|
| Maximum size of a single PDF in Cloud Storage. Files exceeding this limit are binary scanned. | 150 MB, up to 10,000 pages |
| Maximum size of a single Word file in Cloud Storage. Files exceeding this limit are binary scanned. | 30 MB |
| Maximum size of a single Excel file in Cloud Storage. Files exceeding this limit are binary scanned. | 30 MB |
| Maximum size of a single PowerPoint file in Cloud Storage. Files exceeding this limit are binary scanned. | 30 MB |
These limits apply to these file types even if you set a maximum byte size per file.
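The per-file byte cap that this note refers to is the bytes_limit_per_file setting on a job's Cloud Storage options. A minimal sketch (placeholder project and paths) using the google-cloud-dlp Python client:

```python
import google.cloud.dlp_v2 as dlp_v2

client = dlp_v2.DlpServiceClient()

# Even with bytes_limit_per_file set, the structured-scanning limits in
# the table above still determine whether a PDF or Office file falls
# back to binary scanning. Project and paths are placeholders.
job = client.create_dlp_job(
    request={
        "parent": "projects/my-project/locations/global",
        "inspect_job": {
            "storage_config": {
                "cloud_storage_options": {
                    "file_set": {"url": "gs://my-bucket/docs/**"},
                    "bytes_limit_per_file": 10 * 1024 * 1024,  # scan at most 10 MB per file
                }
            },
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
        },
    }
)
print("Started job:", job.name)
```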
Quota increases
You can edit your quotas up to their maximum values on the Quotas & System Limits page for your project. To request an increase in quota, edit your quota with your requested increase and justification, and submit your update. You are notified when your request is received. You might be contacted for more information regarding your request. After your request is reviewed, you are notified whether it has been approved or denied.
Quota dependencies
Depending on which features you are using, you may also need additional quota.
- Pub/Sub: If you are using Pub/Sub, you may need additional quota.
- BigQuery:
  - The Streaming API is used for persisting findings for inspect jobs, where quota values and other restrictions apply (see the sketch after this list).
  - google.cloud.bigquery.storage.v1beta1.BigQueryStorage, which lists the contents of rows in a table, is subject to quota limits.
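To make these dependencies concrete, here is a hedged sketch of an inspect job whose actions use both services: findings are persisted to BigQuery through the Streaming API, and a Pub/Sub notification is published when the job finishes. The project, bucket, dataset, table, and topic names are all placeholders.

```python
import google.cloud.dlp_v2 as dlp_v2

client = dlp_v2.DlpServiceClient()

job = client.create_dlp_job(
    request={
        "parent": "projects/my-project/locations/global",  # placeholder
        "inspect_job": {
            "storage_config": {
                "cloud_storage_options": {"file_set": {"url": "gs://my-bucket/data/**"}}
            },
            "inspect_config": {"info_types": [{"name": "EMAIL_ADDRESS"}]},
            "actions": [
                # Persists findings through the BigQuery Streaming API
                # (subject to the BigQuery quotas noted above).
                {
                    "save_findings": {
                        "output_config": {
                            "table": {
                                "project_id": "my-project",
                                "dataset_id": "dlp_results",
                                "table_id": "findings",
                            }
                        }
                    }
                },
                # Publishes a notification when the job finishes
                # (subject to Pub/Sub quotas).
                {"pub_sub": {"topic": "projects/my-project/topics/dlp-job-done"}},
            ],
        },
    }
)
print("Started job:", job.name)
```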
Learn about service disruptions
Sensitive Data Protection has features that depend on other Google Cloud services. Because of these dependencies, you can expect Sensitive Data Protection to have reliability comparable to those products.
If those services experience disruptions, Sensitive Data Protection makes a best effort to retry until any recurring errors have subsided, but you might see a degraded experience in the meantime.
Check the Google Cloud Status Dashboard for all known service disruptions. You can also subscribe to the Google Cloud Status Dashboard updates JSON feed or RSS feed for push updates.