Troubleshoot quota and limit errors
BigQuery has various quotas and limits that cap the rate and volume of different requests and operations. They exist both to protect the infrastructure and to help guard against unexpected customer usage. This document describes how to diagnose and mitigate specific errors resulting from quotas and limits.
Some error messages specify quotas or limits that you can increase, while other error messages specify quotas or limits that you can't increase. Reaching a hard limit means that you need to implement temporary or permanent workarounds or best practices for your workload. Doing so is recommended even for quotas or limits that can be increased.
This document organizes error messages and their solutions according to these categories, and the "Overview" section later in this document explains how to read an error message and apply the correct solution for your issue.
If your error message is not listed in this document, then refer to the list of error messages, which has more generic error information.
Overview
If a BigQuery operation fails because of exceeding a quota, the API returns the HTTP 403 Forbidden status code. The response body contains more information about the quota that was reached. The response body looks similar to the following:
{"code":403,"errors":[{"domain":"global","message":"Quota exceeded: ...","reason":"quotaExceeded"}],"message":"Quota exceeded: ..."}Themessage field in the payload describes which limit was exceeded. Forexample, themessage field might sayExceeded rate limits: too many tableupdate operations for this table.
In general, quota limits fall into two categories, indicated by the reason field in the response payload:
- rateLimitExceeded. This value indicates a short-term limit. To resolve these limit issues, retry the operation after a few seconds. Use exponential backoff between retry attempts; that is, exponentially increase the delay between each retry (see the sketch after this list).
- quotaExceeded. This value indicates a longer-term limit. If you reach a longer-term quota limit, wait 10 minutes or longer before trying the operation again. If you consistently reach one of these longer-term quota limits, analyze your workload for ways to mitigate the issue. Mitigations can include optimizing your workload or requesting a quota increase.
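For example, the following is a minimal sketch of retrying an operation with exponential backoff using the Python client library. The query, delay values, and attempt count are illustrative assumptions, not part of this document's guidance for any specific quota.

```python
import time

from google.api_core.exceptions import Forbidden
from google.cloud import bigquery

client = bigquery.Client()  # assumes application default credentials


def run_query_with_backoff(sql, max_attempts=5):
    """Retries a query on 403 quota errors, doubling the delay between attempts."""
    delay = 1  # seconds
    for attempt in range(1, max_attempts + 1):
        try:
            return list(client.query(sql).result())
        except Forbidden as err:  # quota and rate-limit errors surface as HTTP 403
            if attempt == max_attempts:
                raise
            print(f"Attempt {attempt} hit a quota error: {err}. Retrying in {delay}s.")
            time.sleep(delay)
            delay *= 2  # exponentially increase the delay between retries


rows = run_query_with_backoff("SELECT 1 AS x")
```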
For quotaExceeded errors, examine the error message to understand which quota limit was exceeded. Then, analyze your workload to see if you can avoid reaching the quota.
In some cases, the quota can be raised by contacting BigQuery support or contacting Google Cloud sales, but we recommend trying the suggestions in this document first.
Diagnosis
To diagnose issues, do the following:
Use INFORMATION_SCHEMA views along with a region qualifier to analyze the underlying issue. These views contain metadata about your BigQuery resources, including jobs, reservations, and streaming inserts.

For example, the following query uses the INFORMATION_SCHEMA.JOBS view to list all quota-related errors within the past day:

```sql
SELECT
  job_id,
  creation_time,
  error_result
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS
WHERE
  creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
  AND error_result.reason IN ('rateLimitExceeded', 'quotaExceeded')
```

Replace REGION_NAME with the region of the project. It must be preceded by region-. For example, for the US multi-region, use region-us.

You can also view errors in Cloud Audit Logs. For example, using Logs Explorer, the following query returns errors with either Quota exceeded or limit in the message string:

```
resource.type = ("bigquery_project" OR "bigquery_dataset")
protoPayload.status.code = "7"
protoPayload.status.message: ("Quota exceeded" OR "limit")
```

In this example, the status code 7 indicates PERMISSION_DENIED, which corresponds to the HTTP 403 status code.

For additional Cloud Audit Logs query samples, see BigQuery queries.
Troubleshoot quotas or limits that can be increased
You can increase the following quotas and limits; however, it's best to first try any suggested workarounds or best practices.
Your project exceeded quota for free query bytes scanned
BigQuery returns this error when you run a query in the free usage tier and the account reaches the monthly query limit. For more information about query pricing, see Free usage tier.
Error message
Your project exceeded quota for free query bytes scanned
Resolution
To continue using BigQuery, you need to upgrade the account to a paid Cloud Billing account.
Streaming insert quota errors
This section provides tips for troubleshooting quota errors related to streaming data into BigQuery.
In certain regions, streaming inserts have a higher quota if you don't populate the insertId field for each row. For more information about quotas for streaming inserts, see Streaming inserts. The quota-related errors for BigQuery streaming depend on the presence or absence of insertId.
Error message
If the insertId field is empty, the following quota error is possible:
| Quota limit | Error message |
|---|---|
| Bytes per second per project | Your entity with gaia_id: GAIA_ID, project: PROJECT_ID in region: REGION exceeded quota for insert bytes per second. |
If the insertId field is populated, the following quota errors are possible:
| Quota limit | Error message |
|---|---|
| Rows per second per project | Your project: PROJECT_ID in REGION exceeded quota for streaming insert rows per second. |
| Rows per second per table | Your table: TABLE_ID exceeded quota for streaming insert rows per second. |
| Bytes per second per table | Your table: TABLE_ID exceeded quota for streaming insert bytes per second. |
The purpose of the insertId field is to deduplicate inserted rows. If multiple inserts with the same insertId arrive within a few minutes' window, BigQuery writes a single version of the record. However, this automatic deduplication is not guaranteed. For maximum streaming throughput, we recommend that you don't include insertId and instead use manual deduplication. For more information, see Ensuring data consistency.
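As an illustration only, the following sketch uses the Python client library's insert_rows_json method and passes an explicit None row ID for every row so that no insertId is sent. The table name and rows are placeholder assumptions, and exact behavior can vary by client library version.

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.my_table"  # placeholder table
rows = [
    {"event_id": "a-1", "value": 42},
    {"event_id": "a-2", "value": 7},
]

# Passing a None row ID for each row omits insertId, which can allow the
# higher streaming quota in supported regions. Deduplicate downstream instead,
# for example with a periodic MERGE keyed on event_id.
errors = client.insert_rows_json(table_id, rows, row_ids=[None] * len(rows))
if errors:
    print(f"Streaming insert reported errors: {errors}")
```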
When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.
Diagnosis
Use the STREAMING_TIMELINE_BY_* views to analyze the streaming traffic. These views aggregate streaming statistics over one-minute intervals, grouped by error_code. Quota errors appear in the results with error_code equal to RATE_LIMIT_EXCEEDED or QUOTA_EXCEEDED.

Depending on the specific quota limit that was reached, look at total_rows or total_input_bytes. If the error is a table-level quota, filter by table_id.
For example, the following query shows total bytes ingested per minute, and the total number of quota errors:

```sql
SELECT
  start_timestamp,
  error_code,
  SUM(total_input_bytes) AS sum_input_bytes,
  SUM(IF(error_code IN ('QUOTA_EXCEEDED', 'RATE_LIMIT_EXCEEDED'),
      total_requests, 0)) AS quota_error
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.STREAMING_TIMELINE_BY_PROJECT
WHERE
  start_timestamp > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
GROUP BY
  start_timestamp,
  error_code
ORDER BY 1 DESC
```
Resolution
To resolve this quota error, do the following:
- If you are using the insertId field for deduplication, and your project is in a region that supports the higher streaming quota, we recommend removing the insertId field. This solution might require some additional steps to manually deduplicate the data. For more information, see Manually removing duplicates.
- If you are not using insertId, or if it's not feasible to remove it, monitor your streaming traffic over a 24-hour period and analyze the quota errors:
  - If you see mostly RATE_LIMIT_EXCEEDED errors rather than QUOTA_EXCEEDED errors, and your overall traffic is less than 80% of quota, the errors probably indicate temporary spikes. You can address these errors by retrying the operation using exponential backoff between retries.
  - If you are using a Dataflow job to insert data, consider using load jobs instead of streaming inserts. For more information, see Setting the insertion method.
  - If you are using Dataflow with a custom I/O connector, consider using a built-in I/O connector instead. For more information, see Custom I/O patterns.
  - If you see QUOTA_EXCEEDED errors or the overall traffic consistently exceeds 80% of the quota, submit a request for a quota increase. For more information, see Request a quota adjustment.
- You might also want to consider replacing streaming inserts with the newer Storage Write API, which has higher throughput, lower price, and many useful features.
For more information about streaming inserts, see streaming data into BigQuery.
Maximum number of concurrent queries that contain remote functions
BigQuery returns this error when the number of concurrent queries that contain remote functions exceeds the limit.

To learn more about remote functions limits, see Remote functions.
Error message
Exceeded rate limits: too many concurrent queries with remote functions for this project
This limit can be increased. Try the workarounds and best practices first.
Diagnosis
To see limits for concurrent queries that contain remote functions, see Remote function limits.
Resolution
- When using remote functions, adhere to best practices for remote functions.
- You can request a quota increase by contacting support or sales. It might take several days to review and process the request. We recommend stating the priority, use case, and the project ID in the request.
Maximum number of CREATE MODEL statements
This error means that you have exceeded the quota for CREATE MODEL statements.
Error message
Quota exceeded: Your project exceeded quota for CREATE MODEL queries per project.
Resolution
If you exceed the quota for CREATE MODEL statements, send an email to bqml-feedback@google.com and request a quota increase.
Maximum number of copy jobs per day per project quota errors
BigQuery returns this error when the number of copy jobs running in a project has exceeded the daily limit. To learn more about the limit for copy jobs per day, see Copy jobs.
Error message
Your project exceeded quota for copies per project
Diagnosis
If you'd like to gather more data about where the copy jobs are coming from, you can try the following:
If your copy jobs are located in a single region or only a few regions, you can try querying the INFORMATION_SCHEMA.JOBS table for specific regions. For example:

```sql
SELECT
  creation_time,
  job_id,
  user_email,
  destination_table.project_id,
  destination_table.dataset_id,
  destination_table.table_id
FROM
  `PROJECT_ID`.`region-REGION_NAME`.INFORMATION_SCHEMA.JOBS
WHERE
  creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 2 DAY) AND CURRENT_TIMESTAMP()
  AND job_type = "COPY"
ORDER BY creation_time DESC
```
You can also adjust the time interval depending on the time range you're interested in.
To see all copy jobs in all regions, you can use the following filter in Cloud Logging:

```
resource.type="bigquery_resource"
protoPayload.methodName="jobservice.insert"
protoPayload.serviceData.jobInsertRequest.resource.jobConfiguration.tableCopy:*
```
Resolution
- If the goal of the frequent copy operations is to create a snapshot of data, consider using table snapshots instead. Table snapshots are a cheaper and faster alternative to copying full tables (see the sketch after this list).
- You can request a quota increase by contacting support or sales. It might take several days to review and process the request. We recommend stating the priority, use case, and the project ID in the request.
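For example, a snapshot can be created with a CREATE SNAPSHOT TABLE DDL statement instead of a copy job. The following is a minimal sketch that runs the DDL through the Python client library; the project, dataset, table names, and the 48-hour expiration are placeholder assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

ddl = """
CREATE SNAPSHOT TABLE `my-project.my_dataset.my_table_snapshot`
CLONE `my-project.my_dataset.my_table`
OPTIONS (
  expiration_timestamp = TIMESTAMP_ADD(CURRENT_TIMESTAMP(), INTERVAL 48 HOUR)
)
"""

# Creating a snapshot reads no table data, so it is cheaper and faster than a full copy.
client.query(ddl).result()
```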
Exceeded extract bytes per day quota error
BigQuery returns this error when the extraction exceeds the default 50 TiB daily limit in a project. For more information about extract job limits, see Extract jobs.
Error message
Your usage exceeded quota for ExtractBytesPerDay
Diagnosis
If you are exporting a table that is larger than 50 TiB, the export fails because it exceeds the extraction limit. If you want to export table data for specific table partitions, you can use a partition decorator to identify the partitions to export.
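For example, the following sketch exports a single partition by appending a partition decorator to the table name with the Python client library. The table, partition date, and bucket path are placeholder assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

# The `$20250101` decorator limits the extract job to one daily partition,
# which keeps the exported bytes far below the full table size.
source = "my-project.my_dataset.my_partitioned_table$20250101"
destination_uri = "gs://my-bucket/exports/20250101/part-*.csv"

extract_job = client.extract_table(source, destination_uri)
extract_job.result()  # waits for the export to finish
```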
If you would like to gather usage data about exports over recent days, you can try the following:
View the quotas for your project with filter criteria such as Name: Extract bytes per day or Metric: bigquery.googleapis.com/quota/extract/bytes, along with the Show usage chart, to see your usage trend over a few days.

Alternatively, you can query INFORMATION_SCHEMA.JOBS_BY_PROJECT to see your total extract bytes over a few days. For example, the following query returns the daily total bytes processed by EXTRACT jobs in the past seven days:

```sql
SELECT
  TIMESTAMP_TRUNC(creation_time, DAY) AS day,
  SUM(total_bytes_processed) / POW(1024, 3) AS total_gibibytes_processed
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE
  creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY) AND CURRENT_TIMESTAMP()
  AND job_type = "EXTRACT"
GROUP BY 1
ORDER BY 2 DESC
```
You can then further refine the results by identifying the specific jobs that are consuming more bytes than expected. The following example returns the top 100 EXTRACT jobs that are consuming more than 100 GB processed over the past seven days:

```sql
SELECT
  creation_time,
  job_id,
  total_bytes_processed / POW(1024, 3) AS total_gigabytes_processed
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE
  creation_time BETWEEN TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY) AND CURRENT_TIMESTAMP()
  AND job_type = "EXTRACT"
  AND total_bytes_processed > (POW(1024, 3) * 100)
ORDER BY total_bytes_processed DESC
LIMIT 100
```
You can alternatively use the jobs explorer with filters like Bytes processed more than to filter for high-processing jobs for a specified period of time.
Resolution
One method of resolving this quota error is to create a slot reservation and assign your project to the reservation with the PIPELINE job type. This method can bypass the limit check because it uses your dedicated reservation rather than a free shared slot pool. If needed, you can delete the reservation later if you want to go back to using a shared slot pool.
For alternative approaches that allow exporting more than 50 TiB, see the notes section in Extract jobs.
Maximum tabledata.list bytes per second per project quota errors
BigQuery returns this error when the project number mentioned in the error message reaches the maximum size of data that can be read through the tabledata.list API call in a project per second. For more information, see Maximum tabledata.list bytes per minute.
Error message
Your project:[project number] exceeded quota for tabledata.list bytes per second per project
Resolution
To resolve this error, do the following:
- In general, we recommend trying to stay under this limit, for example, by spacing out requests over a longer period with delays. If the error doesn't happen frequently, implementing retries with exponential backoff solves this issue.
- If the use case expects fast and frequent reading of a large amount of data from a table, we recommend using the BigQuery Storage Read API instead of the tabledata.list API (see the sketch at the end of this section).
- If the preceding suggestions don't work, you can request a quota increase from the Google Cloud console API dashboard by doing the following:
  - Go to the Google Cloud console API dashboard.
  - In the dashboard, filter for Quota: Tabledata list bytes per minute (default quota).
  - Select the quota and follow the instructions in Request a quota adjustment.

  It might take several days to review and process the request.
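As a sketch of the Storage Read API suggestion, the following reads a table through the Storage Read API rather than tabledata.list. It assumes the google-cloud-bigquery and google-cloud-bigquery-storage packages plus pandas are installed, and the table name is a placeholder.

```python
from google.cloud import bigquery, bigquery_storage

client = bigquery.Client()
read_client = bigquery_storage.BigQueryReadClient()

table_id = "my-project.my_dataset.my_large_table"  # placeholder table

# RowIterator.to_dataframe() streams rows through the Storage Read API when a
# BigQueryReadClient is supplied, avoiding tabledata.list byte limits.
df = client.list_rows(table_id).to_dataframe(bqstorage_client=read_client)
print(len(df))
```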
Maximum number of API requests limit errors
BigQuery returns this error when you reach the rate limit for the number of API requests to a BigQuery API per user per method, for example, the tables.get method calls from a service account, or the jobs.insert method calls from a different user email. For more information, see the Maximum number of API requests per second per user per method rate limit.
Error message
Quota exceeded: Your user_method exceeded quota for concurrent api requests per user per method.
When you encounter this error, diagnose the issue and then follow the recommended steps to resolve it.
Diagnosis
If you have not identified the method that has reached this rate limit, do the following:
For service accounts

- Go to the project that hosts the service account.
- In the Google Cloud console, go to the API Dashboard.

  For instructions on how to view the detailed usage information of an API, see Using the API Dashboard.
- In the API Dashboard, select BigQuery API.
- To view more detailed usage information, select Metrics, and then do the following:
  - For Select Graphs, select Traffic by API method.
  - Filter the chart by the service account's credentials. You might see spikes for a method in the time range where you noticed the error.
For API calls
Some API calls log errors in BigQuery audit logs in Cloud Logging. To identify the method that reached the limit, do the following:

In the Google Cloud console, go to the Google Cloud navigation menu and then select Logging > Logs Explorer for your project.
Filter logs by running the following query:
resource.type="bigquery_resource" protoPayload.authenticationInfo.principalEmail="<user email or service account>" "Too many API requests per user per method for this user_method" In the log entry, you can find the method name under the property protoPayload.method_name.
For more information, see BigQuery audit logs overview.
Resolution
To resolve this quota error, do the following:
Reduce the number of API requests or add a delay between multiple API requests so that the number of requests stays under this limit.

If the limit is only exceeded occasionally, you can implement retries on this specific error with exponential backoff.

If you frequently insert data, consider using the BigQuery Storage Write API. Streaming data with this API isn't affected by the BigQuery API quota.
While loading data to BigQuery using Dataflow with the BigQuery I/O connector, you might encounter this error for the tables.get method. To resolve this issue, do the following:

- Set the destination table's create disposition to CREATE_NEVER. For more information, see Create disposition (a minimal sketch follows at the end of this section).
- Use the Apache Beam SDK version 2.24.0 or higher. In previous versions of the SDK, the CREATE_IF_NEEDED disposition calls the tables.get method to check whether the table exists.

For additional quota, see Request a quota increase. Requesting a quota increase might take several days to process. To provide more information for your request, we recommend that your request include the priority of the job, the user running the query, and the affected method.
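The following is a minimal Apache Beam (Python SDK) sketch of the create-disposition suggestion. The table, input rows, and temporary Cloud Storage location are placeholder assumptions, and the rest of the pipeline and its options are omitted.

```python
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "CreateRows" >> beam.Create([{"id": 1, "name": "example"}])
        | "WriteToBQ" >> beam.io.WriteToBigQuery(
            table="my-project:my_dataset.my_table",  # placeholder; table must already exist
            # CREATE_NEVER avoids the tables.get existence check performed by
            # CREATE_IF_NEEDED in older SDK versions.
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            custom_gcs_temp_location="gs://my-temp-bucket/tmp",  # placeholder temp location
        )
    )
```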
Troubleshoot quotas or limits that can't be increased
You can't increase the following quotas or limits, but you can apply the suggested workarounds or best practices to mitigate them.
Query queue limit errors
You might encounter this error if a project queues more interactive or batch queries than its queue limit permits.
Error message
Quota exceeded: Your project and region exceeded quota for max number of jobs that can be queued per project.
Resolution
This limit can't be increased. To resolve this error, see the following general guidance. For high-volume interactive queries, see Avoid limits for high-volume interactive queries.
- Pause the job. If you identify a process or pipeline responsible for an increase in queries, then pause that process or pipeline.
- Distribute run times. Distribute the load across a larger timeframe. If your reporting solution needs to run many queries, try to introduce some randomness for when queries start. For example, don't start all reports at the same time.
- Optimize queries and the data model. Often, a query can be rewritten so that it runs more efficiently.

  For example, if your query contains a common table expression (CTE), that is, a WITH clause, that is referenced in more than one place in the query, then this computation is done multiple times. It is better to persist calculations done by the CTE in a temporary table, and then reference it in the query.

  Multiple joins can also be a source of inefficiency. In this case, you might want to consider using nested and repeated columns. Doing so often improves locality of the data, eliminates the need for some joins, and overall reduces resource consumption and the query runtime.

  Optimizing queries makes them cheaper, so when you use capacity-based pricing, you can run more queries with your slots. For more information, see Introduction to optimizing query performance.
- Optimize the query model. BigQuery is not a relational database. It is not optimized for an infinite number of small queries. Running a large number of small queries quickly depletes your quotas. Such queries don't run as efficiently as they do on smaller database products. BigQuery is a large data warehouse, and this is its primary use case. It performs best with analytical queries over large amounts of data.

  Use the recommendations in the following list to optimize the query model to avoid reaching the query queue limit.
  - Persist data (saved tables). Pre-process the data in BigQuery and store it in additional tables. For example, if you execute many similar, computationally intensive queries using different WHERE conditions, then their results are not cached. Such queries also consume resources each time they run. You can improve the performance of such queries and decrease their processing time by pre-computing the data and storing it in a table. This pre-computed data in the table can be queried by SELECT queries. Pre-computation can often be done during ingestion within the ETL process, or by using scheduled queries or materialized views.
  - Use dry run mode. Run queries in dry run mode, which estimates the number of bytes read but does not actually process the query (see the sketch at the end of this section).
  - Preview table data. To experiment with or explore data rather than running queries, preview table data with the table preview capability in BigQuery.
  - Use cached query results. All query results, including both interactive and batch queries, are cached in temporary tables for approximately 24 hours with some exceptions. While running a cached query does still count against your concurrent query limit, queries that use cached results are significantly faster than queries that don't, because BigQuery does not need to compute the result set.
  - Use BigQuery BI Engine. If you have encountered this error while using a business intelligence (BI) tool to create dashboards that query data in BigQuery, then we recommend that you use BigQuery BI Engine, which is optimal for this use case.
- Use jobs with batch priority. You can queue more batch queries than interactive queries.
- Distribute queries. Organize and distribute the load across different projects as informed by the nature of your queries and your business needs.
- Increase slots in your reservation. Increase slots or, if you have a high-demand workload, switch from on-demand (pay-per-query model) to reservations (capacity-based model).
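For example, a minimal dry-run sketch with the Python client library (the query and table are placeholders): the job reports estimated bytes without actually running the query or counting against the queue.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)

# A dry-run job validates the query and estimates bytes scanned, but runs nothing.
job = client.query(
    "SELECT name FROM `my-project.my_dataset.my_table`",
    job_config=job_config,
)
print(f"This query would process {job.total_bytes_processed} bytes.")
```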
Avoid limits for high-volume interactive queries
Running high-volume interactive queries can cause those queries to reach the limit for the maximum number of queued interactive queries. This limit can't be increased.
If you run high-volume interactive queries, especially in scenarios that involve automated triggers like Cloud Run functions, first monitor the behavior of the Cloud Run function causing the error and stop it, and then use one of the following recommended strategies to avoid this limit:
- Increase slots in your reservation. If you have a high-demand workload, switch from on-demand (pay-per-query model) to reservations (capacity-based model).
- Spread the workload across interactive queries.
- Because you can queue more batch queries than interactive queries, use batch priority jobs instead of interactive queries.
Shuffle size limit errors
BigQuery returns this error when your project exceeds the maximum disk and memory size limit available for shuffle operations.

This quota is computed per reservation and sliced across projects for the reservation. The quota cannot be modified by Cloud Customer Care. You can learn more about your usage by querying the INFORMATION_SCHEMA.JOBS_TIMELINE view.
Error message
You receive one of the following error messages:
Quota exceeded: Your project exceeded quota for total shuffle size limit.
Resources exceeded: Your project or organization exceeded the maximum disk and memory limit available for shuffle operations. Consider provisioning more slots, reducing query concurrency, or using more efficient logic in this job.
Resolution
To resolve this error, do the following:
- Increase your reservation.
- Optimize queries.
- Reduce the concurrency of queries, or materialize intermediate results to reduce dependence on resources. For more information, see Query queues and Create materialized views.
- Review detailed steps for mitigating insufficient shuffle quota.
Number of partition modifications for column-partitioned tables quota errors
BigQuery returns this error when your column-partitioned table reaches the quota of the number of partition modifications permitted per day. Partition modifications include the total of all load jobs, copy jobs, and query jobs that append or overwrite a destination partition.

To see the value of the Number of partition modifications per column-partitioned table per day limit, see Partitioned tables.
Error message
Quota exceeded: Your table exceeded quota for Number of partition modifications to a column partitioned table
Resolution
This quota cannot be increased. To resolve this quota error, do the following:
- Change the partitioning on the table to have more data in each partition, in order to decrease the total number of partitions. For example, change from partitioning by day to partitioning by month, or change how you partition the table.
- Use clustering instead of partitioning.
- If you frequently load data from multiple small files stored in Cloud Storage using a job per file, then combine multiple load jobs into a single job (see the sketch at the end of this list). You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by using wildcards (for example, gs://my_path/*). For more information, see Batch loading data.
- If you use load, select, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one job. BigQuery doesn't perform well when used as a relational database. As a best practice, avoid running frequent, single-row append actions.
- To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion. The BigQuery Storage Write API has robust features, including exactly-once delivery semantics. To learn about limits and quotas, see Storage Write API, and to see costs of using this API, see BigQuery data ingestion pricing.
- To monitor the number of modified partitions on a table, use the INFORMATION_SCHEMA view.
- For information about optimizing table load jobs to avoid reaching quota limits, see Optimize load jobs.
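As a sketch of combining many small files into one load job with the Python client library (placeholder bucket, table, and schema assumptions; the source format is assumed to be CSV):

```python
from google.cloud import bigquery

client = bigquery.Client()

table_id = "my-project.my_dataset.my_table"  # placeholder destination

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
)

# One load job covers many files: pass several URIs or use a wildcard, instead
# of issuing one job (and one partition modification) per file.
uris = ["gs://my_path/file_1", "gs://my_path/file_2"]  # or ["gs://my_path/*"]
load_job = client.load_table_from_uri(uris, table_id, job_config=job_config)
load_job.result()
```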
Maximum rate of table metadata update operations limit errors
BigQuery returns this error when your table reaches the limit for the maximum rate of table metadata update operations per table for standard tables. Table operations include the combined total of all load jobs, copy jobs, and query jobs that append to or overwrite a destination table, or that use a DML DELETE, INSERT, MERGE, TRUNCATE TABLE, or UPDATE to write data to a table.

To see the value of the Maximum rate of table metadata update operations per table limit, see Standard tables.
Error message
Exceeded rate limits: too many table update operations for this table
When you encounter this error, diagnose the issue, and then follow the recommended steps to resolve it.
Diagnosis
Table metadata updates can originate from API calls that modify a table's metadata or from jobs that modify a table's content. If you have not identified the source of most update operations to a table's metadata, do the following:
Identify API calls
Go to the Google Cloud navigation menu, and then select Logging > Logs Explorer.
Filter logs to view table operations by running the following query:
resource.type="bigquery_dataset"protoPayload.resourceName="projects/my-project-id/datasets/my_dataset/tables/my_table"(protoPayload.methodName="google.cloud.bigquery.v2.TableService.PatchTable" ORprotoPayload.methodName="google.cloud.bigquery.v2.TableService.UpdateTable" ORprotoPayload.methodName="google.cloud.bigquery.v2.TableService.InsertTable")
Identify jobs
The following query returns a list of jobs that modify the affected table in the project within the past day. If you expect multiple projects in an organization to write to the table, replace JOBS_BY_PROJECT with JOBS_BY_ORGANIZATION.

```sql
SELECT
  job_id,
  user_email,
  query
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE
  creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
  AND destination_table.project_id = "my-project-id"
  AND destination_table.dataset_id = "my_dataset"
  AND destination_table.table_id = "my_table"
```
For more information, see BigQuery audit logs overview.
Resolution
This quota cannot be increased. To resolve this quota error, do the following:
- Reduce the update rate for the table metadata.
- Add a delay between jobs or table operations to make sure that the update rate is within the limit.
- For data inserts or modifications, consider using DML operations. DML operations are not affected by the Maximum rate of table metadata update operations per table rate limit.

  DML operations have other limits and quotas. For more information, see Using data manipulation language (DML).
- If you frequently load data from multiple small files stored in Cloud Storage using a job per file, then combine multiple load jobs into a single job. You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by using wildcards (for example, gs://my_path/*). For more information, see Batch loading data.
- If you use load, select, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one job. BigQuery doesn't perform well when used as a relational database. As a best practice, avoid running frequent, single-row append actions.
- To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion. The BigQuery Storage Write API has robust features, including exactly-once delivery semantics. To learn about limits and quotas, see Storage Write API, and to see costs of using this API, see BigQuery data ingestion pricing.
- To monitor the number of modified partitions on a table, use the INFORMATION_SCHEMA view.
- For information about optimizing table load jobs to avoid reaching quota limits, see Optimize load jobs.
Table imports or query appends quota errors
BigQuery returns this error message when your table reaches the limit for table operations per day for standard tables. Table operations include the combined total of all load jobs, copy jobs, and query jobs that append or overwrite a destination table.

To see the value of the Table operations per day limit, see Standard tables.
Error message
Your table exceeded quota for imports or query appends per table
When you encounter this error, diagnose the issue, and then follow the recommended steps to resolve it.
Diagnosis
If you have not identified the source of most table operations, do the following:
Make a note of the project, dataset, and table that the failed query, load, or copy job is writing to.
Use INFORMATION_SCHEMA.JOBS_BY_* tables to learn more about jobs that modify the table.

The following example finds the hourly count of jobs grouped by job type for the last 24-hour period using JOBS_BY_PROJECT. If you expect multiple projects to write to the table, replace JOBS_BY_PROJECT with JOBS_BY_ORGANIZATION.

```sql
SELECT
  TIMESTAMP_TRUNC(creation_time, HOUR),
  job_type,
  count(1)
FROM
  `region-REGION_NAME`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE
  creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP, INTERVAL 1 DAY)
  AND destination_table.project_id = "my-project-id"
  AND destination_table.dataset_id = "my_dataset"
  AND destination_table.table_id = "my_table"
GROUP BY 1, 2
ORDER BY 1 DESC
```
Resolution
This quota cannot be increased. To resolve this quota error, do the following:
- If you frequently load data from multiple small files stored in Cloud Storage using a job per file, then combine multiple load jobs into a single job. You can load from multiple Cloud Storage URIs with a comma-separated list (for example, gs://my_path/file_1,gs://my_path/file_2), or by using wildcards (for example, gs://my_path/*). For more information, see Batch loading data.
- If you use load, select, or copy jobs to append single rows of data to a table, consider batching multiple jobs into one job. BigQuery doesn't perform well when used as a relational database. As a best practice, avoid running frequent, single-row append actions.
- To append data at a high rate, consider using the BigQuery Storage Write API. It is a recommended solution for high-performance data ingestion. The BigQuery Storage Write API has robust features, including exactly-once delivery semantics. To learn about limits and quotas, see Storage Write API, and to see costs of using this API, see BigQuery data ingestion pricing.
- To monitor the number of modified partitions on a table, use the INFORMATION_SCHEMA view.
- For information about optimizing table load jobs to avoid reaching quota limits, see Optimize load jobs.
Quota exceeded for quota metric 'Authorized Views per dataset' and limit '2500'.
If a dataset's access control list exceeds the combined limit for authorized resources, BigQuery returns this error message.
Error message
Quota exceeded for quota metric 'Authorized Views per dataset' and limit '2500'.
Resolution
A dataset's access control list can have up to 2,500 total authorized resources, including authorized views, authorized datasets, and authorized functions. If you exceed this limit due to a large number of authorized views, consider grouping the views into authorized datasets. As a best practice, group related views into authorized datasets when you design new BigQuery architectures, especially multi-tenant architectures.
Too many DML statements outstanding against table
This error means that the number of concurrent mutating DML statements (UPDATE, DELETE, MERGE) running against the same table has exceeded the data manipulation language (DML) quota limit. This quota limit is per table and applies to mutating DML statements only, which don't include INSERT.
Resolution
Batch the DML jobs by following Best practices for DML statements.
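A minimal sketch of the batching idea, assuming mutations are first staged in a separate table (all names are placeholders): instead of many concurrent single-row UPDATE statements, one MERGE applies them together.

```python
from google.cloud import bigquery

client = bigquery.Client()

# One MERGE replaces many concurrent single-row mutations against the same
# target table, keeping concurrent mutating DML well under the per-table limit.
merge_sql = """
MERGE `my-project.my_dataset.target` AS t
USING `my-project.my_dataset.staged_updates` AS s
ON t.id = s.id
WHEN MATCHED THEN
  UPDATE SET t.value = s.value
WHEN NOT MATCHED THEN
  INSERT (id, value) VALUES (s.id, s.value)
"""

client.query(merge_sql).result()
```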
Loading CSV files quota errors
If you load a large CSV file using the bq load command with the --allow_quoted_newlines flag, you might encounter this error.
Error message
Input CSV files are not splittable and at least one of the files is larger than the maximum allowed size. Size is: ...

Resolution
To resolve this quota error, do the following:
- Set the --allow_quoted_newlines flag to false.
- Split the CSV file into smaller chunks that are each less than 4 GB.
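For example, a load-job configuration sketch in the Python client library with quoted newlines disabled. The bucket, table, and schema autodetection are placeholder assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    # With quoted newlines disabled, BigQuery can split large CSV files, so the
    # size limit for non-splittable files no longer applies.
    allow_quoted_newlines=False,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/large_file.csv",   # placeholder source
    "my-project.my_dataset.my_table",  # placeholder destination
    job_config=job_config,
)
load_job.result()
```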
For more information about limits that apply when you load data into BigQuery, see Load jobs.
Your user exceeded quota for concurrent project.lists requests
This error occurs when Microsoft Power BI jobs that communicate with BigQuery through a Simba ODBC driver or DataHub fail because they exceed the project.list API limit. To resolve this issue, use the short-term or long-term workarounds described in this section.
Error message
Your user exceeded quota for concurrent project.lists requests

Diagnosis
This error occurs during the connection and discovery phase for Power BI, when a Power BI report refreshes and the Simba driver establishes a connection to a specific BigQuery project.
Short-term resolution
To address this issue in the short term, use the following workarounds, which are ordered from most to least effective. Implement fix three or four, depending on whether you connect to BigQuery using the Simba driver or using DataHub.
For long-term fixes, see Long-term resolution.
1. Stagger report refreshes. If you can't modify the DSN, mitigate the quota issue by reducing the number of simultaneous requests. Instead of having all reports refresh simultaneously (for example, at 9:00 AM), stagger their schedules by a few minutes (for example, at 9:01 AM, 9:03 AM, and 9:05 AM). This practice spreads out API calls over time, making it less likely that you reach the concurrent limit.
2. Implement retries in Power BI. This reactive strategy helps a report recover from a temporary failure. Power BI has built-in retry logic for data refresh failures. While this practice doesn't prevent the quota error, it makes your pipeline more resilient by allowing a report to succeed on a subsequent attempt after the initial spike in API calls subsides. To implement this fix, do the following:
   - In the Power BI service, go to Settings for your dataset.
   - Expand the Scheduled refresh section. Under Retry, configure Power BI to automatically rerun a failed refresh.
3. For earlier versions of the Simba driver, specify the project ID in the ODBC connection. This action prevents the driver from performing the projects.list discovery call. Instead, the driver connects directly to the specified project, which prevents unnecessary API calls and resolves the quota issue.

   Newer drivers fail immediately if the project isn't specified, with a message similar to Unable to establish connection with data source. Missing settings: {[Catalog]}.

   To make this fix, do the following:

   - On the machine running the Power BI Gateway or Power BI Desktop, open the ODBC Data Sources (64-bit) application.
   - On the main setup screen for the Simba ODBC driver for BigQuery, fill the Catalog (Project) field with your specific Google Cloud project ID, for example, my-gcp-project-id.
4. For earlier versions of DataHub, specify the project ID in the DataHub ingestion configuration. Make this fix if you are using DataHub instead of the Simba driver. Similar to Simba, later versions of DataHub require you to specify the project ID or they won't connect to BigQuery.

   To avoid exceeding DataHub limits, modify your DataHub ingestion configuration to provide an explicit list of project IDs to scan. This prevents the DataHub configuration from finding all projects that the service account can see.
   In your BigQuery source recipe file (typically a YAML file), use the project_ids configuration to enumerate the projects that you want to ingest. Then, redeploy the DataHub ingestion recipe with the new configuration. See the following example and this longer example provided by DataHub.

   The following is an example of a DataHub configuration snippet:

```yaml
source:
  type: "bigquery"
  config:
    # Instead of relying on discovery, explicitly list the projects.
    # This avoids the problematic projects.list() API call.
    project_ids:
      - "YOUR_PRODUCTION_PROJECT_ID"
      - "YOUR_ANALYTICS_PROJECT_ID"
      - "ANOTHER_BQ_PROJECT"
```
Long-term resolution
The best long-term fix for this error message is to create separate, dedicated Google Cloud service accounts for each function. For example, create a service account for all Power BI reports and a service account for DataHub ingestion.

This best practice isolates API usage into separate quota buckets and prevents a high-load job in DataHub from causing critical business reports in Power BI to fail.

Use the action plan in the following sections to resolve long-term quota errors across Power BI and DataHub.
Phase 1: Preparation
- Inform owners of the Power BI gateways and DataHub configuration that you will make coordinated changes to resolve ongoing job failures.
- In the Google Cloud console, create two new service accounts, for example, sa-powerbi-gateway@... and sa-datahub-ingestion@....
- Create service account keys for the Power BI and DataHub service accounts.
- Grant each new service account least-privilege permissions by assigning the Identity and Access Management (IAM) roles that it needs to perform its tasks, as described in the following section. Avoid assigning overly broad roles, for example, Project Editor.
Required roles
The service account for Power BI runs queries and reads data from tables. Grant the following roles to the service account in each Google Cloud project that contains data that Power BI must access. To learn more about these roles, see BigQuery roles.
- BigQuery Data Viewer: provides read-only access to datasets, tables, and views.
- BigQuery Job User: provides permissions to run jobs, including queries, which is essential for Power BI to execute its requests.
The service account for DataHub ingestion only needs to read metadata, such as table names, schemas, and descriptions, not the data within the tables. Grant the following role at the project level for each project that DataHub scans. To learn more about these roles, see IAM roles for BigQuery.

- BigQuery Metadata Viewer: this role is designed specifically to read metadata. It grants permissions to list datasets and tables and view their metadata without granting access to the underlying data.
Phase 2: Coordinated rollout
During a low-usage period, the Power BI administrator updates the ODBC DSN configurations on the gateway machines by performing the following steps:
- Changes the authentication method to use the new sa-powerbi-gateway@... service account key created in a previous step.
- If not already performed as a short-term fix, enters the Google Cloud project ID in the Catalog (Project) field of the ODBC driver.
- Has the DataHub owner update the BigQuery source configuration YAML file so that it:
  - Points to the new sa-datahub-ingestion@... service account key created in a previous step.
  - If not already performed as a short-term fix, uses the project_ids parameter to explicitly list the projects to be scanned.
- Has the DataHub owner redeploy the DataHub ingestion recipe with the new configuration.
Phase 3: Verification and monitoring
To verify and monitor the effects of the fixes, the Power BI and DataHub administrators perform the following steps:

- Manually trigger a refresh for a few key Power BI reports and a new ingestion run in DataHub. Confirm that these jobs complete successfully without incurring quota errors.
- In the Google Cloud console, navigate to the IAM & Admin > Quotas page.
- Filter for the BigQuery API service.
- Find the quota named Concurrent project.lists requests and click the graph icon to view usage over time.

Administrators should see a dramatic and permanent drop in the usage of this specific API call, confirming that the fix was successful.