Work with remote functions
A BigQuery remote function lets you implement your function in languages other than SQL and JavaScript, or with libraries and services that aren't allowed in BigQuery user-defined functions.
Overview
A BigQuery remote function lets you incorporate GoogleSQL functionality with software outside of BigQuery by providing a direct integration with Cloud Run functions and Cloud Run. With BigQuery remote functions, you can deploy your functions in Cloud Run functions or Cloud Run implemented with any supported language, and then invoke them from GoogleSQL queries.
Workflow
- Create the HTTP endpoint in Cloud Run functions or Cloud Run.
- Create a remote function in BigQuery:
  - Create a connection of type CLOUD_RESOURCE.
  - Create a remote function.
- Use the remote function in a query just like any other user-defined function.
Limitations
Remote functions only support one of the following data types as argument type or return type:
- Boolean
- Bytes
- Numeric
- String
- Date
- Datetime
- Time
- Timestamp
- JSON
Remote functions do not support ARRAY, STRUCT, INTERVAL, or GEOGRAPHY types.

You cannot create table-valued remote functions.
You cannot use remote functions when creating materialized views.
The return value of a remote function is always assumed to be non-deterministic, so the result of a query calling a remote function is not cached.
You might see repeated requests with the same data to your endpoint, even after successful responses, due to transient network errors or BigQuery internal errors.
When a remote function evaluation is skipped for some rows due to short-circuiting, for example, in conditional expressions or a MERGE statement with WHEN [NOT] MATCHED, batching is not used with the remote function. In this case, the calls field in the HTTP request body has exactly one element.

If the dataset associated with the remote function is replicated to a destination region through cross-region dataset replication, the remote function can only be queried in the region that it was created in.
Create an endpoint
To create a remote function that can implement business logic, you must create an HTTP endpoint by using either Cloud Run functions or Cloud Run. The endpoint must be able to process a batch of rows in a single HTTP POST request and return the results for the batch as an HTTP response.
If you are creating the remote function by using BigQuery DataFrames, you don't have to manually create the HTTP endpoint; the service does that for you automatically.
See the Cloud Run functions tutorial and other Cloud Run functions documentation for how to write, deploy, test, and maintain a Cloud Run function.

See the Cloud Run quickstart and other Cloud Run documentation for how to write, deploy, test, and maintain a Cloud Run service.
It's recommended that you keep the default authentication instead of allowing unauthenticated invocation of your Cloud Run function or Cloud Run service.
Input format
BigQuery sends HTTP POST requests with a JSON body in the following format:
| Field name | Description | Field type |
|---|---|---|
| requestId | ID of the request. Unique over multiple requests sent to this endpoint in a GoogleSQL query. | Always provided. String. |
| caller | Job full resource name for the GoogleSQL query calling the remote function. | Always provided. String. |
| sessionUser | Email of the user executing the GoogleSQL query. | Always provided. String. |
| userDefinedContext | The user defined context that was used when creating the remote function in BigQuery. | Optional. A JSON object with key-value pairs. |
| calls | A batch of input data. | Always provided. A JSON array. Each element itself is a JSON array, which is a JSON encoded argument list of one remote function call. |
An example of a request:
```json
{
  "requestId": "124ab1c",
  "caller": "//bigquery.googleapis.com/projects/myproject/jobs/myproject:US.bquxjob_5b4c112c_17961fafeaf",
  "sessionUser": "test-user@test-company.com",
  "userDefinedContext": {
    "key1": "value1",
    "key2": "v2"
  },
  "calls": [
    [null, 1, "", "abc"],
    ["abc", "9007199254740993", null, null]
  ]
}
```

Output format
BigQuery expects the endpoint to return an HTTP response in the following format; otherwise, BigQuery can't consume it and the query calling the remote function fails.
| Field name | Description | Value Range |
|---|---|---|
| replies | A batch of return values. | Required for a successful response. A JSON array. Each element corresponds to a JSON encoded return value of the external function. The size of the array must match the size of the JSON array of calls in the HTTP request. |
| errorMessage | Error message when an HTTP response code other than 200 is returned. For non-retryable errors, we return this as part of the BigQuery job's error message to the user. | Optional. String. Size should be less than 1KB. |
An example of a successful response:
```json
{
  "replies": [ 1, 0 ]
}
```

An example of a failed response:
```json
{
  "errorMessage": "Received but not expected that the argument 0 be null."
}
```

HTTP response code
Your endpoint should return the HTTP response code 200 for a successful response. When BigQuery receives any other value, BigQuery considers the response a failure, and retries when the HTTP response code is 408, 429, 500, 503, or 504, up to some internal limit.
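For illustration, the following minimal sketch (in the same Flask style as the samples later on this page) separates a retryable overload condition from a non-retryable input error. The capacity limit and messages are hypothetical, not part of the remote function contract:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical capacity limit for this illustration only.
MAX_ROWS_PER_REQUEST = 1000

@app.route("/", methods=["POST"])
def handler():
    body = request.get_json()
    calls = body.get("calls", [])

    if len(calls) > MAX_ROWS_PER_REQUEST:
        # 429 is one of the codes BigQuery retries, so a transient
        # overload doesn't immediately fail the query.
        return jsonify({"errorMessage": "temporarily overloaded, please retry"}), 429

    try:
        replies = [sum(x for x in call if x is not None) for call in calls]
        return jsonify({"replies": replies}), 200
    except Exception as e:
        # 400 is not retried, so a permanent input error surfaces in the
        # BigQuery job's error message right away.
        return jsonify({"errorMessage": str(e)}), 400
```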
JSON encoding of SQL data type
JSON encoding in the HTTP request and response follows the existing BigQuery JSON encoding for the TO_JSON_STRING function.
Sample Cloud Run function code
The following sample Python code implements adding all the integer arguments of the remote function. It handles a request with the arguments for batched invocations and returns all the results in a response.
```python
import functions_framework

from flask import jsonify

# Max INT64 value encoded as a number in JSON by TO_JSON_STRING. Larger values are encoded as
# strings.
# See https://cloud.google.com/bigquery/docs/reference/standard-sql/json_functions#json_encodings
_MAX_LOSSLESS = 9007199254740992


@functions_framework.http
def batch_add(request):
    try:
        return_value = []
        request_json = request.get_json()
        calls = request_json['calls']
        for call in calls:
            return_value.append(sum([int(x) if isinstance(x, str) else x for x in call if x is not None]))
        replies = [str(x) if x > _MAX_LOSSLESS or x < -_MAX_LOSSLESS else x for x in return_value]
        return_json = jsonify({"replies": replies})
        return return_json
    except Exception as e:
        return jsonify({"errorMessage": str(e)}), 400
```

Assuming that the function is deployed in the project my_gcf_project in region us-east1 as the function name remote_add, it can be accessed via the endpoint https://us-east1-my_gcf_project.cloudfunctions.net/remote_add.
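To sanity-check the sample before wiring it into BigQuery, you can run it locally with the Functions Framework and post a request in the documented input format. The following sketch assumes the function was started locally (for example, with functions-framework --target batch_add) and is listening on port 8080, and that the requests library is installed; the request values are made up for the test. A deployed, authenticated endpoint would additionally require an identity token in the Authorization header.

```python
import requests

# Hypothetical local endpoint started with the Functions Framework.
url = "http://localhost:8080"

body = {
    "requestId": "local-test-1",
    "caller": "manual-test",
    "sessionUser": "test-user@example.com",
    # Each inner array is one argument list; values above 2^53 arrive as strings.
    "calls": [[1, 2], [3, None], ["9007199254740993", 1]],
}

response = requests.post(url, json=body)
print(response.status_code)
# Expect something like {"replies": [3, 3, "9007199254740994"]} from the sample above.
print(response.json())
```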
Sample Cloud Run code
The following sample Python code implements a web service, which can be built and deployed to Cloud Run for the same functionality.
```python
import os

from flask import Flask, request, jsonify

# Max INT64 value encoded as a number in JSON by TO_JSON_STRING. Larger values are encoded as
# strings.
# See https://cloud.google.com/bigquery/docs/reference/standard-sql/json_functions#json_encodings
_MAX_LOSSLESS = 9007199254740992

app = Flask(__name__)


@app.route("/", methods=['POST'])
def batch_add():
    try:
        return_value = []
        request_json = request.get_json()
        calls = request_json['calls']
        for call in calls:
            return_value.append(sum([int(x) if isinstance(x, str) else x for x in call if x is not None]))
        replies = [str(x) if x > _MAX_LOSSLESS or x < -_MAX_LOSSLESS else x for x in return_value]
        return jsonify({"replies": replies})
    except Exception as e:
        return jsonify({"errorMessage": str(e)}), 400


if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```

See the guide on how to build and deploy the code.
Assuming that the Cloud Run service is deployed in the project my_gcf_project in region us-east1 as the service name remote_add, it can be accessed via the endpoint https://remote_add-<project_id_hash>-ue.a.run.app.
Create a remote function
BigQuery uses a CLOUD_RESOURCE connection to interact with your Cloud Run function. In order to create a remote function, you must create a CLOUD_RESOURCE connection. If you are creating the remote function by using BigQuery DataFrames and you have been granted the Project IAM Admin (roles/resourcemanager.projectIamAdmin) role, then you don't have to manually create the connection and grant it access; the service does that for you automatically.
Create a connection
You must have a Cloud resource connection to connect to Cloud Run functions and Cloud Run.
You can skip this step if you either have a default connection configured, oryou have the BigQuery Admin role.
Select one of the following options:

Console
Go to the BigQuery page.

In the left pane, click Explorer.

If you don't see the left pane, click Expand left pane to open the pane.
In the Explorer pane, expand your project name, and then click Connections.

On the Connections page, click Create connection.

For Connection type, choose Vertex AI remote models, remote functions, BigLake and Spanner (Cloud Resource).

In the Connection ID field, enter a name for your connection.

For Location type, select a location for your connection. The connection should be colocated with your other resources such as datasets.

Click Create connection.

Click Go to connection.

In the Connection info pane, copy the service account ID for use in a later step.
bq
In a command-line environment, create a connection:
```sh
bq mk --connection --location=REGION --project_id=PROJECT_ID \
    --connection_type=CLOUD_RESOURCE CONNECTION_ID
```

The --project_id parameter overrides the default project.

Replace the following:

- REGION: your connection region
- PROJECT_ID: your Google Cloud project ID
- CONNECTION_ID: an ID for your connection
When you create a connection resource, BigQuery creates a unique system service account and associates it with the connection.
Troubleshooting: If you get the following connection error, update the Google Cloud SDK:
Flags parsing error: flag --connection_type=CLOUD_RESOURCE: value should be one of...
Retrieve and copy the service account ID for use in a later step:

```sh
bq show --connection PROJECT_ID.REGION.CONNECTION_ID
```
The output is similar to the following:
```
name                          properties
1234.REGION.CONNECTION_ID     {"serviceAccountId": "connection-1234-9u56h9@gcp-sa-bigquery-condel.iam.gserviceaccount.com"}
```
Python
Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
```python
import google.api_core.exceptions
from google.cloud import bigquery_connection_v1

client = bigquery_connection_v1.ConnectionServiceClient()


def create_connection(
    project_id: str,
    location: str,
    connection_id: str,
):
    """Creates a BigQuery connection to a Cloud Resource.

    Cloud Resource connection creates a service account which can then be
    granted access to other Google Cloud resources for federated queries.

    Args:
        project_id: The Google Cloud project ID.
        location: The location of the connection (for example, "us-central1").
        connection_id: The ID of the connection to create.
    """
    parent = client.common_location_path(project_id, location)
    connection = bigquery_connection_v1.Connection(
        friendly_name="Example Connection",
        description="A sample connection for a Cloud Resource.",
        cloud_resource=bigquery_connection_v1.CloudResourceProperties(),
    )
    try:
        created_connection = client.create_connection(
            parent=parent, connection_id=connection_id, connection=connection
        )
        print(f"Successfully created connection: {created_connection.name}")
        print(f"Friendly name: {created_connection.friendly_name}")
        print(f"Service Account: {created_connection.cloud_resource.service_account_id}")
    except google.api_core.exceptions.AlreadyExists:
        print(f"Connection with ID '{connection_id}' already exists.")
        print("Please use a different connection ID.")
    except Exception as e:
        print(f"An unexpected error occurred while creating the connection: {e}")
```

Node.js
Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.
```javascript
const {ConnectionServiceClient} = require('@google-cloud/bigquery-connection').v1;
const {status} = require('@grpc/grpc-js');

const client = new ConnectionServiceClient();

/**
 * Creates a new BigQuery connection to a Cloud Resource.
 *
 * A Cloud Resource connection creates a service account that can be granted access
 * to other Google Cloud resources.
 *
 * @param {string} projectId The Google Cloud project ID. for example, 'example-project-id'
 * @param {string} location The location of the project to create the connection in. for example, 'us-central1'
 * @param {string} connectionId The ID of the connection to create. for example, 'example-connection-id'
 */
async function createConnection(projectId, location, connectionId) {
  const parent = client.locationPath(projectId, location);

  const connection = {
    friendlyName: 'Example Connection',
    description: 'A sample connection for a Cloud Resource',
    // The service account for this cloudResource will be created by the API.
    // Its ID will be available in the response.
    cloudResource: {},
  };

  const request = {
    parent,
    connectionId,
    connection,
  };

  try {
    const [response] = await client.createConnection(request);
    console.log(`Successfully created connection: ${response.name}`);
    console.log(`Friendly name: ${response.friendlyName}`);
    console.log(`Service Account: ${response.cloudResource.serviceAccountId}`);
  } catch (err) {
    if (err.code === status.ALREADY_EXISTS) {
      console.log(`Connection '${connectionId}' already exists.`);
    } else {
      console.error(`Error creating connection: ${err.message}`);
    }
  }
}
```

Terraform
Use the google_bigquery_connection resource.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

The following example creates a Cloud resource connection named my_cloud_resource_connection in the US region:
```terraform
# This queries the provider for project information.
data "google_project" "default" {}

# This creates a cloud resource connection in the US region named my_cloud_resource_connection.
# Note: The cloud resource nested object has only one output field - serviceAccountId.
resource "google_bigquery_connection" "default" {
  connection_id = "my_cloud_resource_connection"
  project       = data.google_project.default.project_id
  location      = "US"

  cloud_resource {}
}
```

To apply your Terraform configuration in a Google Cloud project, complete the steps in the following sections.
Prepare Cloud Shell
- Launch Cloud Shell.
Set the default Google Cloud project where you want to apply your Terraform configurations.
You only need to run this command once per project, and you can run it in any directory.
export GOOGLE_CLOUD_PROJECT=PROJECT_ID
Environment variables are overridden if you set explicit values in the Terraform configuration file.
Prepare the directory
Each Terraform configuration file must have its own directory (also called a root module).

- In Cloud Shell, create a directory and a new file within that directory. The filename must have the .tf extension, for example main.tf. In this tutorial, the file is referred to as main.tf.

```sh
mkdir DIRECTORY && cd DIRECTORY && touch main.tf
```
If you are following a tutorial, you can copy the sample code in each section or step.
Copy the sample code into the newly created main.tf.

Optionally, copy the code from GitHub. This is recommended when the Terraform snippet is part of an end-to-end solution.
- Review and modify the sample parameters to apply to your environment.
- Save your changes.
- Initialize Terraform. You only need to do this once per directory.
terraform init
Optionally, to use the latest Google provider version, include the -upgrade option:

terraform init -upgrade
Apply the changes
- Review the configuration and verify that the resources that Terraform is going to create or update match your expectations:
terraform plan
Make corrections to the configuration as necessary.
- Apply the Terraform configuration by running the following command and entering yes at the prompt:

terraform apply
Wait until Terraform displays the "Apply complete!" message.
- Open your Google Cloud project to view the results. In the Google Cloud console, navigate to your resources in the UI to make sure that Terraform has created or updated them.
Set up access
You must give the new connection read-only access to your Cloud Run function or Cloud Run service. It is not recommended to allow unauthenticated invocation for your Cloud Run function or Cloud Run service.
To grant roles, follow these steps:
Go to the IAM & Admin page.

Click Add.

The Add principals dialog opens.

In the New principals field, enter the service account ID that you copied earlier.

In the Select a role field, select one of the following options:

- If you are using a 1st-gen Cloud Run function, choose Cloud Function, and then select the Cloud Function Invoker role.
- If you are using a 2nd-gen Cloud Run function, choose Cloud Run, and then select the Cloud Run Invoker role.
- If you are using a Cloud Run service, choose Cloud Run, and then select the Cloud Run Invoker role.

Click Save.
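If you prefer to grant the role programmatically instead of through the console, the following sketch is one possible approach for a Cloud Run service, assuming the google-cloud-run client library is installed; the project, region, service name, and service account values are placeholders, and for a 1st-gen Cloud Run function you would grant the Cloud Function Invoker role instead.

```python
from google.cloud import run_v2
from google.iam.v1 import iam_policy_pb2, policy_pb2

client = run_v2.ServicesClient()

# Placeholder values; replace with your project, region, service name, and
# the connection's service account ID copied earlier.
resource = client.service_path("PROJECT_ID", "REGION", "SERVICE_NAME")
member = "serviceAccount:CONNECTION_SERVICE_ACCOUNT_ID"

# Read the current IAM policy, add a Cloud Run Invoker binding, and write it back.
policy = client.get_iam_policy(
    request=iam_policy_pb2.GetIamPolicyRequest(resource=resource)
)
policy.bindings.append(
    policy_pb2.Binding(role="roles/run.invoker", members=[member])
)
client.set_iam_policy(
    request=iam_policy_pb2.SetIamPolicyRequest(resource=resource, policy=policy)
)
```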
Create a remote function
To create a remote function:
SQL
Run the following CREATE FUNCTION statement in BigQuery:

In the Google Cloud console, go to the BigQuery page.
In the query editor, enter the following statement:
```sql
CREATE FUNCTION `PROJECT_ID.DATASET_ID`.remote_add(x INT64, y INT64)
RETURNS INT64
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (
  endpoint = 'ENDPOINT_URL'
)
```

Replace the following:

- DATASET_ID: the ID of your BigQuery dataset.
- ENDPOINT_URL: the URL of your Cloud Run function or Cloud Run remote function endpoint.
Click Run.

For more information about how to run queries, see Run an interactive query.
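You can also run the same statement from code. The following sketch uses the google-cloud-bigquery client library with Application Default Credentials; the project, dataset, connection, and endpoint values are placeholders, as in the statement above.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder identifiers; replace with your own values before running.
ddl = """
CREATE FUNCTION `PROJECT_ID.DATASET_ID`.remote_add(x INT64, y INT64)
RETURNS INT64
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (endpoint = 'ENDPOINT_URL')
"""

# DDL statements return no rows; result() waits for the job to finish.
client.query(ddl).result()
print("Remote function created.")
```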
BigQuery DataFrames
- Enable the required APIs and make sure you have been granted the required roles, as described in the Requirements section of Remote functions.
Use the remote_function decorator:

```python
import bigframes.pandas as bpd

# Set BigQuery DataFrames options
bpd.options.bigquery.project = your_gcp_project_id
bpd.options.bigquery.location = "US"

# BigQuery DataFrames gives you the ability to turn your custom scalar
# functions into a BigQuery remote function. It requires the GCP project to
# be set up appropriately and the user having sufficient privileges to use
# them. One can find more details about the usage and the requirements via
# `help` command.
help(bpd.remote_function)

# Read a table and inspect the column of interest.
df = bpd.read_gbq("bigquery-public-data.ml_datasets.penguins")
df["body_mass_g"].head(10)

# Define a custom function, and specify the intent to turn it into a remote
# function. It requires a BigQuery connection. If the connection is not
# already created, BigQuery DataFrames will attempt to create one assuming
# the necessary APIs and IAM permissions are setup in the project. In our
# examples we will be letting the default connection `bigframes-default-connection`
# be used. We will also set `reuse=False` to make sure we don't
# step over someone else creating remote function in the same project from
# the exact same source code at the same time. Let's try a `pandas`-like use
# case in which we want to apply a user defined scalar function to every
# value in a `Series`, more specifically bucketize the `body_mass_g` value
# of the penguins, which is a real number, into a category, which is a
# string.
@bpd.remote_function(
    reuse=False,
    cloud_function_service_account="default",
)
def get_bucket(num: float) -> str:
    if not num:
        return "NA"
    boundary = 4000
    return "at_or_above_4000" if num >= boundary else "below_4000"

# Then we can apply the remote function on the `Series` of interest via
# `apply` API and store the result in a new column in the DataFrame.
df = df.assign(body_mass_bucket=df["body_mass_g"].apply(get_bucket))

# This will add a new column `body_mass_bucket` in the DataFrame. You can
# preview the original value and the bucketized value side by side.
df[["body_mass_g", "body_mass_bucket"]].head(10)

# The above operation was possible by doing all the computation on the
# cloud. For that, there is a google cloud function deployed by serializing
# the user code, and a BigQuery remote function created to call the cloud
# function via the latter's http endpoint on the data in the DataFrame.

# The BigQuery remote function created to support the BigQuery DataFrames
# remote function can be located via a property `bigframes_remote_function`
# set in the remote function object.
print(f"Created BQ remote function: {get_bucket.bigframes_remote_function}")

# The cloud function can be located via another property
# `bigframes_cloud_function` set in the remote function object.
print(f"Created cloud function: {get_bucket.bigframes_cloud_function}")

# Warning: The deployed cloud function may be visible to other users with
# sufficient privilege in the project, so the user should be careful about
# having any sensitive data in the code that will be deployed as a remote
# function.

# Let's continue trying other potential use cases of remote functions. Let's
# say we consider the `species`, `island` and `sex` of the penguins
# sensitive information and want to redact that by replacing with their hash
# code instead. Let's define another scalar custom function and decorate it
# as a remote function. The custom function in this example has external
# package dependency, which can be specified via `packages` parameter.
@bpd.remote_function(
    reuse=False,
    packages=["cryptography"],
    cloud_function_service_account="default",
)
def get_hash(input: str) -> str:
    from cryptography.fernet import Fernet

    # handle missing value
    if input is None:
        input = ""

    key = Fernet.generate_key()
    f = Fernet(key)
    return f.encrypt(input.encode()).decode()

# We can use this remote function in another `pandas`-like API `map` that
# can be applied on a DataFrame
df_redacted = df[["species", "island", "sex"]].map(get_hash)
df_redacted.head(10)
```
You need to have the bigquery.routines.create permission on the dataset where you create the remote function, and the bigquery.connections.delegate permission (available from the BigQuery Connection Admin role) on the connection that is used by the remote function.
Providing user defined context
You can specify user_defined_context in OPTIONS as a set of key-value pairs that will be part of every HTTP request to the endpoint. With user-defined context, you can create multiple remote functions that reuse a single endpoint, which provides different behaviors based on the context passed to it.
The following examples create two remote functions to encrypt and decrypt BYTES data using the same endpoint.
```sql
CREATE FUNCTION `PROJECT_ID.DATASET_ID`.encrypt(x BYTES)
RETURNS BYTES
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (
  endpoint = 'ENDPOINT_URL',
  user_defined_context = [("mode", "encryption")]
)

CREATE FUNCTION `PROJECT_ID.DATASET_ID`.decrypt(x BYTES)
RETURNS BYTES
REMOTE WITH CONNECTION `PROJECT_ID.LOCATION.CONNECTION_NAME`
OPTIONS (
  endpoint = 'ENDPOINT_URL',
  user_defined_context = [("mode", "decryption")]
)
```
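On the endpoint side, the context arrives in the userDefinedContext field of each HTTP request, so a single service can branch on it. The following minimal sketch follows the Flask style of the earlier samples; the encrypt_value and decrypt_value helpers are placeholders (base64 is used only to keep the example self-contained, not as real encryption), and the mode values match the statements above.

```python
import base64

from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder transforms for illustration only; a real service would use a
# proper encryption library and key management. BYTES arguments arrive as
# base64-encoded strings in the JSON payload.
def encrypt_value(value: str) -> str:
    return base64.b64encode(value.encode()).decode()

def decrypt_value(value: str) -> str:
    return base64.b64decode(value.encode()).decode()

@app.route("/", methods=["POST"])
def crypto_endpoint():
    body = request.get_json()
    mode = body.get("userDefinedContext", {}).get("mode")
    calls = body.get("calls", [])
    try:
        if mode == "encryption":
            replies = [encrypt_value(call[0]) for call in calls]
        elif mode == "decryption":
            replies = [decrypt_value(call[0]) for call in calls]
        else:
            return jsonify({"errorMessage": f"unknown mode: {mode}"}), 400
        return jsonify({"replies": replies})
    except Exception as e:
        return jsonify({"errorMessage": str(e)}), 400
```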
Limiting number of rows in a batch request

You can specify max_batching_rows in OPTIONS as the maximum number of rows in each HTTP request, to avoid Cloud Run functions timeout. If you specify max_batching_rows, BigQuery determines the number of rows in a batch, up to the max_batching_rows limit. If not specified, BigQuery determines the number of rows to batch automatically.
Use a remote function in a query
Make sure you have granted the permission on your Cloud Run function, so that it is accessible to BigQuery's service account associated with the connection of the remote function.

You also need to have the bigquery.routines.get permission on the dataset where the remote function is, and the bigquery.connections.use permission, which you can get through the BigQuery Connection User role, on the connection which is used by the remote function.

You can use a remote function in a query just like a user-defined function.

For example, you can use the remote_add function in the example query:
```sql
SELECT
  val,
  `PROJECT_ID.DATASET_ID`.remote_add(val, 2)
FROM
  UNNEST([NULL, 2, 3, 5, 8]) AS val;
```

This example produces the following output:

```
+------+-----+
| val  | f0_ |
+------+-----+
| NULL |   2 |
|    2 |   4 |
|    3 |   5 |
|    5 |   7 |
|    8 |  10 |
+------+-----+
```

If your endpoint uses internal traffic ingress settings, you can either use the same Cloud Run functions endpoint project to run the BigQuery query or set up a VPC-SC perimeter.

Supported regions
There are two types of locations in BigQuery:
A region is a specific geographic place, such as London.

A multi-region is a large geographic area, such as the United States, that contains two or more geographic places.
Single regions
In a BigQuery single region dataset, you can only create a remote function that uses a Cloud Run function deployed in the same region. For example:

- A remote function in BigQuery single region us-east4 can only use a Cloud Run function in us-east4.

So for single regions, remote functions are only supported in regions that support both Cloud Run functions and BigQuery.
Multi-regions
In a BigQuery multi-region (US, EU) dataset, you can only create a remote function that uses a Cloud Run function deployed in a region within the same large geographic area (US, EU). For example:

- A remote function in the BigQuery US multi-region can only use a Cloud Run function deployed in any single region in the US geographic area, such as us-central1, us-east4, or us-west2.
- A remote function in the BigQuery EU multi-region can only use a Cloud Run function deployed in any single region in member states of the European Union, such as europe-north1 or europe-west3.
For more information about BigQuery regions and multi-regions, see the Dataset Locations page. For more information about Cloud Run functions regions, see the Cloud Run functions Locations page.
Connections
For either a single-region location or multi-region location, you can only create a remote function in the same location as the connection you use. For example, to create a remote function in the US multi-region, use a connection located in the US multi-region.
Pricing
Standard BigQuery pricing applies.

In addition, costs may be incurred for Cloud Run functions and Cloud Run by using this feature. Please review the Cloud Run functions and Cloud Run pricing pages for details.
Using VPC Service Controls
VPC Service Controls is a Google Cloud feature that allows you to set up a secure perimeter to guard against data exfiltration. To use VPC Service Controls with remote functions for additional security, or to use endpoints with internal traffic ingress settings, follow the VPC Service Controls guide to:
Create a service perimeter.
Add the BigQuery project of the query using the remote function into the perimeter.
Add the endpoint project into the perimeter and set Cloud Functions API or Cloud Run API in the restricted services based on your endpoint type. For more details, see Cloud Run functions VPC Service Controls and Cloud Run VPC Service Controls.
Best practices for remote functions
Prefilter your input: If your input can be easily filtered down before being passed to a remote function, your query will likely be faster and cheaper.

Keep your Cloud Run function scalable. Scalability is a function of minimum instances, maximum instances, and concurrency.

- Where possible, use the default value for your Cloud Run function's maximum number of instances.
- Note that there is no default limit for 1st-gen HTTP Cloud Run functions. To avoid unbounded scaling events with 1st-gen HTTP Cloud Run functions while testing or in production, we recommend setting a limit, for example, 3000.

Follow other Cloud Run function tips for better performance. Remote function queries interacting with a high-latency Cloud Run function might fail due to timeout.
Implement your endpoint to return the correct HTTP response code and payload for a failed response.

To minimize retries from BigQuery, use HTTP response codes other than 408, 429, 500, 503, and 504 for a failed response, and make sure to catch all exceptions in your function code. Otherwise, the HTTP service framework may automatically return 500 for any uncaught exceptions. You might still see retried HTTP requests when BigQuery retries a failed data partition or query.

Your endpoint should return a JSON payload in the defined format for a failed response. Even though it's not strictly required, it helps BigQuery distinguish whether the failed response is from your function implementation or from the infrastructure of Cloud Run functions or Cloud Run. For the latter, BigQuery may retry with a different internal limit.
Quotas
Use the following information to troubleshoot quota issues with remote functions.
Maximum number of concurrent queries that contain remote functions
BigQuery returns this error when the number of concurrent queries that contain remote functions exceeds the limit.
To learn more about remote functions limits, see Remote functions.
Error message
Exceeded rate limits: too many concurrent queries with remote functions for this project
This limit can be increased. Try the workarounds and best practices first.
Diagnosis
To see limits for concurrent queries that contain remote functions, see Remote function limits.
Resolution
- When using remote functions, adhere to best practices for remote functions.
- You can request a quota increase by contacting support or sales. It might take several days to review and process the request. We recommend stating the priority, use case, and the project ID in the request.