The AI.GENERATE function
This document describes the AI.GENERATE function, which lets you analyze any combination of structured and unstructured data. You can choose to generate text or structured output according to a custom schema that you specify. The function works by sending requests to a Vertex AI Gemini model and returning a STRUCT that contains your generated data, the full model response, and a status.
For example, the following query generates summaries of BBC news articles:
```sql
SELECT
  title,
  AI.GENERATE(CONCAT("Summarize in one sentence: ", body)).result AS article_summary
FROM `bigquery-public-data.bbc_news.fulltext`
LIMIT 3;
```

You can also use the AI.GENERATE function to extract structured output. For example, you can use a query similar to the following to extract a patient's name, age, and phone number from an unstructured description:
```sql
SELECT
  AI.GENERATE(
    patient_description,
    output_schema => 'name STRING, age INT64, phone_number STRING')
FROM mydataset.patient_data;
```

Input
Using the AI.GENERATE function, you can use the following types of input:
- Text data from standard tables.
- ObjectRefRuntime values that are generated by the OBJ.GET_ACCESS_URL function. You can use ObjectRef values from standard tables as input to the OBJ.GET_ACCESS_URL function. (Preview)
When you analyze unstructured data, that data must meet the following requirements:
- Content must be in one of the supported formats that are described in the Gemini API model mimeType parameter.
- If you are analyzing a video, the maximum supported length is two minutes. If the video is longer than two minutes, AI.GENERATE only returns results based on the first two minutes.
Prompt design can strongly affect the responses returned by the model. For more information, see Introduction to prompting.
Syntax
```
AI.GENERATE(
  [prompt =>] 'PROMPT'
  [, endpoint => 'ENDPOINT']
  [, model_params => MODEL_PARAMS]
  [, output_schema => 'OUTPUT_SCHEMA']
  [, connection_id => 'CONNECTION']
  [, request_type => 'REQUEST_TYPE']
)
```
Arguments
AI.GENERATE takes the following arguments:
PROMPT: a STRING or STRUCT value that specifies the prompt to send to the model. The prompt must be the first argument that you specify. You can provide the value in the following ways:

- Specify a STRING value. For example, 'This is a prompt.'
- Specify a STRUCT value that contains one or more fields. You can use the following types of fields within the STRUCT value:

  | Field type | Description | Examples |
  | --- | --- | --- |
  | STRING or ARRAY&lt;STRING&gt; | A string literal, an array of string literals, or the name of a STRING column. | String literal: 'This is a prompt.' String column name: my_string_column |
  | ObjectRefRuntime or ARRAY&lt;ObjectRefRuntime&gt; | An ObjectRefRuntime value returned by the OBJ.GET_ACCESS_URL function. The OBJ.GET_ACCESS_URL function takes an ObjectRef value as input, which you can provide by either specifying the name of a column that contains ObjectRef values, or by constructing an ObjectRef value. ObjectRefRuntime values must have the access_url.read_url and details.gcs_metadata.content_type elements of the JSON value populated. Your input can contain at most one video object. | Function call with ObjectRef column: OBJ.GET_ACCESS_URL(my_objectref_column, 'r') Function call with constructed ObjectRef value: OBJ.GET_ACCESS_URL(OBJ.MAKE_REF('gs://image.jpg', 'myconnection'), 'r') |

  The function combines STRUCT fields similarly to a CONCAT operation and concatenates the fields in their specified order. The same is true for the elements of any arrays used within the struct. The following table shows some examples of STRUCT prompt values and how they are interpreted:

  | Struct field types | Struct value | Semantic equivalent |
  | --- | --- | --- |
  | STRUCT&lt;STRING, STRING, STRING&gt; | ('Describe the city of ', my_city_column, ' in 15 words') | 'Describe the city of my_city_column_value in 15 words' |
  | STRUCT&lt;STRING, ObjectRefRuntime&gt; | ('Describe the following city', OBJ.GET_ACCESS_URL(image_objectref_column, 'r')) | 'Describe the following city image' |
ENDPOINT: a STRING value that specifies the Vertex AI endpoint to use for the model. You can specify any generally available or preview Gemini model. If you specify the model name, BigQuery ML automatically identifies and uses the full endpoint of the model. If you don't specify an ENDPOINT value, BigQuery ML selects a recent stable version of Gemini to use. The default endpoint is gemini-2.5-flash.

You can also specify the global endpoint. For example, to use gemini-3-pro-preview, specify the following endpoint:

```
https://aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/global/publishers/google/models/gemini-3-pro-preview
```

Note: Don't use the global endpoint if you have requirements for the data processing location, because when you use the global endpoint, you can't control or know the region where your processing requests are handled.

Note: Using Gemini 2.5 models incurs charges for the thinking process. You can set a budget for the thinking process for Gemini 2.5 Flash and Gemini 2.5 Flash-Lite models by using the model_params argument to set the thinking_budget parameter. For an example, see Set the thinking budget for a Gemini 2.5 Flash model. You can't set a budget for Gemini 2.5 Pro models.

MODEL_PARAMS: a JSON literal that provides additional parameters to the model. The MODEL_PARAMS value must conform to the generateContent request body format. You can provide a value for any field in the request body except for the contents field; the contents field is populated with the PROMPT argument value.

OUTPUT_SCHEMA: a STRING value that specifies the schema of the output as a comma-separated list of fields. Each field consists of a name, a data type, and an optional OPTIONS clause in which you can specify a description of the field. The following example shows how to specify an output schema that contains two string fields, name and state, with a description for the state field:

```
OUTPUT_SCHEMA => '''name STRING, state STRING OPTIONS(description = 'The 2-letter abbreviation of the state name')'''
```
Supported data types include STRING, INT64, FLOAT64, BOOL, ARRAY, and STRUCT. For a STRUCT data type, you can specify the OPTIONS clause on any of its subfields:

```
OUTPUT_SCHEMA => '''location STRUCT<city STRING, state STRING OPTIONS(description = 'The 2-letter abbreviation of the state name')>'''
```
CONNECTION: a STRING value specifying the connection to use to communicate with the model, in the format [PROJECT_ID].LOCATION.CONNECTION_ID. For example, myproject.us.myconnection. If you don't specify a connection, then the query uses your end-user credentials.
For information about configuring permissions, see Set permissions for BigQuery ML generative AI functions that call Vertex AI models.
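As an illustration of the connection_id argument, the following sketch assumes that a connection named myproject.us.myconnection already exists and that its service account has the required Vertex AI permissions; replace the project, location, and connection ID with your own:

```sql
SELECT
  AI.GENERATE(
    'Classify the sentiment of this sentence: BigQuery is great.',
    -- Hypothetical connection; replace with your own PROJECT_ID.LOCATION.CONNECTION_ID.
    connection_id => 'myproject.us.myconnection').result;
```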
REQUEST_TYPE: a STRING value that specifies the type of inference request to send to the Gemini model. The request type determines what quota the request uses. Valid values are as follows:

- SHARED: The function only uses dynamic shared quota (DSQ).
- DEDICATED: The function only uses Provisioned Throughput quota. The function returns an invalid query error if Provisioned Throughput quota isn't available. For more information, see Use Vertex AI Provisioned Throughput.
- UNSPECIFIED: The function uses quota as follows:
  - If you haven't purchased Provisioned Throughput quota, the function uses DSQ quota.
  - If you have purchased Provisioned Throughput quota, the function uses the Provisioned Throughput quota first. If requests exceed the Provisioned Throughput quota, the overflow traffic uses DSQ quota.

The default value is UNSPECIFIED.
Output
AI.GENERATE returns a STRUCT value for each row in the table. The struct contains the following fields:
- result: a STRING value containing the model's response to the prompt. The result is NULL if the request fails or is filtered by responsible AI. If you specify an output schema, then result is replaced by your custom schema.
- full_response: a JSON value containing the response from the projects.locations.endpoints.generateContent call to the model. The generated text is in the text element.
- status: a STRING value that contains the API response status for the corresponding row. This value is empty if the operation was successful.
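As a sketch of how you might work with these fields, the following query keeps only rows whose requests succeeded; the table mydataset.reviews and its review_text column are hypothetical names used for illustration:

```sql
SELECT
  review_text,
  output.result AS summary
FROM (
  SELECT
    review_text,
    AI.GENERATE(CONCAT('Summarize: ', review_text)) AS output
  -- Hypothetical table; replace with your own.
  FROM mydataset.reviews)
-- An empty status string indicates a successful request.
WHERE output.status = '';
```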
Examples
The following examples assume that you have granted the Vertex AI User role to your personal account. For more information, see Run generative AI queries with end-user credentials.
Translate
The following query translates publicly available BBC news technology articles into French:
```sql
SELECT
  body,
  AI.GENERATE(CONCAT("Translate into French ", body)).result AS translation
FROM `bigquery-public-data.bbc_news.fulltext`
WHERE category = 'tech'
LIMIT 3;
```
The result is a table with body and translation columns, where translation contains the French translation of each article body.
Use structured output for entity extraction
The following query extracts information about a person from an unstructured description. The query uses the output_schema argument to set custom fields in the output:
```sql
SELECT
  AI.GENERATE(
    input,
    output_schema => '''name STRING, age INT64,
      address STRUCT<street_address STRING, city STRING, state STRING, zip_code STRING>,
      is_married BOOL, phone_number ARRAY<STRING>, weight_in_pounds FLOAT64''') AS info
FROM (
  SELECT '''John Smith is a 20-year old single man living at 1234 NW 45th St,
    Kirkland WA, 98033. He has two phone numbers 123-123-1234, and 234-234-2345.
    He is 200.5 pounds.''' AS input);
```

The result is similar to the following:
```
+------------+----------+-----------------------------+-------------------+-----+
| info.name  | info.age | info.address.street_address | info.address.city | ... |
+------------+----------+-----------------------------+-------------------+-----+
| John Smith | 20       | 1234 NW 45th St             | Kirkland          | ... |
+------------+----------+-----------------------------+-------------------+-----+
```
The following query extracts information about customer complaints. The query uses the output_schema argument to set custom fields in the output:
```sql
SELECT
  complaint_id,
  AI.GENERATE(
    CONCAT('Analyze the following complaint: ', consumer_complaint_narrative),
    output_schema => """
      grievance_subject ARRAY<STRING> OPTIONS(description = 'a list of grievance subjects'),
      complaint_type STRING OPTIONS(description = 'classify the complaint type as Billing Dispute, Service Issue, or Reporting Error')
    """).* EXCEPT (full_response, status)
FROM `bigquery-public-data.cfpb_complaints.complaint_database`
WHERE consumer_complaint_narrative IS NOT NULL
  AND LENGTH(consumer_complaint_narrative) > 100 -- Ensure there's a narrative to analyze
LIMIT 3;
```

The result is similar to the following:
```
+--------------+-----------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| complaint_id | complaint_type  | grievance_subject                                                                                                                                                                            |
+--------------+-----------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| 4767874      | Reporting Error | ["Inaccurate debt reporting on credit report","Enhanced Recovery Company reporting debt they no longer own","Failure to delete debt from credit report after recall"]                         |
| 2091987      | Service Issue   | ["Unwanted calls to place of employment","Request for correspondence via mail"]                                                                                                              |
| 6047403      | Reporting Error | ["Inaccurate derogatory collection on credit report","Disputed debt collection account","Lack of debt validation documentation","Refusal to remove incorrect collection from credit report"] |
+--------------+-----------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
```
Process images in a Cloud Storage bucket
The following query creates an external table from images of pet products stored in a publicly available Cloud Storage bucket:
```sql
CREATE SCHEMA IF NOT EXISTS bqml_tutorial;

CREATE OR REPLACE EXTERNAL TABLE bqml_tutorial.product_images
WITH CONNECTION DEFAULT
OPTIONS (
  object_metadata = 'SIMPLE',
  uris = ['gs://cloud-samples-data/bigquery/tutorials/cymbal-pets/images/*.png']);
```
You can use AI.GENERATE to describe images and what's in them. To do that, construct your prompt from a natural language instruction and an ObjectRefRuntime of the image. The following query asks Gemini what each image is. It specifies an output_schema to structure the results with one column to name the items in the image and another column to provide a description of the image.
```sql
SELECT
  uri,
  STRING(OBJ.GET_ACCESS_URL(ref, 'r').access_urls.read_url) AS signed_url,
  AI.GENERATE(
    ("What is this: ", OBJ.GET_ACCESS_URL(ref, 'r')),
    output_schema => "image_description STRING, entities_in_the_image ARRAY<STRING>").*
FROM bqml_tutorial.product_images
WHERE uri LIKE "%aquarium%";
```
The result contains the URI and signed URL for each matching image, along with the generated image_description and entities_in_the_image values.
Use grounding
The following queries show how to set the model_params argument to use Google Search or Google Maps grounding for the request. You can only use grounding with Gemini 2.0 or later models.
Set Google Search grounding:
```sql
SELECT
  name,
  AI.GENERATE(
    ('Please check the weather of ', name, ' for today.'),
    model_params => JSON '{"tools": [{"googleSearch": {}}]}')
FROM UNNEST(['Seattle', 'NYC', 'Austin']) AS name;
```
Set Google Maps grounding:
```sql
SELECT
  name,
  AI.GENERATE(
    ('Please find some tourist attractions in ', name),
    model_params => JSON '{"tools": [{"googleMaps": {}}]}')
FROM UNNEST(['Seattle', 'NYC', 'Austin']) AS name;
```
Disable the thinking budget
The following query shows how to use the model_params argument to set the model's thinking budget to 0 for the request:
```sql
SELECT
  AI.GENERATE(
    ('What is the capital of Monaco?'),
    endpoint => 'gemini-2.5-flash',
    model_params => JSON '{"generation_config": {"thinking_config": {"thinking_budget": 0}}}');
```
Use a context cache and low thinking level
The following query shows how to use the model_params argument to specify a context cache and use a low thinking level:
```sql
SELECT
  AI.GENERATE(
    "Give me a summary of the document in context.",
    endpoint => "projects/PROJECT_NUMBER/locations/global/publishers/google/models/gemini-3-flash-preview",
    model_params => JSON '''{
      "cachedContent": "projects/PROJECT_NUMBER/locations/LOCATION/cachedContents/CACHED_CONTENT_ID",
      "generation_config": {"thinking_config": {"thinking_level": "LOW"}}}''')
```
Best Practices
This function passes your input to a Gemini model and incurs charges in Vertex AI each time it's called. For information about how to view these charges, see Track costs. To minimize Vertex AI charges when you use AI.GENERATE on a subset of data using the LIMIT clause, materialize the selected data to a table first. For example, the first of the following examples is preferable to the second one:
```sql
CREATE TABLE mydataset.cities AS (
  SELECT city_name
  FROM mydataset.customers
  LIMIT 10);

SELECT
  city_name,
  AI.GENERATE(('Give a short, one sentence description of ', city_name)).result
FROM mydataset.cities;
```
```sql
SELECT
  city_name,
  AI.GENERATE(('Give a short, one sentence description of ', city_name)).result
FROM (
  SELECT city_name
  FROM mydataset.customers
  LIMIT 10);
```
Writing the query results to a table beforehand helps you to ensure that you are sending as few rows as possible to the model.
Use Vertex AI Provisioned Throughput
You can use Vertex AI Provisioned Throughput with the AI.GENERATE function to provide consistent high throughput for requests. The remote model that you reference in the AI.GENERATE function must use a supported Gemini model in order for you to use Provisioned Throughput.
To use Provisioned Throughput, calculate your Provisioned Throughput requirements and then purchase Provisioned Throughput quota before running the AI.GENERATE function. When you purchase Provisioned Throughput, do the following:
- For Model, select the same Gemini model as the one used by the remote model that you reference in the AI.GENERATE function.
- For Region, select the same region as the dataset that contains the remote model that you reference in the AI.GENERATE function, with the following exceptions:
  - If the dataset is in the US multi-region, select the us-central1 region.
  - If the dataset is in the EU multi-region, select the europe-west4 region.

After you submit the order, wait for the order to be approved and appear on the Orders page.
After you have purchased Provisioned Throughput quota, use the REQUEST_TYPE argument to determine how the AI.GENERATE function uses the quota.
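For example, assuming you have already purchased Provisioned Throughput for the model, a query that uses only that quota (and fails rather than overflowing to shared quota) might look like the following sketch; the table and column names are hypothetical:

```sql
SELECT
  AI.GENERATE(
    CONCAT('Summarize: ', article_body),
    -- DEDICATED returns an error if Provisioned Throughput quota isn't available,
    -- instead of falling back to dynamic shared quota.
    request_type => 'DEDICATED').result
-- Hypothetical table; replace with your own.
FROM mydataset.articles;
```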
Locations
You can run AI.GENERATE in all of the regions that support Gemini models, and also in the US and EU multi-regions.
Quotas
See Vertex AI and Cloud AI service functions quotas and limits.
What's next
- For more information about using Vertex AI models to generate text and embeddings, see Generative AI overview.
- For more information about using Cloud AI APIs to perform AI tasks, see AI application overview.
- For more information about supported SQL statements and functions for generative AI models, see End-to-end user journeys for generative AI models.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2026-02-19 UTC.