The CREATE MODEL statement for Vertex AI LLMs as MaaS

This document describes the CREATE MODEL statement for creating remote models in BigQuery over LLMs in Vertex AI as a model as a service (MaaS) by using SQL. When you use MaaS on Vertex AI, you don't have to provision or manage serving infrastructure for your models. Choose MaaS for rapid development and prototyping, when you want to minimize operational overhead. Vertex AI offers access to Google models, partner models, and open models using MaaS. For more information, see When to use MaaS.

Alternatively, you can use the Google Cloud console user interface to create a model by using a UI (Preview) instead of constructing the SQL statement yourself.

After you create the remote model, you can use a supported BigQuery ML function, such as AI.GENERATE_TEXT, to perform generative AI with that model.

CREATE MODEL syntax

{CREATE MODEL | CREATE MODEL IF NOT EXISTS | CREATE OR REPLACE MODEL}
`project_id.dataset.model_name`
REMOTE WITH CONNECTION {DEFAULT | `project_id.region.connection_id`}
OPTIONS(ENDPOINT = 'vertex_ai_llm_endpoint');

CREATE MODEL

Creates and trains a new model in the specified dataset. If the model name exists, CREATE MODEL returns an error.

CREATE MODEL IF NOT EXISTS

Creates and trains a new model only if the model doesn't exist in the specified dataset.

CREATE OR REPLACE MODEL

Creates and trains a model and replaces an existing model with the same name in the specified dataset.

model_name

The name of the model you're creating or replacing. The model name must be unique in the dataset: no other model or table can have the same name. The model name must follow the same naming rules as a BigQuery table. A model name can:

  • Contain up to 1,024 characters
  • Contain letters (upper or lower case), numbers, and underscores

model_name is case-sensitive.

If you don't have a default project configured, then you must prepend the project ID to the model name in the following format, including backticks:

`[PROJECT_ID].[DATASET].[MODEL]`

For example, `myproject.mydataset.mymodel`.

REMOTE WITH CONNECTION

Syntax

`[PROJECT_ID].[LOCATION].[CONNECTION_ID]`

BigQuery uses a Cloud resource connection to interact with the Vertex AI endpoint.

The connection elements are as follows:

  • PROJECT_ID: the project ID of the project that contains the connection.
  • LOCATION: the location used by the connection. The connection must be in the same location as the dataset that contains the model.
  • CONNECTION_ID: the connection ID, for example myconnection.

    To find your connection ID, view the connection details in the Google Cloud console. The connection ID is the value in the last section of the fully qualified connection ID that is shown in Connection ID, for example projects/myproject/locations/connection_location/connections/myconnection.

    To use a default connection, specify DEFAULT instead of the connection string containing PROJECT_ID.LOCATION.CONNECTION_ID.

If you are creating a remote model over a Vertex AI model that uses supervised tuning, you need to grant the Vertex AI Service Agent role to the connection's service account in the project where you create the model. Otherwise, you need to grant the Vertex AI User role to the connection's service account in the project where you create the model.

If you are using the remote model to analyze unstructured data from an object table, you must also grant the Vertex AI Service Agent role to the service account of the connection associated with the object table. You can find the object table's connection in the Google Cloud console, on the Details pane for the object table.

Example

`myproject.us.my_connection`

ENDPOINT

Syntax

ENDPOINT = 'vertex_ai_llm_endpoint'

Description

The Vertex AI endpoint for the remote model to use. You can specify the name of the Vertex AI model, for example gemini-2.5-flash, or you can specify the Vertex AI model's endpoint URL, for example https://europe-west6-aiplatform.googleapis.com/v1/projects/myproject/locations/europe-west6/publishers/google/models/gemini-2.5-flash. If you specify the model name, BigQuery ML automatically identifies and uses the full endpoint of the Vertex AI model based on the location of the dataset in which you create the model.
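For illustration, the following statement pins a remote model to a specific regional endpoint URL instead of a model name; the project, dataset, and connection names are placeholders:

CREATE OR REPLACE MODEL `myproject.mydataset.gemini_eu_model`
REMOTE WITH CONNECTION `myproject.europe-west6.my_connection`
OPTIONS(
  ENDPOINT = 'https://europe-west6-aiplatform.googleapis.com/v1/projects/myproject/locations/europe-west6/publishers/google/models/gemini-2.5-flash'
);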

Arguments

A STRING value that contains the model name of the target Vertex AI LLM. The following LLMs are supported:

Pretrained Gemini models

All of the generally available and preview Gemini models are supported.

Note: To provide feedback or request support for the models in preview, send an email to bqml-feedback@google.com.

For supported Gemini models, you can specify the global endpoint, as shown in the following example:

https://aiplatform.googleapis.com/v1/projects/test-project/locations/global/publishers/google/models/gemini-2.5-flash

Using the global endpoint for your requests can improve overall availability while reducing resource exhausted (429) errors, which occur when you exceed your quota for a regional endpoint. If you want to use Gemini in a region where it isn't available, you can avoid migrating your data to a different region by using the global endpoint instead. You can only use the global endpoint with the AI.GENERATE_TEXT function.

Note: Don't use the global endpoint if you have requirements for the data processing location, because when you use the global endpoint, you can't control or know the region where your processing requests are handled.

Note: Using Gemini 2.5 models with any of these functions incurs charges for the thinking process. With some functions, you can set a budget for the thinking process for Gemini 2.5 Flash and Gemini 2.5 Flash-Lite models. You can't set a budget for Gemini 2.5 Pro models. See the documentation for a given function for details.
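Applied to a CREATE MODEL statement, the global endpoint shown above looks like the following sketch; the dataset name is a placeholder, and the use of the default connection is an assumption for illustration:

CREATE OR REPLACE MODEL `mydataset.gemini_global_model`
REMOTE WITH CONNECTION DEFAULT
OPTIONS(
  ENDPOINT = 'https://aiplatform.googleapis.com/v1/projects/test-project/locations/global/publishers/google/models/gemini-2.5-flash'
);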

Claude models

The following Anthropic Claude models are supported:

  • claude-haiku-4-5
  • claude-sonnet-4-5
  • claude-opus-4-1
  • claude-opus-4
  • claude-sonnet-4
  • claude-3-7-sonnet
  • claude-3-5-haiku
  • claude-3-haiku

You must enable Claude models in Vertex AI before you can use them. For more information, see Enable a partner model.

Although Claude models are multimodal, you can only use text input withClaude models in BigQuery ML.

After you create a remote model based on a Claude model, you can use the model with the AI.GENERATE_TEXT function to generate text based on a prompt you provide in a query or from a column in a standard table.
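For example, a remote model over a Claude model can be created as in the following sketch, assuming the model is already enabled in Vertex AI; the dataset and connection names are placeholders:

CREATE OR REPLACE MODEL `mydataset.claude_model`
REMOTE WITH CONNECTION `myproject.us.my_connection`
OPTIONS(ENDPOINT = 'claude-sonnet-4-5');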

Mistral AI models

The following Mistral AI models are supported:

  • mistral-large-2411
  • mistral-nemo
  • mistral-small-2503

Don't use a version suffix with any Mistral AI model.

You must enable Mistral AI models in Vertex AI before you can use them. For more information, see Enable a partner model.

After you create a remote model based on a Mistral AI model, you can use the model with the AI.GENERATE_TEXT function to generate text based on a prompt you provide in a query or from a column in a standard table.

Llama models as MaaS

To create a Llama model in BigQuery ML, you must specify it as an OpenAI API endpoint in the format openapi/<publisher_name>/<model_name>.

The following Llama models are supported:

  • Llama 4 Scout 17B-16E, endpoint meta/llama-4-scout-17b-16e-instruct-maas
  • Llama 4 Maverick 17B-128E, endpoint meta/llama-4-maverick-17b-128e-instruct-maas
  • Llama 3.3 70B (Preview), endpoint openapi/meta/llama-3.3-70b-instruct-maas
  • Llama 3.2 90B (Preview), endpoint openapi/meta/llama-3.2-90b-vision-instruct-maas
  • Llama 3.1 405B (GA), endpoint openapi/meta/llama-3.1-405b-instruct-maas
  • Llama 3.1 70B (Preview), endpoint openapi/meta/llama-3.1-70b-instruct-maas
  • Llama 3.1 8B (Preview), endpoint openapi/meta/llama-3.1-8b-instruct-maas

Important: For Llama 4.0 and later models, you must create the dataset and connection for the remote model in the same region as the Llama model endpoint.

You must enable Llama models in Vertex AI before you can use them. For more information, see Enable a partner model.

After you create a remote model based on a Llama model, you can use the model with the AI.GENERATE_TEXT function to generate text based on a prompt you provide in a query or from a column in a standard table.
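For example, a remote model over a Llama model uses the OpenAI API endpoint format described above; the dataset and connection names in the following sketch are placeholders:

CREATE OR REPLACE MODEL `mydataset.llama_model`
REMOTE WITH CONNECTION `myproject.us.my_connection`
OPTIONS(ENDPOINT = 'openapi/meta/llama-3.1-405b-instruct-maas');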

For information that can help you choose between the supported models, see Model information.

Locations

For information about supported locations, see Locations for remote models.

Examples

The following examples create BigQuery ML remote models.

Create a Gemini model that uses the default connection

The following example creates a BigQuery ML remote model over a Gemini model:

CREATE OR REPLACE MODEL `mydataset.gemini_model`
REMOTE WITH CONNECTION DEFAULT
OPTIONS(ENDPOINT = 'gemini-2.5-flash');

Create a partner model that uses the default connection

The following example creates a BigQuery ML remote model over a Mistral AI model:

CREATE OR REPLACE MODEL `mydataset.mistral_model`
REMOTE WITH CONNECTION DEFAULT
OPTIONS(ENDPOINT = 'mistral-large-2411');
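After either model is created, generating text follows the pattern in the following sketch. The prompt is illustrative, and the exact AI.GENERATE_TEXT signature and optional parameters are described in that function's reference:

SELECT *
FROM AI.GENERATE_TEXT(
  MODEL `mydataset.mistral_model`,
  (SELECT 'Summarize BigQuery ML in one sentence.' AS prompt)
);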

What's next

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-19 UTC.