Export a BigQuery ML model for online prediction
This tutorial shows how to export a BigQuery ML model and then deploy the model either on Vertex AI or on a local machine. You will use the iris table from the BigQuery public datasets and work through the following three end-to-end scenarios:
- Train and deploy a logistic regression model - also applies to DNN classifier, DNN regressor, k-means, linear regression, and matrix factorization models.
- Train and deploy a boosted tree classifier model - also applies to boosted tree regressor model.
- Train and deploy an AutoML classifier model - also applies to AutoML regressor model.
Costs
This tutorial uses billable components of Google Cloud, including:
- BigQuery ML
- Cloud Storage
- Vertex AI (optional, used for online prediction)
For more information about BigQuery ML costs, see BigQuery ML pricing.
For more information about Cloud Storage costs, see the Cloud Storage pricing page.
For more information about Vertex AI costs, see Custom-trained models.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.
Roles required to select or create a project
- Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
- Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
Verify that billing is enabled for your Google Cloud project.
- BigQuery is automatically enabled in new projects. To activate BigQuery in a pre-existing project, go to Enable the BigQuery API.
Roles required to enable APIs
To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.
- Enable the AI Platform Training and Prediction API and Compute Engine APIs.
- Install the Google Cloud CLI.
Create your dataset
Create a BigQuery dataset to store your ML model.
Console
In the Google Cloud console, go to the BigQuery page.
In the Explorer pane, click your project name.
Click View actions > Create dataset.
On the Create dataset page, do the following:
For Dataset ID, enter bqml_tutorial.
For Location type, select Multi-region, and then select US (multiple regions in United States).
Leave the remaining default settings as they are, and click Create dataset.
bq
To create a new dataset, use the bq mk command with the --location flag. For a full list of possible parameters, see the bq mk --dataset command reference.
Create a dataset named bqml_tutorial with the data location set to US and a description of BigQuery ML tutorial dataset:
bq --location=US mk -d \
    --description "BigQuery ML tutorial dataset." \
    bqml_tutorial
Instead of using the --dataset flag, the command uses the -d shortcut. If you omit -d and --dataset, the command defaults to creating a dataset.
Confirm that the dataset was created:
bq ls
API
Call the datasets.insert method with a defined dataset resource.
{"datasetReference": {"datasetId": "bqml_tutorial"}}
BigQuery DataFrames
Before trying this sample, follow the BigQuery DataFrames setup instructions in the BigQuery quickstart using BigQuery DataFrames. For more information, see the BigQuery DataFrames reference documentation.
To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up ADC for a local development environment.
import google.cloud.bigquery

bqclient = google.cloud.bigquery.Client()
bqclient.create_dataset("bqml_tutorial", exists_ok=True)

Train and deploy a logistic regression model
Use the following sections to learn how to train and deploy a logistic regression model.
Train the model
Train a logistic regression model that predicts iris type using the BigQuery ML CREATE MODEL statement. This training job should take approximately 1 minute to complete.
bq query --use_legacy_sql=false \
  'CREATE MODEL `bqml_tutorial.iris_model`
   OPTIONS (model_type="logistic_reg",
       max_iterations=10, input_label_cols=["species"])
   AS SELECT *
   FROM `bigquery-public-data.ml_datasets.iris`;'
Export the model
Export the model to a Cloud Storage bucket using the bq command-line tool. For additional ways to export models, see Export BigQuery ML models. This extract job should take less than 1 minute to complete.
bq extract -m bqml_tutorial.iris_model gs://some/gcs/path/iris_model

Local deployment and serving
You can deploy exported TensorFlow models using the TensorFlow Serving Docker container. The following steps require you to install Docker.
Download the exported model files to a temporary directory
mkdir tmp_dir
gcloud storage cp gs://some/gcs/path/iris_model tmp_dir --recursive

Create a version subdirectory
This step sets a version number (1 in this case) for the model.
mkdir -p serving_dir/iris_model/1
cp -r tmp_dir/iris_model/* serving_dir/iris_model/1
rm -r tmp_dir

Pull the Docker image
docker pull tensorflow/serving

Run the Docker container
docker run -p 8500:8500 --network="host" --mount type=bind,source=`pwd`/serving_dir/iris_model,target=/models/iris_model -e MODEL_NAME=iris_model -t tensorflow/serving &

Run the prediction
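The curl call in the next step posts a JSON body with an `instances` list to TensorFlow Serving's REST endpoint. As a sketch, the same payload can be built with the Python standard library (the feature values are the ones used in this tutorial):

```python
import json

# Build the request body that TensorFlow Serving's REST API expects:
# a JSON object with an "instances" list, one feature object per row.
# The keys match the iris model's input columns.
instances = [
    {"sepal_length": 5.0, "sepal_width": 2.0,
     "petal_length": 3.5, "petal_width": 1.0},
]
payload = json.dumps({"instances": instances})
print(payload)
```

Sending `payload` as the body of a POST to http://localhost:8501/v1/models/iris_model:predict (for example with urllib.request) returns a JSON object with a `predictions` list.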
curl -d '{"instances": [{"sepal_length":5.0, "sepal_width":2.0, "petal_length":3.5, "petal_width":1.0}]}' -X POST http://localhost:8501/v1/models/iris_model:predict

Online deployment and serving
This section uses the Google Cloud CLI to deploy and run predictions against the exported model.
For more information about deploying a model to Vertex AI for online or batch predictions, see Deploy a model to an endpoint.
Create a model resource
MODEL_NAME="IRIS_MODEL"
gcloud ai-platform models create $MODEL_NAME

Create a model version
1) Set the environment variables:
MODEL_DIR="gs://some/gcs/path/iris_model"
# Select a suitable version for this model
VERSION_NAME="v1"
FRAMEWORK="TENSORFLOW"
2) Create the version:
gcloud ai-platform versions create $VERSION_NAME --model=$MODEL_NAME --origin=$MODEL_DIR --runtime-version=1.15 --framework=$FRAMEWORK
This step might take a few minutes to complete. You should see the message Creating version (this might take a few minutes)....
3) (optional) Get information about your new version:
gcloud ai-platform versions describe $VERSION_NAME --model $MODEL_NAME
You should see output similar to this:
createTime: '2020-02-28T16:30:45Z'
deploymentUri: gs://your_bucket_name
framework: TENSORFLOW
machineType: mls1-c1-m2
name: projects/[YOUR-PROJECT-ID]/models/IRIS_MODEL/versions/v1
pythonVersion: '2.7'
runtimeVersion: '1.15'
state: READY

Online prediction
For more information about running online predictions against a deployed model, see Get online inferences from a custom trained model.
1) Create a newline-delimited JSON file for inputs, for example an instances.json file with the following content:
{"sepal_length":5.0, "sepal_width":2.0, "petal_length":3.5, "petal_width":1.0}
{"sepal_length":5.3, "sepal_width":3.7, "petal_length":1.5, "petal_width":0.2}
2) Set up environment variables for predict:
INPUT_DATA_FILE="instances.json"
3) Run predict:
gcloud ai-platform predict --model $MODEL_NAME --version $VERSION_NAME --json-instances $INPUT_DATA_FILE

Train and deploy a boosted tree classifier model
Use the following sections to learn how to train and deploy a boosted tree classifier model.
Train the model
Train a boosted tree classifier model that predicts iris type using the CREATE MODEL statement. This training job should take approximately 7 minutes to complete.
bq query --use_legacy_sql=false \
  'CREATE MODEL `bqml_tutorial.boosted_tree_iris_model`
   OPTIONS (model_type="boosted_tree_classifier",
       max_iterations=10, input_label_cols=["species"])
   AS SELECT *
   FROM `bigquery-public-data.ml_datasets.iris`;'
Export the model
Export the model to a Cloud Storage bucket using the bq command-line tool. For additional ways to export models, see Export BigQuery ML models.
bq extract --destination_format ML_XGBOOST_BOOSTER -m bqml_tutorial.boosted_tree_iris_model gs://some/gcs/path/boosted_tree_iris_model

Local deployment and serving
The exported files include a main.py file that you can use to run the model locally.
Download the exported model files to a local directory
mkdir serving_dir
gcloud storage cp gs://some/gcs/path/boosted_tree_iris_model serving_dir --recursive

Extract the predictor
tar -xvf serving_dir/boosted_tree_iris_model/xgboost_predictor-0.1.tar.gz -C serving_dir/boosted_tree_iris_model/

Install the XGBoost library
Install the XGBoost library, version 0.82 or later.
Run the prediction
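The command below passes a JSON array of feature objects to main.py as a single shell argument. If you build that argument programmatically, a minimal sketch using only the standard library (shlex.quote is needed only when interpolating the value into a shell string):

```python
import json
import shlex

# One feature object per row to score; keys match the model's input columns.
rows = [{"sepal_length": 5.0, "sepal_width": 2.0,
         "petal_length": 3.5, "petal_width": 1.0}]

arg = json.dumps(rows)          # the JSON array main.py expects as argv[1]
shell_arg = shlex.quote(arg)    # safe to interpolate into a shell command line
print("python main.py " + shell_arg)
```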
cd serving_dir/boosted_tree_iris_model/
python main.py '[{"sepal_length":5.0, "sepal_width":2.0, "petal_length":3.5, "petal_width":1.0}]'

Online deployment and serving
This section uses the Google Cloud CLI to deploy and run predictions against the exported model. For more information, see Get online inferences from a custom trained model.
Note: For serving on Vertex AI Prediction, follow Request Predictions and use the following containers for your region respectively:
1) us-docker.pkg.dev/vertex-ai/bigquery-ml/xgboost-cpu.1-0:latest
2) europe-docker.pkg.dev/vertex-ai/bigquery-ml/xgboost-cpu.1-0:latest
3) asia-docker.pkg.dev/vertex-ai/bigquery-ml/xgboost-cpu.1-0:latest
For more information about deploying a model to Vertex AI for online or batch predictions using custom routines, see Deploy a model to an endpoint.
Create a model resource
MODEL_NAME="BOOSTED_TREE_IRIS_MODEL"
gcloud ai-platform models create $MODEL_NAME

Create a model version
1) Set the environment variables:
MODEL_DIR="gs://some/gcs/path/boosted_tree_iris_model"
VERSION_NAME="v1"
2) Create the version:
gcloud beta ai-platform versions create $VERSION_NAME --model=$MODEL_NAME --origin=$MODEL_DIR --package-uris=${MODEL_DIR}/xgboost_predictor-0.1.tar.gz --prediction-class=predictor.Predictor --runtime-version=1.15
This step might take a few minutes to complete. You should see the message Creating version (this might take a few minutes)....
3) (optional) Get information about your new version:
gcloud ai-platform versions describe $VERSION_NAME --model $MODEL_NAME
You should see output similar to this:
createTime: '2020-02-07T00:35:42Z'
deploymentUri: gs://some/gcs/path/boosted_tree_iris_model
etag: rp090ebEnQk=
machineType: mls1-c1-m2
name: projects/[YOUR-PROJECT-ID]/models/BOOSTED_TREE_IRIS_MODEL/versions/v1
packageUris:
- gs://some/gcs/path/boosted_tree_iris_model/xgboost_predictor-0.1.tar.gz
predictionClass: predictor.Predictor
pythonVersion: '2.7'
runtimeVersion: '1.15'
state: READY

Online prediction
For more information about running online predictions against a deployed model, see Get online inferences from a custom trained model.
1) Create a newline-delimited JSON file for inputs. For example, an instances.json file with the following content:
{"sepal_length":5.0, "sepal_width":2.0, "petal_length":3.5, "petal_width":1.0}
{"sepal_length":5.3, "sepal_width":3.7, "petal_length":1.5, "petal_width":0.2}
2) Set up environment variables for predict:
INPUT_DATA_FILE="instances.json"
3) Run predict:
gcloud ai-platform predict --model $MODEL_NAME --version $VERSION_NAME --json-instances $INPUT_DATA_FILE

Train and deploy an AutoML classifier model
Use the following sections to learn how to train and deploy an AutoML classifier model.
Train the model
Train an AutoML classifier model that predicts iris type using the CREATE MODEL statement. AutoML models need at least 1,000 rows of input data. Because ml_datasets.iris only has 150 rows, we duplicate the data 10 times. This training job should take around 2 hours to complete.
bq query --use_legacy_sql=false \
  'CREATE MODEL `bqml_tutorial.automl_iris_model`
   OPTIONS (model_type="automl_classifier",
       budget_hours=1, input_label_cols=["species"])
   AS SELECT
     * EXCEPT(multiplier)
   FROM `bigquery-public-data.ml_datasets.iris`, unnest(GENERATE_ARRAY(1, 10)) as multiplier;'
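The cross join against unnest(GENERATE_ARRAY(1, 10)) in the training query pairs every iris row with ten multiplier values, turning 150 rows into 1,500 and clearing AutoML's 1,000-row minimum. The effect of that join can be illustrated in plain Python (the iris table is simulated here with placeholder rows):

```python
# Simulate the 150-row iris table with placeholder feature dicts.
iris_rows = [{"species": "setosa", "row_id": i} for i in range(150)]

# Cross join with GENERATE_ARRAY(1, 10): pair every row with each multiplier.
multipliers = range(1, 11)
duplicated = [dict(row, multiplier=m) for row in iris_rows for m in multipliers]

print(len(duplicated))  # 1500 rows, above the 1,000-row minimum
```

The SELECT * EXCEPT(multiplier) in the query then drops the multiplier column, so the model trains on ten identical copies of each original row.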
Export the model
Export the model to a Cloud Storage bucket using the bq command-line tool. For additional ways to export models, see Exporting BigQuery ML models.
bq extract -m bqml_tutorial.automl_iris_model gs://some/gcs/path/automl_iris_model

Local deployment and serving
For details about building AutoML containers, see Exporting models. The following steps require you to install Docker.
Copy exported model files to a local directory
mkdir automl_serving_dir
gcloud storage cp gs://some/gcs/path/automl_iris_model/* automl_serving_dir/ --recursive

Pull the AutoML Docker image
docker pull gcr.io/cloud-automl-tables-public/model_server

Start the Docker container
docker run -v `pwd`/automl_serving_dir:/models/default/0000001 -p 8080:8080 -it gcr.io/cloud-automl-tables-public/model_server

Run the prediction
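The input.json file created in the next step wraps all rows in a single `instances` list. As a sketch, it can be generated with the Python standard library (the values are the tutorial's two example rows):

```python
import json

# Both example rows, wrapped in the single "instances" list that the
# AutoML model server expects in one request body.
request_body = {"instances": [
    {"sepal_length": 5.0, "sepal_width": 2.0,
     "petal_length": 3.5, "petal_width": 1.0},
    {"sepal_length": 5.3, "sepal_width": 3.7,
     "petal_length": 1.5, "petal_width": 0.2},
]}

with open("input.json", "w") as f:
    json.dump(request_body, f)
```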
1) Create a JSON file for inputs. For example, an input.json file with the following contents:
{"instances": [{"sepal_length":5.0, "sepal_width":2.0, "petal_length":3.5, "petal_width":1.0},
{"sepal_length":5.3, "sepal_width":3.7, "petal_length":1.5, "petal_width":0.2}]}
2) Make the predict call:
curl -X POST --data @input.json http://localhost:8080/predict

Online deployment and serving
Online prediction for AutoML regressor and AutoML classifier models is not supported in Vertex AI.
Clean up
To avoid incurring charges to your Google Cloud account for the resources used in this tutorial, either delete the project that contains the resources, or keep the project and delete the individual resources.
- You can delete the project you created.
- Or you can keep the project and delete the dataset and Cloud Storage bucket.
Stop Docker container
1) List all running Docker containers:
docker ps
2) Stop the container with the applicable container ID from the container list:
docker stop container_id

Delete Vertex AI resources
1) Delete the model version:
gcloud ai-platform versions delete $VERSION_NAME --model=$MODEL_NAME
2) Delete the model:
gcloud ai-platform models delete $MODEL_NAME

Delete your dataset
Deleting your project removes all datasets and all tables in the project. If you prefer to reuse the project, you can delete the dataset you created in this tutorial:
If necessary, open the BigQuery page in the Google Cloud console.
In the navigation, click the bqml_tutorial dataset you created.
Click Delete dataset on the right side of the window. This action deletes the dataset, the table, and all the data.
In the Delete dataset dialog, confirm the delete command by typing the name of your dataset (bqml_tutorial) and then click Delete.
Delete your Cloud Storage bucket
Deleting your project removes all Cloud Storage buckets in the project. If you prefer to reuse the project, you can delete the bucket you created in this tutorial:
- In the Google Cloud console, go to the Cloud Storage Buckets page.
- Select the checkbox of the bucket you want to delete.
- Click Delete.
- In the overlay window that appears, confirm that you want to delete the bucket and its contents by clicking Delete.
Delete your project
To delete the project:
- In the Google Cloud console, go to the Manage resources page.
- In the project list, select the project that you want to delete, and then click Delete.
- In the dialog, type the project ID, and then click Shut down to delete the project.
What's next
- For an overview of BigQuery ML, see Introduction to BigQuery ML.
- For information on exporting models, see Export models.
- For information on creating models, see the CREATE MODEL syntax page.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-12-15 UTC.