Tutorial: Run inference on an object table by using a classification model
This tutorial shows you how to create an object table based on the images from a public dataset, and then run inference on that object table using the ResNet 50 model.
The ResNet 50 model
The ResNet 50 model analyzes image files and outputs a batch of vectors representing the likelihood that an image belongs to the corresponding class (logits). For more information, see the Usage section on the model's TensorFlow Hub page.
The ResNet 50 model input takes a tensor of `DType` = `float32` in the shape `[-1, 224, 224, 3]`. The output is an array of tensors of `tf.float32` in the shape `[-1, 1024]`.
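The shapes above can be made concrete with a minimal NumPy sketch (not part of the tutorial). The `to_model_input` helper name is an assumption for illustration; it only shows the cast and batch dimension the model's input shape implies, since resizing is handled later by `ML.RESIZE_IMAGE`:

```python
import numpy as np

def to_model_input(image_pixels):
    """Convert a 224x224x3 image array into the [-1, 224, 224, 3]
    float32 batch shape that the ResNet 50 model expects.

    Resizing is assumed to have happened already; this sketch only
    casts to float32 and adds the leading batch dimension.
    """
    batch = np.asarray(image_pixels, dtype=np.float32)
    return batch[np.newaxis, ...]  # shape (1, 224, 224, 3)

# Example: a dummy 224x224 RGB image.
dummy = np.zeros((224, 224, 3), dtype=np.uint8)
batch = to_model_input(dummy)
```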
Required permissions
- To create the dataset, you need the `bigquery.datasets.create` permission.
- To create the connection resource, you need the following permissions:
  - `bigquery.connections.create`
  - `bigquery.connections.get`
- To grant permissions to the connection's service account, you need the following permission:
  - `resourcemanager.projects.setIamPolicy`
- To create the object table, you need the following permissions:
  - `bigquery.tables.create`
  - `bigquery.tables.update`
  - `bigquery.connections.delegate`
- To create the bucket, you need the `storage.buckets.create` permission.
- To upload the model to Cloud Storage, you need the `storage.objects.create` and `storage.objects.get` permissions.
- To load the model into BigQuery ML, you need the following permissions:
  - `bigquery.jobs.create`
  - `bigquery.models.create`
  - `bigquery.models.getData`
  - `bigquery.models.updateData`
- To run inference, you need the following permissions:
  - `bigquery.tables.getData` on the object table
  - `bigquery.models.getData` on the model
  - `bigquery.jobs.create`
Costs
In this document, you use the following billable components of Google Cloud:
- BigQuery: You incur storage costs for the object table you create in BigQuery.
- BigQuery ML: You incur costs for the model you create and the inference you perform in BigQuery ML.
- Cloud Storage: You incur costs for the objects you store in Cloud Storage.
To generate a cost estimate based on your projected usage, use the pricing calculator.

For more information on BigQuery storage pricing, see Storage pricing in the BigQuery documentation.

For more information on BigQuery ML pricing, see BigQuery ML pricing in the BigQuery documentation.

For more information on Cloud Storage pricing, see the Cloud Storage pricing page.
Before you begin
- Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

Roles required to select or create a project
- Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
- Create a project: To create a project, you need the Project Creator role (`roles/resourcemanager.projectCreator`), which contains the `resourcemanager.projects.create` permission. Learn how to grant roles.
Verify that billing is enabled for your Google Cloud project.
Enable the BigQuery and BigQuery Connection APIs.
Roles required to enable APIs

To enable APIs, you need the Service Usage Admin IAM role (`roles/serviceusage.serviceUsageAdmin`), which contains the `serviceusage.services.enable` permission. Learn how to grant roles.
Create a reservation
To use an imported model with an object table, you must create a reservation that uses the BigQuery Enterprise or Enterprise Plus edition, and then create a reservation assignment that uses the `QUERY` job type.
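One way to do this is with BigQuery's reservation DDL statements. The following is an illustrative sketch only: the administration project, reservation name, assignment name, and slot count are all assumptions, not values the tutorial prescribes:

```sql
-- Illustrative only: names, project IDs, and slot_capacity are assumptions.
CREATE RESERVATION `ADMIN_PROJECT_ID.region-us.resnet-inference`
OPTIONS (
  edition = 'ENTERPRISE',
  slot_capacity = 100);

-- Assign QUERY jobs from your project to the reservation.
CREATE ASSIGNMENT `ADMIN_PROJECT_ID.region-us.resnet-inference.my-assignment`
OPTIONS (
  assignee = 'projects/PROJECT_ID',
  job_type = 'QUERY');
```

You can also create the reservation and assignment in the Google Cloud console or with the bq command-line tool.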
Create a dataset
Create a dataset named `resnet_inference_test`:
SQL
Go to the BigQuery page.

In the Editor pane, run the following SQL statement:

```sql
CREATE SCHEMA `PROJECT_ID.resnet_inference_test`;
```

Replace PROJECT_ID with your project ID.
bq
In the Google Cloud console, activate Cloud Shell.
Run the bq mk command to create the dataset:

```sh
bq mk --dataset --location=us PROJECT_ID:resnet_inference_test
```

Replace PROJECT_ID with your project ID.
Create a connection
Create a connection named `lake-connection`:
Console
Go to the BigQuery page.

In the left pane, click Explorer.

If you don't see the left pane, click Expand left pane to open the pane.

In the Explorer pane, click Add data.

The Add data dialog opens.

In the Filter By pane, in the Data Source Type section, select Databases.

Alternatively, in the Search for data sources field, you can enter Vertex AI.

In the Featured data sources section, click Vertex AI.

Click the Vertex AI Models: BigQuery Federation solution card.

In the Connection type list, select Vertex AI remote models, remote functions, BigLake and Spanner (Cloud Resource).

In the Connection ID field, type `lake-connection`.

Click Create connection.

In the Connection info pane, copy the value from the Service account id field and save it somewhere. You need this information to grant permissions to the connection's service account.
bq
In Cloud Shell, run the bq mk command to create the connection:

```sh
bq mk --connection --location=us --connection_type=CLOUD_RESOURCE \
    lake-connection
```

Run the bq show command to retrieve information about the connection:

```sh
bq show --connection us.lake-connection
```

From the `properties` column, copy the value of the `serviceAccountId` property and save it somewhere. You need this information to grant permissions to the connection's service account.
Create a Cloud Storage bucket
Create a Cloud Storage bucket to contain the model files.
Grant permissions to the connection's service account
Console
Go to the IAM & Admin page.

Click Grant Access.

The Add principals dialog opens.

In the New principals field, enter the service account ID that you copied earlier.

In the Select a role field, select Cloud Storage, and then select Storage Object Viewer.

Click Save.
gcloud
In Cloud Shell, run the gcloud storage buckets add-iam-policy-binding command:

```sh
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:MEMBER \
    --role=roles/storage.objectViewer
```

Replace MEMBER with the service account ID that you copied earlier. Replace BUCKET_NAME with the name of the bucket you previously created.

For more information, see Add a principal to a bucket-level policy.
Create an object table
Create an object table named `vision_images` based on the image files in the public `gs://cloud-samples-data/vision` bucket:
SQL
Go to the BigQuery page.

In the Editor pane, run the following SQL statement:

```sql
CREATE EXTERNAL TABLE resnet_inference_test.vision_images
WITH CONNECTION `us.lake-connection`
OPTIONS (
  object_metadata = 'SIMPLE',
  uris = ['gs://cloud-samples-data/vision/*.jpg']);
```
bq
In Cloud Shell, run the bq mk command to create the object table:

```sh
bq mk --table \
    --external_table_definition='gs://cloud-samples-data/vision/*.jpg@us.lake-connection' \
    --object_metadata=SIMPLE \
    resnet_inference_test.vision_images
```

Upload the model to Cloud Storage
Get the model files and make them available in Cloud Storage:
- Download the ResNet 50 model to your local machine. This gives you a `saved_model.pb` file and a `variables` folder for the model.
- Upload the `saved_model.pb` file and the `variables` folder to the bucket you previously created.
Load the model into BigQuery ML
Go to the BigQuery page.

In the Editor pane, run the following SQL statement:

```sql
CREATE MODEL `resnet_inference_test.resnet`
OPTIONS (
  model_type = 'TENSORFLOW',
  model_path = 'gs://BUCKET_NAME/*');
```

Replace BUCKET_NAME with the name of the bucket you previously created.
Inspect the model
Inspect the uploaded model to see what its input and output fields are:
Go to the BigQuery page.

In the left pane, click Explorer.

In the Explorer pane, expand your project, click Datasets, and then click the `resnet_inference_test` dataset.

Go to the Models tab.

Click the `resnet` model.

In the model pane that opens, click the Schema tab.

Look at the Labels section. This identifies the fields that are output by the model. In this case, the field name value is `activation_49`.

Look at the Features section. This identifies the fields that must be input into the model. You reference them in the `SELECT` statement for the `ML.DECODE_IMAGE` function. In this case, the field name value is `input_1`.
Run inference
Run inference on the `vision_images` object table using the `resnet` model:
Go to the BigQuery page.

In the Editor pane, run the following SQL statement:

```sql
SELECT *
FROM ML.PREDICT(
  MODEL `resnet_inference_test.resnet`,
  (
    SELECT
      uri,
      ML.RESIZE_IMAGE(ML.DECODE_IMAGE(data), 224, 224, FALSE) AS input_1
    FROM resnet_inference_test.vision_images
  )
);
```
The results should look similar to the following:
| activation_49 | uri | input_1 |
| --- | --- | --- |
| 1.0254175464297077e-07 | gs://cloud-samples-data/vision/automl_classification/flowers/daisy/21652746_cc379e0eea_m.jpg | 0.0 |
| 2.1671139620593749e-06 | | 0.0 |
| 8.346052027263795e-08 | | 0.0 |
| 1.159310958342985e-08 | | 0.0 |
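Each value in the `activation_49` column is one entry of a logits vector; the class with the largest logit is the model's prediction. A minimal NumPy sketch (not part of the tutorial; the 5-element vector is a made-up example, the real output is much longer) of turning logits into a predicted class index:

```python
import numpy as np

# Hypothetical logits for a 5-class model.
logits = np.array([0.1, 2.3, -1.0, 0.5, 1.7])

# Softmax converts logits to probabilities (numerically stabilized
# by subtracting the max); argmax picks the most likely class.
probs = np.exp(logits - logits.max())
probs /= probs.sum()
predicted_class = int(np.argmax(logits))
```

You could apply the same post-processing to the inference results after exporting them from BigQuery.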
Clean up
Except as otherwise noted, the content of this page is licensed under theCreative Commons Attribution 4.0 License, and code samples are licensed under theApache 2.0 License. For details, see theGoogle Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-12-15 UTC.