Tutorial: Run inference on an object table by using a feature vector model

This tutorial shows you how to create an object table based on the images from the flowers dataset, and then run inference on that object table using the MobileNet V3 model.

The MobileNet V3 model

The MobileNet V3 model analyzes image files and returns a feature vector array. The feature vector array is a list of numerical elements that describe the characteristics of the images analyzed. Each feature vector describes a multi-dimensional feature space and provides the coordinates of the image in this space. You can use the feature vector information for an image to further classify the image, for example by using cosine similarity to group similar images.

The MobileNet V3 model input takes a tensor of DType tf.float32 in the shape [-1, 224, 224, 3]. The output is an array of tensors of tf.float32 in the shape [-1, 1024].
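Because the model expects this fixed input shape, images must be decoded and resized before prediction. The following is a minimal sketch of producing correctly shaped inputs, assuming the sample_images object table that this tutorial creates later:

```sql
-- Decode each image from the object table's data column and resize it
-- to the 224x224x3 float tensor shape that MobileNet V3 expects.
SELECT
  uri,
  ML.RESIZE_IMAGE(ML.DECODE_IMAGE(data), 224, 224, FALSE) AS inputs
FROM mobilenet_inference_test.sample_images
LIMIT 5;
```

The same decode-and-resize expression appears in the inference query at the end of this tutorial.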

Required permissions

  • To create the dataset, you need the bigquery.datasets.create permission.
  • To create the connection resource, you need the following permissions:

    • bigquery.connections.create
    • bigquery.connections.get
  • To grant permissions to the connection's service account, you need the following permission:

    • resourcemanager.projects.setIamPolicy
  • To create the object table, you need the following permissions:

    • bigquery.tables.create
    • bigquery.tables.update
    • bigquery.connections.delegate
  • To create the bucket, you need the storage.buckets.create permission.

  • To upload the dataset and model to Cloud Storage, you need the storage.objects.create and storage.objects.get permissions.

  • To load the model into BigQuery ML, you need the following permissions:

    • bigquery.jobs.create
    • bigquery.models.create
    • bigquery.models.getData
    • bigquery.models.updateData
  • To run inference, you need the following permissions:

    • bigquery.tables.getData on the object table
    • bigquery.models.getData on the model
    • bigquery.jobs.create

Costs

In this document, you use the following billable components of Google Cloud:

  • BigQuery: You incur storage costs for the object table you create in BigQuery.
  • BigQuery ML: You incur costs for the model you create and the inference you perform in BigQuery ML.
  • Cloud Storage: You incur costs for the objects you store in Cloud Storage.

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

For more information on BigQuery storage pricing, see Storage pricing in the BigQuery documentation.

For more information on BigQuery ML pricing, see BigQuery ML pricing in the BigQuery documentation.

For more information on Cloud Storage pricing, see the Cloud Storage pricing page.

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  3. Verify that billing is enabled for your Google Cloud project.

  4. Enable the BigQuery and BigQuery Connection APIs.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.

    Enable the APIs


Create a reservation

To use an imported model with an object table, you must create a reservation that uses the BigQuery Enterprise or Enterprise Plus edition, and then create a reservation assignment that uses the QUERY job type.
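As one option, you can create the reservation and assignment with SQL DDL statements. The following is a hedged sketch, not the only way to do this; the reservation name, assignment name, region, and slot capacity are example values you would replace:

```sql
-- Sketch: create an Enterprise edition reservation in the us region.
-- The reservation name and slot capacity are example values.
CREATE RESERVATION `PROJECT_ID.region-us.inference_reservation`
OPTIONS (
  edition = 'ENTERPRISE',
  slot_capacity = 100);

-- Assign the reservation to the project for QUERY jobs.
CREATE ASSIGNMENT `PROJECT_ID.region-us.inference_reservation.inference_assignment`
OPTIONS (
  assignee = 'projects/PROJECT_ID',
  job_type = 'QUERY');
```

You can also create reservations and assignments in the Google Cloud console or with the bq command-line tool; see the BigQuery reservations documentation for the authoritative syntax.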

Create a dataset

Create a dataset named mobilenet_inference_test:

SQL

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

    CREATE SCHEMA `PROJECT_ID.mobilenet_inference_test`;

    Replace PROJECT_ID with your project ID.

bq

  1. In the Google Cloud console, activate Cloud Shell.

    Activate Cloud Shell

  2. Run the bq mk command to create the dataset:

    bq mk --dataset --location=us PROJECT_ID:mobilenet_inference_test

    Replace PROJECT_ID with your project ID.

Create a connection

Create a connection named lake-connection:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the left pane, click Explorer:

    Highlighted button for the Explorer pane.

    If you don't see the left pane, click Expand left pane to open the pane.

  3. In the Explorer pane, click Add data.

    The Add data dialog opens.

  4. In the Filter By pane, in the Data Source Type section, select Databases.

    Alternatively, in the Search for data sources field, you can enter Vertex AI.

  5. In the Featured data sources section, click Vertex AI.

  6. Click the Vertex AI Models: BigQuery Federation solution card.

  7. In the Connection type list, select Vertex AI remote models, remote functions, BigLake and Spanner (Cloud Resource).

  8. In the Connection ID field, type lake-connection.

  9. Click Create connection.

  10. In the Explorer pane, expand your project, click Connections, and select the us.lake-connection connection.

  11. In the Connection info pane, copy the value from the Service account id field. You need this information to grant permission to the connection's service account on the Cloud Storage bucket that you create in the next step.

bq

  1. In Cloud Shell, run the bq mk command to create the connection:

    bq mk --connection --location=us --connection_type=CLOUD_RESOURCE \
        lake-connection

  2. Run the bq show command to retrieve information about the connection:

    bq show --connection us.lake-connection

  3. From the properties column, copy the value of the serviceAccountId property and save it somewhere. You need this information to grant permissions to the connection's service account.

Create a Cloud Storage bucket

  1. Create a Cloud Storage bucket.
  2. Create two folders in the bucket, one named mobilenet for the model files and one named flowers for the dataset.

Grant permissions to the connection's service account

Console

  1. Go to the IAM & Admin page.

    Go to IAM & Admin

  2. Click Grant Access.

    The Add principals dialog opens.

  3. In the New principals field, enter the service account ID that you copied earlier.

  4. In the Select a role field, select Cloud Storage, and then select Storage Object Viewer.

  5. Click Save.

gcloud

In Cloud Shell, run the gcloud storage buckets add-iam-policy-binding command:

gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
    --member=serviceAccount:MEMBER \
    --role=roles/storage.objectViewer

Replace MEMBER with the service account ID that you copied earlier. Replace BUCKET_NAME with the name of the bucket you previously created.

For more information, see Add a principal to a bucket-level policy.

Note: There can be a delay of up to a minute before new permissions take effect.

Upload the dataset to Cloud Storage

Get the dataset files and make them available in Cloud Storage:

  1. Download the flowers dataset to your local machine.
  2. Unzip the flower_photos.tgz file.
  3. Upload the flower_photos folder to the flowers folder in the bucket you previously created.
  4. After the upload completes, delete the LICENSE.txt file in the flower_photos folder.

Create an object table

Create an object table named sample_images based on the flowers dataset you uploaded:

SQL

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

    CREATE EXTERNAL TABLE mobilenet_inference_test.sample_images
    WITH CONNECTION `us.lake-connection`
    OPTIONS(
      object_metadata = 'SIMPLE',
      uris = ['gs://BUCKET_NAME/flowers/*']);

    Replace BUCKET_NAME with the name of the bucket you previously created.

bq

In Cloud Shell, run the bq mk command to create the object table:

bq mk --table \
    --external_table_definition='gs://BUCKET_NAME/flowers/*@us.lake-connection' \
    --object_metadata=SIMPLE \
    mobilenet_inference_test.sample_images

Replace BUCKET_NAME with the name of the bucket you previously created.
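After the object table exists, you can optionally confirm that it lists the uploaded images by querying its metadata columns. This is a sketch that assumes the standard object table metadata columns (uri, content_type, size):

```sql
-- List a few of the image objects with their content type and size in bytes.
SELECT uri, content_type, size
FROM mobilenet_inference_test.sample_images
LIMIT 5;
```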

Upload the model to Cloud Storage

Get the model files and make them available in Cloud Storage:

  1. Download the MobileNet V3 model to your local machine. This gives you a saved_model.pb file and a variables folder for the model.
  2. Upload the saved_model.pb file and the variables folder to the mobilenet folder in the bucket you previously created.

Load the model into BigQuery ML

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

    CREATE MODEL `mobilenet_inference_test.mobilenet`
    OPTIONS(
      model_type = 'TENSORFLOW',
      model_path = 'gs://BUCKET_NAME/mobilenet/*');

    Replace BUCKET_NAME with the name of the bucket you previously created.

Inspect the model

Inspect the uploaded model to see what its input and output fields are:

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the left pane, click Explorer:

    Highlighted button for the Explorer pane.

  3. In the Explorer pane, expand your project, click Datasets, and then click the mobilenet_inference_test dataset.

  4. Go to the Models tab.

  5. Click the mobilenet model.

  6. In the model pane that opens, click the Schema tab.

  7. Look at the Labels section. This identifies the fields that are output by the model. In this case, the field name value is feature_vector.

  8. Look at the Features section. This identifies the fields that must be input into the model. You reference them in the SELECT statement for the ML.DECODE_IMAGE function. In this case, the field name value is inputs.

Run inference

Run inference on the sample_images object table using the mobilenet model:

  1. Go to the BigQuery page.

    Go to BigQuery

  2. In the Editor pane, run the following SQL statement:

    SELECT *
    FROM ML.PREDICT(
      MODEL `mobilenet_inference_test.mobilenet`,
      (SELECT
        uri,
        ML.RESIZE_IMAGE(ML.DECODE_IMAGE(data), 224, 224, FALSE) AS inputs
      FROM mobilenet_inference_test.sample_images));

    The results should look similar to the following:

    +-----------------------+-----------------------------------------------------------+----------------------+
    | feature_vector        | uri                                                       | inputs               |
    +-----------------------+-----------------------------------------------------------+----------------------+
    |  0.850297749042511    | gs://mybucket/flowers/dandelion/3844111216_742ea491a0.jpg | 0.29019609093666077  |
    | -0.27427938580513     |                                                           | 0.31372550129890442  |
    | -0.23189745843410492  |                                                           | 0.039215687662363052 |
    | -0.058292809873819351 |                                                           | 0.29985997080802917  |
    +-----------------------+-----------------------------------------------------------+----------------------+
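As noted at the start of this tutorial, you can use the returned feature vectors to group similar images, for example with cosine similarity. The following is a hedged sketch that assumes you have materialized the ML.PREDICT output into a hypothetical table named mobilenet_inference_test.embeddings with uri and feature_vector columns:

```sql
-- Pairwise cosine similarity between image feature vectors.
-- Assumes embeddings were saved from the ML.PREDICT output above;
-- the embeddings table name is an example, not created by this tutorial.
SELECT
  a.uri AS uri_a,
  b.uri AS uri_b,
  (
    -- Dot product divided by the product of the vector norms.
    SELECT SUM(va * vb) / (SQRT(SUM(va * va)) * SQRT(SUM(vb * vb)))
    FROM UNNEST(a.feature_vector) AS va WITH OFFSET i
    JOIN UNNEST(b.feature_vector) AS vb WITH OFFSET j
      ON i = j
  ) AS cosine_similarity
FROM mobilenet_inference_test.embeddings AS a
JOIN mobilenet_inference_test.embeddings AS b
  ON a.uri < b.uri
ORDER BY cosine_similarity DESC;
```

Image pairs with a cosine similarity close to 1 are candidates for grouping together.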

Clean up

    Caution: Deleting a project has the following effects:
    • Everything in the project is deleted. If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.
    • Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project.

    If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects can help you avoid exceeding project quota limits.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-15 UTC.