Export feature values

Vertex AI Feature Store (Legacy) is deprecated. Beginning on May 17, 2026, no new features will be added and only critical patches will be provided. On February 17, 2027, the service will be fully sunset and APIs will no longer be available.

For continued support and faster innovation, migrate to Vertex AI Feature Store (V2), our integrated platform for machine learning (ML) feature management introduced on November 17, 2023.

Export feature values for all entities of a single entity type to a BigQuery table or a Cloud Storage bucket. You can choose to get a snapshot or to fully export feature values. A snapshot returns a single value per feature, compared to a full export, which can return multiple values per feature. You can't select particular entity IDs or include multiple entity types when exporting feature values.

Exporting feature values is useful for archiving or for performing ad hoc analysis on your data. For example, you can store regular snapshots of your featurestore to save its state at different points in time. If you need to get feature values for building a training dataset, use batch serving instead.

Snapshot and full export comparison

Both the snapshot and full export options let you query data by specifying a single timestamp (either the start time or the end time) or both timestamps. For snapshots, Vertex AI Feature Store (Legacy) returns the latest feature value within a given time range. In the output, the timestamp associated with each feature value is the snapshot timestamp (not the feature value timestamp).

For full exports, Vertex AI Feature Store (Legacy) returns all feature values within a given time range. In the output, the timestamp associated with each feature value is the feature timestamp (the timestamp specified when the feature value was ingested).

The following summarizes what Vertex AI Feature Store (Legacy) returns based on the option that you choose and the timestamps that you provide.

  • Snapshot, start time only (inclusive): Starting with the current time (when the request was received), returns the latest value, looking back until the start time. The snapshot timestamp is set to the current time.
  • Snapshot, end time only (inclusive): Starting with the end time, returns the latest value, looking back to the very first value for each feature. The snapshot timestamp is set to the specified end time.
  • Snapshot, start and end time (inclusive): Returns the latest value within the specified time range. The snapshot timestamp is set to the specified end time.
  • Full export, start time only (inclusive): Returns all values on and after the start time and up to the current time (when the request was sent).
  • Full export, end time only (inclusive): Returns all values up to the end time, going all the way back to the very first value for each feature.
  • Full export, start and end time (inclusive): Returns all values within the specified time range.

Null values

For snapshots, if the latest feature value is null at a given timestamp, Vertex AI Feature Store (Legacy) returns the previous non-null feature value. If there are no previous non-null values, Vertex AI Feature Store (Legacy) returns null.

For full exports, if a feature value is null at a given timestamp, Vertex AI Feature Store (Legacy) returns null for that timestamp.

Examples

As an example, assume you had the following values in a featurestore, where the values for Feature_A and Feature_B share the same timestamp:

Entity ID | Feature value timestamp | Feature_A | Feature_B
123 | T1 | A_T1 | B_T1
123 | T2 | A_T2 | NULL
123 | T3 | A_T3 | NULL
123 | T4 | A_T4 | B_T4
123 | T5 | NULL | B_T5

Snapshot

For snapshots, Vertex AI Feature Store (Legacy) returns the following values based on the given timestamp values:

  • If only the start time is set to T3, the snapshot returns the following values:

    Entity ID | Snapshot timestamp | Feature_A | Feature_B
    123 | CURRENT_TIME | A_T4 | B_T5

  • If only the end time is set to T3, the snapshot returns the following values:

    Entity ID | Snapshot timestamp | Feature_A | Feature_B
    123 | T3 | A_T3 | B_T1

  • If the start and end times are set to T2 and T3, the snapshot returns the following values:

    Entity ID | Snapshot timestamp | Feature_A | Feature_B
    123 | T3 | A_T3 | NULL

Full export

For full exports, Vertex AI Feature Store (Legacy) returns the following values based on the given timestamp values:

  • If only the start time is set to T3, the full export returns the following values:

    Entity ID | Feature value timestamp | Feature_A | Feature_B
    123 | T3 | A_T3 | NULL
    123 | T4 | A_T4 | B_T4
    123 | T5 | NULL | B_T5

  • If only the end time is set to T3, the full export returns the following values:

    Entity ID | Feature value timestamp | Feature_A | Feature_B
    123 | T1 | A_T1 | B_T1
    123 | T2 | A_T2 | NULL
    123 | T3 | A_T3 | NULL

  • If the start and end times are set to T2 and T4, the full export returns the following values:

    Entity ID | Feature value timestamp | Feature_A | Feature_B
    123 | T2 | A_T2 | NULL
    123 | T3 | A_T3 | NULL
    123 | T4 | A_T4 | B_T4

Export feature values

When you export feature values, you choose which features to query and whether it is a snapshot or a full export. The following sections show a sample for each option.

For both options, the output destination must be in the same region as the source featurestore. For example, if your featurestore is in us-central1, then the destination Cloud Storage bucket or BigQuery table must also be in us-central1.

Snapshot

Export the latest feature values for a given time range.

Web UI

Use another method. You cannot export feature values from the Google Cloud console.

REST

To export feature values, send a POST request by using the entityTypes.exportFeatureValues method.

The following sample outputs a BigQuery table, but you can also output to a Cloud Storage bucket. Each output destination might have some prerequisites before you can submit a request. For example, if you specify a table name for the bigqueryDestination field, you must have an existing dataset. These requirements are documented in the API reference.
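If you export to a Cloud Storage bucket instead, you replace the bigqueryDestination block in the request body with a Cloud Storage destination. As a rough sketch only, a CSV export to a bucket might use a destination block like the following, where gs://BUCKET_NAME/FOLDER/ is a placeholder output prefix; confirm the exact field names (csvDestination, tfrecordDestination, gcsDestination) in the entityTypes.exportFeatureValues API reference before using it:

"destination": {
  "csvDestination": {
    "gcsDestination": {
      "outputUriPrefix": "gs://BUCKET_NAME/FOLDER/"
    }
  }
}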

Before using any of the request data, make the following replacements:

  • LOCATION_ID: Region where the featurestore is located. For example, us-central1.
  • PROJECT_ID: Your project ID.
  • FEATURESTORE_ID: ID of the featurestore.
  • ENTITY_TYPE_ID: ID of the entity type.
  • START_TIME and END_TIME: (Optional) Timestamps in RFC 3339 format, for example 2021-12-03T22:55:25Z. If you specify the start time only, returns the latest value starting from the current time (when the request is sent) and looking back until the start time. If you specify the end time only, returns the latest value starting from the end time (inclusive) and looking back to the very first value. If you specify a start time and end time, returns the latest value within the specified time range (inclusive). If you specify neither, returns the latest values for each feature, starting from the current time and looking back to the very first value.
  • DATASET_NAME: Name of the destination BigQuery dataset.
  • TABLE_NAME: Name of the destination BigQuery table.
  • FEATURE_ID: ID of one or more features. Specify a single * (asterisk) to select all features.

HTTP method and URL:

POST https://LOCATION_ID-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:exportFeatureValues

Request JSON body:

{  "snapshotExport": {    "start_time": "START_TIME",    "snapshot_time": "END_TIME"  },  "destination" : {    "bigqueryDestination": {      "outputUri": "bq://PROJECT_ID.DATASET_NAME.TABLE_NAME"    }  },  "featureSelector": {    "idMatcher": {      "ids": ["FEATURE_ID", ...]    }  }}

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://LOCATION_ID-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:exportFeatureValues"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://LOCATION_ID-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:exportFeatureValues" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{  "name": "projects/PROJECT_NUMBER/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID/operations/OPERATION_ID",  "metadata": {    "@type": "type.googleapis.com/google.cloud.aiplatform.v1.ExportFeatureValuesOperationMetadata",    "genericMetadata": {      "createTime": "2021-12-03T22:55:25.974976Z",      "updateTime": "2021-12-03T22:55:25.974976Z"    }  }}

Java

Before trying this sample, follow the Java setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Java API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import com.google.api.gax.longrunning.OperationFuture;
import com.google.cloud.aiplatform.v1.BigQueryDestination;
import com.google.cloud.aiplatform.v1.EntityTypeName;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesOperationMetadata;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesRequest;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesRequest.SnapshotExport;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesResponse;
import com.google.cloud.aiplatform.v1.FeatureSelector;
import com.google.cloud.aiplatform.v1.FeatureValueDestination;
import com.google.cloud.aiplatform.v1.FeaturestoreServiceClient;
import com.google.cloud.aiplatform.v1.FeaturestoreServiceSettings;
import com.google.cloud.aiplatform.v1.IdMatcher;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class ExportFeatureValuesSnapshotSample {

  public static void main(String[] args)
      throws IOException, InterruptedException, ExecutionException, TimeoutException {
    // TODO(developer): Replace these variables before running the sample.
    String project = "YOUR_PROJECT_ID";
    String featurestoreId = "YOUR_FEATURESTORE_ID";
    String entityTypeId = "YOUR_ENTITY_TYPE_ID";
    String destinationTableUri = "YOUR_DESTINATION_TABLE_URI";
    List<String> featureSelectorIds = Arrays.asList("title", "genres", "average_rating");
    String location = "us-central1";
    String endpoint = "us-central1-aiplatform.googleapis.com:443";
    int timeout = 300;
    exportFeatureValuesSnapshotSample(
        project,
        featurestoreId,
        entityTypeId,
        destinationTableUri,
        featureSelectorIds,
        location,
        endpoint,
        timeout);
  }

  static void exportFeatureValuesSnapshotSample(
      String project,
      String featurestoreId,
      String entityTypeId,
      String destinationTableUri,
      List<String> featureSelectorIds,
      String location,
      String endpoint,
      int timeout)
      throws IOException, InterruptedException, ExecutionException, TimeoutException {
    FeaturestoreServiceSettings featurestoreServiceSettings =
        FeaturestoreServiceSettings.newBuilder().setEndpoint(endpoint).build();

    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (FeaturestoreServiceClient featurestoreServiceClient =
        FeaturestoreServiceClient.create(featurestoreServiceSettings)) {

      FeatureSelector featureSelector =
          FeatureSelector.newBuilder()
              .setIdMatcher(IdMatcher.newBuilder().addAllIds(featureSelectorIds).build())
              .build();

      ExportFeatureValuesRequest exportFeatureValuesRequest =
          ExportFeatureValuesRequest.newBuilder()
              .setEntityType(
                  EntityTypeName.of(project, location, featurestoreId, entityTypeId).toString())
              .setDestination(
                  FeatureValueDestination.newBuilder()
                      .setBigqueryDestination(
                          BigQueryDestination.newBuilder().setOutputUri(destinationTableUri)))
              .setFeatureSelector(featureSelector)
              .setSnapshotExport(SnapshotExport.newBuilder())
              .build();

      OperationFuture<ExportFeatureValuesResponse, ExportFeatureValuesOperationMetadata>
          exportFeatureValuesFuture =
              featurestoreServiceClient.exportFeatureValuesAsync(exportFeatureValuesRequest);
      System.out.format(
          "Operation name: %s%n", exportFeatureValuesFuture.getInitialFuture().get().getName());
      System.out.println("Waiting for operation to finish...");
      ExportFeatureValuesResponse exportFeatureValuesResponse =
          exportFeatureValuesFuture.get(timeout, TimeUnit.SECONDS);
      System.out.println("Snapshot Export Feature Values Response");
      System.out.println(exportFeatureValuesResponse);
      featurestoreServiceClient.close();
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Node.js API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

/**
 * TODO(developer): Uncomment these variables before running the sample.
 * (Not necessary if passing values as arguments)
 */
// const project = 'YOUR_PROJECT_ID';
// const featurestoreId = 'YOUR_FEATURESTORE_ID';
// const entityTypeId = 'YOUR_ENTITY_TYPE_ID';
// const destinationTableUri = 'YOUR_BQ_DESTINATION_TABLE_URI';
// const timestamp = <STARTING_TIMESTAMP_OF_SNAPSHOT_IN_SECONDS>;
// const location = 'YOUR_PROJECT_LOCATION';
// const apiEndpoint = 'YOUR_API_ENDPOINT';
// const timeout = <TIMEOUT_IN_MILLI_SECONDS>;

// Imports the Google Cloud Featurestore Service Client library
const {FeaturestoreServiceClient} = require('@google-cloud/aiplatform').v1;

// Specifies the location of the api endpoint
const clientOptions = {
  apiEndpoint: apiEndpoint,
};

// Instantiates a client
const featurestoreServiceClient = new FeaturestoreServiceClient(clientOptions);

async function exportFeatureValuesSnapshot() {
  // Configure the entityType resource
  const entityType = `projects/${project}/locations/${location}/featurestores/${featurestoreId}/entityTypes/${entityTypeId}`;

  const destination = {
    bigqueryDestination: {
      // Output to BigQuery table created earlier
      outputUri: destinationTableUri,
    },
  };

  const featureSelector = {
    idMatcher: {
      ids: ['age', 'gender', 'liked_genres'],
    },
  };

  const snapshotExport = {
    startTime: {
      seconds: Number(timestamp),
    },
  };

  const request = {
    entityType: entityType,
    destination: destination,
    featureSelector: featureSelector,
    snapshotExport: snapshotExport,
  };

  // Export Feature Values Request
  const [operation] = await featurestoreServiceClient.exportFeatureValues(request, {
    timeout: Number(timeout),
  });
  const [response] = await operation.promise();

  console.log('Export feature values snapshot response');
  console.log('Raw response:');
  console.log(JSON.stringify(response, null, 2));
}
exportFeatureValuesSnapshot();

Additional languages

To learn how to install and use the Vertex AI SDK for Python, see Use the Vertex AI SDK for Python. For more information, see the Vertex AI SDK for Python API reference documentation.
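This page doesn't include a Python sample for the snapshot export. If you work in Python, the following is a minimal sketch that calls the same v1 API as the Java and Node.js samples above, using the lower-level google.cloud.aiplatform_v1 client rather than the high-level SDK. The YOUR_* values are placeholders that you must replace, and you should verify the call against the Python API reference before relying on it:

from google.cloud import aiplatform_v1

# Placeholder values -- replace before running.
PROJECT_ID = "YOUR_PROJECT_ID"
LOCATION_ID = "us-central1"
FEATURESTORE_ID = "YOUR_FEATURESTORE_ID"
ENTITY_TYPE_ID = "YOUR_ENTITY_TYPE_ID"
DESTINATION_TABLE_URI = "bq://YOUR_PROJECT_ID.YOUR_DATASET.YOUR_TABLE"

# The client must target the regional endpoint that hosts the featurestore.
client = aiplatform_v1.FeaturestoreServiceClient(
    client_options={"api_endpoint": f"{LOCATION_ID}-aiplatform.googleapis.com"}
)

entity_type = (
    f"projects/{PROJECT_ID}/locations/{LOCATION_ID}"
    f"/featurestores/{FEATURESTORE_ID}/entityTypes/{ENTITY_TYPE_ID}"
)

request = aiplatform_v1.ExportFeatureValuesRequest(
    entity_type=entity_type,
    # An empty SnapshotExport exports the latest value of each feature with no
    # time bound. Set start_time and/or snapshot_time on it to bound the time
    # range, as described earlier on this page.
    snapshot_export=aiplatform_v1.ExportFeatureValuesRequest.SnapshotExport(),
    destination=aiplatform_v1.FeatureValueDestination(
        bigquery_destination=aiplatform_v1.BigQueryDestination(
            output_uri=DESTINATION_TABLE_URI
        )
    ),
    feature_selector=aiplatform_v1.FeatureSelector(
        id_matcher=aiplatform_v1.IdMatcher(ids=["*"])  # "*" selects all features
    ),
)

# exportFeatureValues is a long-running operation; result() blocks until it finishes.
operation = client.export_feature_values(request=request)
print("Waiting for operation to finish...")
response = operation.result(timeout=300)
print("Snapshot export response:", response)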

Full export

Export all feature values within a given time range.

Web UI

Use another method. You cannot export feature values from the Google Cloud console.

REST

To export feature values, send a POST request by using the entityTypes.exportFeatureValues method.

The following sample outputs a BigQuery table, but you can also output to a Cloud Storage bucket. Each output destination might have some prerequisites before you can submit a request. For example, if you specify a table name for the bigqueryDestination field, you must have an existing dataset. These requirements are documented in the API reference.

Before using any of the request data, make the following replacements:

  • LOCATION_ID: Region where the featurestore is located. For example, us-central1.
  • PROJECT_ID: Your project ID.
  • FEATURESTORE_ID: ID of the featurestore.
  • ENTITY_TYPE_ID: ID of the entity type.
  • START_TIME and END_TIME: (Optional) If you specify the start time only, returns all values between the current time (when the request is sent) and the start time (inclusive). If you specify the end time only, returns all values between the end time (inclusive) and the very first value timestamp (for each feature). If you specify a start time and end time, returns all values within the specified time range (inclusive). If you specify neither, returns all values between the current time and the very first value timestamp (for each feature).
  • DATASET_NAME: Name of the destination BigQuery dataset.
  • TABLE_NAME: Name of the destination BigQuery table.
  • FEATURE_ID: ID of one or more features. Specify a single * (asterisk) to select all features.

HTTP method and URL:

POST https://LOCATION_ID-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:exportFeatureValues

Request JSON body:

{  "fullExport": {    "start_time": "START_TIME",    "end_time": "END_TIME"  },  "destination" : {    "bigqueryDestination": {      "outputUri": "bq://PROJECT.DATASET_NAME.TABLE_NAME"    }  },  "featureSelector": {    "idMatcher": {      "ids": ["FEATURE_ID", ...]    }  }}

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://LOCATION_ID-aiplatform.googleapis.com/v1/projects/PROJECT/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:exportFeatureValues"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://LOCATION_ID-aiplatform.googleapis.com/v1/projects/PROJECT/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID:exportFeatureValues" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

{  "name": "projects/PROJECT_NUMBER/locations/LOCATION_ID/featurestores/FEATURESTORE_ID/entityTypes/ENTITY_TYPE_ID/operations/OPERATION_ID",  "metadata": {    "@type": "type.googleapis.com/google.cloud.aiplatform.v1.ExportFeatureValuesOperationMetadata",    "genericMetadata": {      "createTime": "2021-12-03T22:55:25.974976Z",      "updateTime": "2021-12-03T22:55:25.974976Z"    }  }}

Java

Before trying this sample, follow the Java setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Java API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

import com.google.api.gax.longrunning.OperationFuture;
import com.google.cloud.aiplatform.v1.BigQueryDestination;
import com.google.cloud.aiplatform.v1.EntityTypeName;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesOperationMetadata;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesRequest;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesRequest.FullExport;
import com.google.cloud.aiplatform.v1.ExportFeatureValuesResponse;
import com.google.cloud.aiplatform.v1.FeatureSelector;
import com.google.cloud.aiplatform.v1.FeatureValueDestination;
import com.google.cloud.aiplatform.v1.FeaturestoreServiceClient;
import com.google.cloud.aiplatform.v1.FeaturestoreServiceSettings;
import com.google.cloud.aiplatform.v1.IdMatcher;
import java.io.IOException;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class ExportFeatureValuesSample {

  public static void main(String[] args)
      throws IOException, InterruptedException, ExecutionException, TimeoutException {
    // TODO(developer): Replace these variables before running the sample.
    String project = "YOUR_PROJECT_ID";
    String featurestoreId = "YOUR_FEATURESTORE_ID";
    String entityTypeId = "YOUR_ENTITY_TYPE_ID";
    String destinationTableUri = "YOUR_DESTINATION_TABLE_URI";
    List<String> featureSelectorIds = Arrays.asList("title", "genres", "average_rating");
    String location = "us-central1";
    String endpoint = "us-central1-aiplatform.googleapis.com:443";
    int timeout = 300;
    exportFeatureValuesSample(
        project,
        featurestoreId,
        entityTypeId,
        destinationTableUri,
        featureSelectorIds,
        location,
        endpoint,
        timeout);
  }

  static void exportFeatureValuesSample(
      String project,
      String featurestoreId,
      String entityTypeId,
      String destinationTableUri,
      List<String> featureSelectorIds,
      String location,
      String endpoint,
      int timeout)
      throws IOException, InterruptedException, ExecutionException, TimeoutException {
    FeaturestoreServiceSettings featurestoreServiceSettings =
        FeaturestoreServiceSettings.newBuilder().setEndpoint(endpoint).build();

    // Initialize client that will be used to send requests. This client only needs to be created
    // once, and can be reused for multiple requests. After completing all of your requests, call
    // the "close" method on the client to safely clean up any remaining background resources.
    try (FeaturestoreServiceClient featurestoreServiceClient =
        FeaturestoreServiceClient.create(featurestoreServiceSettings)) {

      FeatureSelector featureSelector =
          FeatureSelector.newBuilder()
              .setIdMatcher(IdMatcher.newBuilder().addAllIds(featureSelectorIds).build())
              .build();

      ExportFeatureValuesRequest exportFeatureValuesRequest =
          ExportFeatureValuesRequest.newBuilder()
              .setEntityType(
                  EntityTypeName.of(project, location, featurestoreId, entityTypeId).toString())
              .setDestination(
                  FeatureValueDestination.newBuilder()
                      .setBigqueryDestination(
                          BigQueryDestination.newBuilder().setOutputUri(destinationTableUri)))
              .setFeatureSelector(featureSelector)
              .setFullExport(FullExport.newBuilder())
              .build();

      OperationFuture<ExportFeatureValuesResponse, ExportFeatureValuesOperationMetadata>
          exportFeatureValuesFuture =
              featurestoreServiceClient.exportFeatureValuesAsync(exportFeatureValuesRequest);
      System.out.format(
          "Operation name: %s%n", exportFeatureValuesFuture.getInitialFuture().get().getName());
      System.out.println("Waiting for operation to finish...");
      ExportFeatureValuesResponse exportFeatureValuesResponse =
          exportFeatureValuesFuture.get(timeout, TimeUnit.SECONDS);
      System.out.println("Export Feature Values Response");
      System.out.println(exportFeatureValuesResponse);
      featurestoreServiceClient.close();
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the Vertex AI quickstart using client libraries. For more information, see the Vertex AI Node.js API reference documentation.

To authenticate to Vertex AI, set up Application Default Credentials. For more information, see Set up authentication for a local development environment.

/**
 * TODO(developer): Uncomment these variables before running the sample.
 * (Not necessary if passing values as arguments)
 */
// const project = 'YOUR_PROJECT_ID';
// const featurestoreId = 'YOUR_FEATURESTORE_ID';
// const entityTypeId = 'YOUR_ENTITY_TYPE_ID';
// const destinationTableUri = 'YOUR_BQ_DESTINATION_TABLE_URI';
// const location = 'YOUR_PROJECT_LOCATION';
// const apiEndpoint = 'YOUR_API_ENDPOINT';
// const timeout = <TIMEOUT_IN_MILLI_SECONDS>;

// Imports the Google Cloud Featurestore Service Client library
const {FeaturestoreServiceClient} = require('@google-cloud/aiplatform').v1;

// Specifies the location of the api endpoint
const clientOptions = {
  apiEndpoint: apiEndpoint,
};

// Instantiates a client
const featurestoreServiceClient = new FeaturestoreServiceClient(clientOptions);

async function exportFeatureValues() {
  // Configure the entityType resource
  const entityType = `projects/${project}/locations/${location}/featurestores/${featurestoreId}/entityTypes/${entityTypeId}`;

  const destination = {
    bigqueryDestination: {
      // Output to BigQuery table created earlier
      outputUri: destinationTableUri,
    },
  };

  const featureSelector = {
    idMatcher: {
      ids: ['age', 'gender', 'liked_genres'],
    },
  };

  const request = {
    entityType: entityType,
    destination: destination,
    featureSelector: featureSelector,
    fullExport: {},
  };

  // Export Feature Values Request
  const [operation] = await featurestoreServiceClient.exportFeatureValues(request, {
    timeout: Number(timeout),
  });
  const [response] = await operation.promise();

  console.log('Export feature values response');
  console.log('Raw response:');
  console.log(JSON.stringify(response, null, 2));
}
exportFeatureValues();

Additional languages

To learn how to install and use the Vertex AI SDK for Python, see Use the Vertex AI SDK for Python. For more information, see the Vertex AI SDK for Python API reference documentation.
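As with the snapshot case, no Python sample is shown for the full export. The following sketch mirrors the Python snapshot sketch earlier on this page, again using the lower-level google.cloud.aiplatform_v1 client; the only substantive change is setting full_export (with optional start_time and end_time bounds) instead of snapshot_export. All YOUR_* values and the datetime bounds are placeholders; verify the fields against the Python API reference before using this:

from datetime import datetime, timezone

from google.cloud import aiplatform_v1

# Placeholder values -- replace before running.
PROJECT_ID = "YOUR_PROJECT_ID"
LOCATION_ID = "us-central1"
FEATURESTORE_ID = "YOUR_FEATURESTORE_ID"
ENTITY_TYPE_ID = "YOUR_ENTITY_TYPE_ID"
DESTINATION_TABLE_URI = "bq://YOUR_PROJECT_ID.YOUR_DATASET.YOUR_TABLE"

client = aiplatform_v1.FeaturestoreServiceClient(
    client_options={"api_endpoint": f"{LOCATION_ID}-aiplatform.googleapis.com"}
)

request = aiplatform_v1.ExportFeatureValuesRequest(
    entity_type=(
        f"projects/{PROJECT_ID}/locations/{LOCATION_ID}"
        f"/featurestores/{FEATURESTORE_ID}/entityTypes/{ENTITY_TYPE_ID}"
    ),
    # FullExport returns every value within the optional time range,
    # not just the latest value per feature.
    full_export=aiplatform_v1.ExportFeatureValuesRequest.FullExport(
        start_time=datetime(2021, 12, 1, tzinfo=timezone.utc),   # example bound
        end_time=datetime(2021, 12, 31, tzinfo=timezone.utc),    # example bound
    ),
    destination=aiplatform_v1.FeatureValueDestination(
        bigquery_destination=aiplatform_v1.BigQueryDestination(
            output_uri=DESTINATION_TABLE_URI
        )
    ),
    feature_selector=aiplatform_v1.FeatureSelector(
        id_matcher=aiplatform_v1.IdMatcher(ids=["*"])  # "*" selects all features
    ),
)

# Long-running operation; result() blocks until the export completes.
operation = client.export_feature_values(request=request)
print(operation.result(timeout=300))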

