Use continuous tuning for Gemini models

Continuous tuning lets you continue tuning an already tuned model or model checkpoint by adding more epochs or training examples. Using an already tuned model or checkpoint as a base model allows for more efficient tuning experimentation.

You can use continuous tuning for the following purposes:

  • To tune with more data if an existing tuned model is underfitting.
  • To boost performance or keep the model up to date with new data.
  • To further customize an existing tuned model.

The following Gemini models support continuous tuning:

For detailed information about Gemini model versions, see Google models and Model versions and lifecycle.

Configure continuous tuning

When creating a continuous tuning job, note the following:

  • Continuous tuning is supported in the Google Gen AI SDK. It isn't supported in the Vertex AI SDK for Python.
  • You must provide a model resource name:

    • In the Google Cloud console, the model resource name appears in the Vertex AI Tuning page, in the Tuning details > Model Name field.
    • The model resource name uses the following format:
    projects/{project}/locations/{location}/models/{modelId}@{version_id}
    • {version_id} is optional and can be either the generated version ID or a user-provided version alias. If it is omitted, the default version is used.
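The resource name format above can be assembled as follows. This is a minimal sketch; the project, location, model ID, and version values are hypothetical placeholders:

```python
# Hypothetical project, location, model ID, and version values for illustration.
project = "my-project"
location = "us-central1"
model_id = "1234567890"
version_id = "2"  # a generated version ID or a user-provided alias

# Fully qualified model resource name with an explicit version:
name = f"projects/{project}/locations/{location}/models/{model_id}@{version_id}"
print(name)

# With the @{version_id} suffix omitted, the default version is used:
default_name = f"projects/{project}/locations/{location}/models/{model_id}"
print(default_name)
```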

  • If you're using a checkpoint as a base model and don't specify a checkpoint ID, the default checkpoint is used. For more information, see Use checkpoints in supervised fine-tuning for Gemini models. In the Google Cloud console, you can find the default checkpoint as follows:

    1. Go to the Model Registry page.
    2. Click the Model Name of the model.
    3. Click View all versions.
    4. Click the desired version to view a list of checkpoints. The default checkpoint is indicated by the word default next to the checkpoint ID.
  • By default, a new model version is created under the same parent model as the pre-tuned model. If you supply a new tuned model display name, a new model is created.

  • Only models tuned with supervised tuning on or after July 11, 2025 can be used as base models for continuous tuning.

  • If you're using customer-managed encryption keys (CMEK), your continuous tuning job must use the same CMEK that was used in the tuning job for the pre-tuned model.
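The notes above shape the preTunedModel and encryptionSpec portions of the request body for a tuning job. As a minimal sketch of just those fields (all resource names here are hypothetical; substitute your own values):

```python
import json

# Hypothetical resource names for illustration; substitute your own values.
body = {
    "preTunedModel": {
        # Resource name of the pre-tuned model to continue tuning.
        "tunedModelName": "projects/my-project/locations/us-central1/models/1234567890@1",
        # Omit "checkpointId" to tune from the default checkpoint.
        "checkpointId": "2",
    },
    # Must match the CMEK used by the original tuning job, if any.
    "encryptionSpec": {
        "kmsKeyName": "projects/my-project/locations/us-central1/keyRings/my-ring/cryptoKeys/my-key"
    },
}
print(json.dumps(body, indent=2))
```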

Console

To configure continuous tuning for a pre-tuned model by using the Google Cloud console, perform the following steps:

  1. In the Vertex AI section of the Google Cloud console, go to the Vertex AI Studio page.

    Go to Vertex AI Studio

  2. Click Create tuned model.

  3. Under Model details, configure the following:

    1. Choose Tune a pre-tuned model.
    2. In the Pre-tuned model field, choose the name of your pre-tuned model.
    3. If the model has at least one checkpoint, the Checkpoint drop-down field appears. Choose the desired checkpoint.
  4. Click Continue.

REST

To configure continuous tuning, send a POST request by using the tuningJobs.create method. Some parameters aren't supported by all models. Ensure that you include only the parameters that apply to the model that you're tuning.

Before using any of the request data, make the following replacements:

  • Parameters for continuous tuning:
    • TUNED_MODEL_NAME: Name of the tuned model to use.
    • CHECKPOINT_ID: Optional. ID of the checkpoint to use.
  • The remaining parameters are the same as for supervised fine-tuning or preference tuning.

HTTP method and URL:

POST https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs

Request JSON body:

{  "preTunedModel": {      "tunedModelName": "TUNED_MODEL_NAME",      "checkpointId": "CHECKPOINT_ID",  },  "supervisedTuningSpec" : {      "trainingDatasetUri": "TRAINING_DATASET_URI",      "validationDatasetUri": "VALIDATION_DATASET_URI",      "hyperParameters": {          "epochCount":EPOCH_COUNT,          "adapterSize": "ADAPTER_SIZE",          "learningRateMultiplier": "LEARNING_RATE_MULTIPLIER"      },      "exportLastCheckpointOnly":EXPORT_LAST_CHECKPOINT_ONLY,      "evaluationConfig": {          "metrics": [              {                  "aggregation_metrics": ["AVERAGE", "STANDARD_DEVIATION"],                  "METRIC_SPEC": {                      "METRIC_SPEC_FIELD_NAME":METRIC_SPEC_FIELD_CONTENT                  }              },          ],          "outputConfig": {              "gcs_destination": {                  "output_uri_prefix": "CLOUD_STORAGE_BUCKET"              }          },      },  },  "tunedModelDisplayName": "TUNED_MODEL_DISPLAYNAME",  "encryptionSpec": {    "kmsKeyName": "KMS_KEY_NAME"  },  "serviceAccount": "SERVICE_ACCOUNT"}

To send your request, choose one of these options:

curl

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login, or by using Cloud Shell, which automatically logs you into the gcloud CLI. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

curl -X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json; charset=utf-8" \
-d @request.json \
"https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs"

PowerShell

Note: The following command assumes that you have logged in to the gcloud CLI with your user account by running gcloud init or gcloud auth login. You can check the currently active account by running gcloud auth list.

Save the request body in a file named request.json, and execute the following command:

$cred = gcloud auth print-access-token
$headers = @{ "Authorization" = "Bearer $cred" }

Invoke-WebRequest `
-Method POST `
-Headers $headers `
-ContentType: "application/json; charset=utf-8" `
-InFile request.json `
-Uri "https://TUNING_JOB_REGION-aiplatform.googleapis.com/v1/projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs" | Select-Object -Expand Content

You should receive a JSON response similar to the following:

Response

{  "name": "projects/PROJECT_ID/locations/TUNING_JOB_REGION/tuningJobs/TUNING_JOB_ID",  "createTime":CREATE_TIME,  "updateTime":UPDATE_TIME,  "status": "STATUS",  "supervisedTuningSpec": {        "trainingDatasetUri": "TRAINING_DATASET_URI",        "validationDatasetUri": "VALIDATION_DATASET_URI",        "hyperParameters": {            "epochCount":EPOCH_COUNT,            "adapterSize": "ADAPTER_SIZE",            "learningRateMultiplier":LEARNING_RATE_MULTIPLIER        },    },  "tunedModelDisplayName": "TUNED_MODEL_DISPLAYNAME",  "encryptionSpec": {    "kmsKeyName": "KMS_KEY_NAME"  },  "serviceAccount": "SERVICE_ACCOUNT"}

Example curl command

PROJECT_ID=myproject
LOCATION=global

curl \
  -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  "https://${LOCATION}-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/${LOCATION}/tuningJobs" \
  -d \
  $'{
   "preTunedModel": "gemini-2.5-flash",
   "supervisedTuningSpec" : {
      "trainingDatasetUri": "gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_train_data.jsonl",
      "validationDatasetUri": "gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_validation_data.jsonl"
   },
   "tunedModelDisplayName": "tuned_gemini"
}'

Google Gen AI SDK

The following example shows how to configure continuous tuning by using theGoogle Gen AI SDK.

import time

from google import genai
from google.genai.types import HttpOptions, TuningDataset, CreateTuningJobConfig

# TODO(developer): Update and un-comment below lines
# tuned_model_name = "projects/123456789012/locations/us-central1/models/1234567890@1"
# checkpoint_id = "1"

client = genai.Client(http_options=HttpOptions(api_version="v1beta1"))

training_dataset = TuningDataset(
    gcs_uri="gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_train_data.jsonl",
)
validation_dataset = TuningDataset(
    gcs_uri="gs://cloud-samples-data/ai-platform/generative_ai/gemini/text/sft_validation_data.jsonl",
)

tuning_job = client.tunings.tune(
    base_model=tuned_model_name,  # Note: Using a tuned model as the base model
    training_dataset=training_dataset,
    config=CreateTuningJobConfig(
        tuned_model_display_name="Example tuning job",
        validation_dataset=validation_dataset,
        pre_tuned_model_checkpoint_id=checkpoint_id,
    ),
)

running_states = set([
    "JOB_STATE_PENDING",
    "JOB_STATE_RUNNING",
])

while tuning_job.state in running_states:
    print(tuning_job.state)
    tuning_job = client.tunings.get(name=tuning_job.name)
    time.sleep(60)

print(tuning_job.tuned_model.model)
print(tuning_job.tuned_model.endpoint)
print(tuning_job.experiment)
# Example response:
# projects/123456789012/locations/us-central1/models/1234567890@2
# projects/123456789012/locations/us-central1/endpoints/123456789012345
# projects/123456789012/locations/us-central1/metadataStores/default/contexts/tuning-experiment-2025010112345678

if tuning_job.tuned_model.checkpoints:
    for i, checkpoint in enumerate(tuning_job.tuned_model.checkpoints):
        print(f"Checkpoint {i + 1}: ", checkpoint)
    # Example response:
    # Checkpoint 1:  checkpoint_id='1' epoch=1 step=10 endpoint='projects/123456789012/locations/us-central1/endpoints/123456789000000'
    # Checkpoint 2:  checkpoint_id='2' epoch=2 step=20 endpoint='projects/123456789012/locations/us-central1/endpoints/123456789012345'

Except as otherwise noted, the content of this page is licensed under theCreative Commons Attribution 4.0 License, and code samples are licensed under theApache 2.0 License. For details, see theGoogle Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-19 UTC.