Create, upload, and use a pipeline template

A pipeline template is a resource that you can use to publish a workflow definition so that it can be reused multiple times, by a single user or by multiple users.

The Kubeflow Pipelines SDK registry client is a new client interface that you can use with a compatible registry server, such as Artifact Registry, for version control of your Kubeflow Pipelines (KFP) templates. For more information, see Use the template in a Kubeflow Pipelines SDK registry client.

This page shows you how to:

  • Create a KFP pipeline template
  • Use the Kubeflow Pipelines SDK registry client to upload the template to a pipeline template repository
  • Use the template in the Kubeflow Pipelines client

Before you begin

Before you build and run your pipeline, use the following instructions to set up your Google Cloud project and development environment in the Google Cloud console.

  1. Install v2 or later of the Kubeflow Pipelines SDK.

    pip install --upgrade "kfp>=2,<3"
Note: To upgrade to the latest version of the Kubeflow Pipelines SDK, run the following command:
pip install kfp --upgrade
If an updated version is available, running this command uninstalls KFP and installs the latest version.
  2. Install v1.15.0 or later of the Vertex AI SDK for Python.
    (Optional) Before installing, run the following command to see which version of the Vertex AI SDK for Python is installed (see also the optional Python version check after this list):

      pip freeze | grep google-cloud-aiplatform
    Note: To upgrade to the latest version of the Vertex AI SDK for Python, run the following command:
    pip install google-cloud-aiplatform --upgrade
    If an updated version is available, running this command uninstalls your installed version and installs the latest version.
  3. (Optional) Install version 390.0.0 or later of the Google Cloud CLI.

  4. Enable the Artifact Registry API.
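
(Optional) As a quick sanity check after installation, you can also read the installed versions from Python itself. This is a minimal sketch, not part of the official setup steps; it only assumes that both packages are importable and expose a __version__ attribute.

import kfp
from google.cloud import aiplatform

# Print the installed SDK versions to confirm the minimum requirements above.
print("kfp:", kfp.__version__)
print("google-cloud-aiplatform:", aiplatform.__version__)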

Configuring permissions

If you have not already set up your gcloud CLI project for Vertex AI Pipelines, follow the instructions in Configure your Google Cloud project for Vertex AI Pipelines.

Additionally, assign the following predefined Identity and Access Management roles to use Artifact Registry as the template registry:

  • roles/artifactregistry.admin: Assign this role to create and manage a repository.
  • roles/artifactregistry.repoAdmin or roles/artifactregistry.writer: Assign either of these roles to manage templates inside a repository.
  • roles/artifactregistry.reader: Assign this role to download templates from a repository.
  • roles/artifactregistry.reader: Assign this role to a service account associated with Vertex AI Pipelines to create a pipeline run from a template.

For more information about predefined Identity and Access Management roles for Artifact Registry, see Predefined Artifact Registry roles.

Use the following sample to assign roles:

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member=PRINCIPAL \
    --role=ROLE

Replace the following:

  • PROJECT_ID: The project where you want to create the pipeline.
  • PRINCIPAL: The principal to which you are adding permissions.
  • ROLE: The Identity and Access Management role that you want to grant to the principal.

For more information, see Roles and permissions in the Artifact Registry documentation.

Create a repository in Artifact Registry

Next you'll create a repository in Artifact Registry for your pipeline templates.

Console

  1. Open Vertex AI Pipelines in the Google Cloud console.

    Go to Vertex AI Pipelines

  2. Click the Your templates tab.

  3. To open the Select repository pane, click Select repository.

  4. Click Create repository.

  5. Specify quickstart-kfp-repo as the repository name.

  6. Under Format, select Kubeflow Pipelines.

  7. Under Location Type, select Region.

  8. In the Region drop-down list, select us-central1.

    Note: The example commands and configuration in these instructions use us-central1, but you can use any of the supported regions.
  9. Click Create.

Google Cloud CLI

Run the following command to create a repository.

Before using the command, replace LOCATION_ID with the region where you want to create the repository, for example, us-central1.

Execute the gcloud artifacts repositories create command:

Note: Ensure you have initialized the Google Cloud CLI with authentication and a project by running either gcloud init, or gcloud auth login and gcloud config set project.

gcloud artifacts repositories create quickstart-kfp-repo \
    --location=LOCATION_ID \
    --repository-format=KFP

Create a template

Use the following code sample to define a pipeline with a single component. For information about how to define a pipeline using KFP, see Build a Pipeline.

from kfp import dsl
from kfp import compiler

@dsl.component()
def hello_world(text: str) -> str:
    print(text)
    return text

@dsl.pipeline(name='hello-world', description='A simple intro pipeline')
def pipeline_hello_world(text: str = 'hi there'):
    """Pipeline that passes small pipeline parameter string to consumer op."""
    consume_task = hello_world(text=text)  # Passing pipeline parameter as argument to consumer op

compiler.Compiler().compile(
    pipeline_func=pipeline_hello_world,
    package_path='hello_world_pipeline.yaml')

When you run the sample, the compiler.Compiler().compile(...) statement compiles the "hello-world" pipeline into the local YAML file named hello_world_pipeline.yaml.
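
If you want to confirm what will be used as the package name when you upload the template, you can inspect the compiled file. The following is a hedged sketch, assuming the KFP v2 IR layout (a top-level pipelineInfo.name field) and that PyYAML is installed; it is not a required step.

import yaml

# Load the compiled pipeline spec and print the pipeline name.
# The registry derives the package name from this field (see
# "Upload a pipeline template using the Artifact Registry REST API" below).
with open('hello_world_pipeline.yaml') as f:
    spec = yaml.safe_load(f)

print(spec['pipelineInfo']['name'])  # expected: hello-world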

Upload the template

Console

  1. Open Vertex AI Pipelines in the Google Cloud console.

    Go to Vertex AI Pipelines

  2. Click Upload to open the Upload pipeline or component pane.

  3. In the Repository drop-down list, select the quickstart-kfp-repo repository.

  4. Specify a Name for the pipeline template.

  5. In the File field, click Choose to select and upload the compiled pipeline template YAML from your local file system.

  6. After you upload the pipeline template, it is listed on the Your templates page.

    Go to Your templates

Kubeflow Pipelines SDK client

  1. To configure your Kubeflow Pipelines SDK registry client, run the following commands:

    from kfp.registry import RegistryClient

    client = RegistryClient(host=f"https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo")
  2. Upload the compiled YAML file to your repository in Artifact Registry.

    templateName, versionName = client.upload_pipeline(
        file_name="hello_world_pipeline.yaml",
        tags=["v1", "latest"],
        extra_headers={"description": "This is an example pipeline template."})
  3. To verify that the template was uploaded:

    1. Open Vertex AI Pipelines in the Google Cloud console.

      Go to Vertex AI Pipelines

    2. Click the Your templates tab.

    3. Click Select repository.

    4. From the list, select the quickstart-kfp-repo repository, and then click Select.

    5. You should find the uploaded template package hello-world in the list.

    6. To view the list of versions of the pipeline template, click the hello-world template.

    7. To view the pipeline topology, click a version.

Use the template in Vertex AI

After you've uploaded your pipeline template to your repository in Artifact Registry, it is ready to be used in Vertex AI Pipelines.

Create a staging bucket for your template

Before you can use your pipeline template, you'll need to create a Cloud Storage bucket for staging pipeline runs.

To create the bucket, follow the instructions in Configure a Cloud Storage bucket for pipeline artifacts and then run the following command:

STAGING_BUCKET="gs://BUCKET_NAME"

Replace BUCKET_NAME with the name of the bucket you just created.
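
If you'd rather create the staging bucket from Python instead of following the linked instructions, the following hedged sketch uses the google-cloud-storage client library (an additional dependency this guide doesn't otherwise require); the project ID and bucket name are placeholders.

from google.cloud import storage

# Create a regional bucket for staging pipeline runs, then build the
# gs:// URI used by aiplatform.init() later in this guide.
client = storage.Client(project="PROJECT_ID")
bucket = client.create_bucket("BUCKET_NAME", location="us-central1")
STAGING_BUCKET = f"gs://{bucket.name}"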

Create a pipeline run from your template

You can use the Vertex AI SDK for Python or the Google Cloud console to create a pipeline run from your template in Artifact Registry.

Console

  1. Open Vertex AI Pipelines in the Google Cloud console.

    Go to Vertex AI Pipelines

  2. Click the Your templates tab.

  3. To open the Select repository pane, click Select repository.

  4. Select the quickstart-kfp-repo repository, and then click Select.

  5. Click the hello-world package.

  6. Next to the 4f245e8f9605 version, click Create Run.

  7. Click Runtime Configuration.

  8. Enter the following under Cloud Storage location:

    gs://BUCKET_NAME

    Replace BUCKET_NAME with the name of the bucket you created for staging your pipeline runs.

  9. Click Submit.

Vertex AI SDK for Python

Use the following sample to create a pipeline run from your template:

from google.cloud import aiplatform

# Initialize the aiplatform package
aiplatform.init(
    project="PROJECT_ID",
    location='us-central1',
    staging_bucket=STAGING_BUCKET)

# Create a pipeline job using a version ID.
job = aiplatform.PipelineJob(
    display_name="hello-world-latest",
    template_path="https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/hello-world@SHA256_TAG" + \
        versionName)

# Alternatively, create a pipeline job using a tag.
job = aiplatform.PipelineJob(
    display_name="hello-world-latest",
    template_path="https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/hello-world/TAG")

job.submit()

Replace the following:

  • PROJECT_ID: The Google Cloud project that this pipeline runs in.

  • SHA256_TAG: The sha256 hash value of the template version.

  • TAG: The version tag of the template.
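
The sample above submits the template as-is. As a variation, the following hedged sketch (continuing from the initialized aiplatform client above) passes a value for the pipeline's text parameter and an explicit service account at submission time; parameter_values and service_account are standard PipelineJob arguments, but the service account email shown is a placeholder.

# Create a run that overrides the pipeline's 'text' input parameter.
job = aiplatform.PipelineJob(
    display_name="hello-world-with-params",
    template_path="https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/hello-world/latest",
    parameter_values={"text": "hello from a template"})

# Submitting with a specific service account; this account needs the
# roles/artifactregistry.reader role described in "Configuring permissions".
job.submit(service_account="SERVICE_ACCOUNT_EMAIL")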

View created pipeline runs

You can view the runs created from a specific pipeline template version by using the Google Cloud console or the Vertex AI SDK for Python.

Console

  1. Open Vertex AI Pipelines in the Google Cloud console.

    Go to Vertex AI Pipelines

  2. Click the Your templates tab.

  3. Click Select repository.

  4. From the list, select the quickstart-kfp-repo repository, and then click Select.

  5. To view the list of versions for the hello-world pipeline template, click the hello-world template.

  6. Click the version for which you want to view pipeline runs.

  7. To view pipeline runs for the selected version, click View Runs and then click the Runs tab.

Vertex AI SDK for Python

To list the pipeline runs, run the pipelineJobs.list command as shown in the following examples:

from google.cloud import aiplatform

# To filter all runs created from a specific version
filter = 'template_uri:"https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/hello-world/*" AND ' + \
         'template_metadata.version="%s"' % versionName
aiplatform.PipelineJob.list(filter=filter)

# To filter all runs created from a specific version tag
filter = 'template_uri="https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/hello-world/latest"'
aiplatform.PipelineJob.list(filter=filter)

# To filter all runs created from a package
filter = 'template_uri:"https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/hello-world/*"'
aiplatform.PipelineJob.list(filter=filter)

# To filter all runs created from a repo
filter = 'template_uri:"https://us-central1-kfp.pkg.dev/PROJECT_ID/quickstart-kfp-repo/*"'
aiplatform.PipelineJob.list(filter=filter)
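
PipelineJob.list returns PipelineJob objects, so you can inspect the matching runs directly. A short follow-up sketch, continuing from the code above:

# Print the display name and state of each run that matched the last filter.
for run in aiplatform.PipelineJob.list(filter=filter, order_by="create_time desc"):
    print(run.display_name, run.state)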

Use the template in a Kubeflow Pipelines SDK registry client

You can use a Kubeflow Pipelines SDK registry client together with Artifact Registry to download and use your pipeline template.

  • To list the resources in the repository, run the following commands:

    templatePackages = client.list_packages()
    templatePackage = client.get_package(package_name="hello-world")

    versions = client.list_versions(package_name="hello-world")
    version = client.get_version(package_name="hello-world", version=versionName)

    tags = client.list_tags(package_name="hello-world")
    tag = client.get_tag(package_name="hello-world", tag="latest")

    For the complete list of available methods and documentation, see the proto files in the Artifact Registry GitHub repo.

  • To download the template to your local file system, run the following commands:

    # Sample 1
    filename = client.download_pipeline(
        package_name="hello-world",
        version=versionName)

    # Sample 2
    filename = client.download_pipeline(
        package_name="hello-world",
        tag="v1")

    # Sample 3
    filename = client.download_pipeline(
        package_name="hello-world",
        tag="v1",
        file_name="hello-world-template.yaml")
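
The downloaded file is an ordinary compiled pipeline YAML, so you can also pass it to Vertex AI Pipelines as a local template. A minimal sketch, assuming aiplatform.init() has been called as shown earlier and filename comes from one of the download calls above:

from google.cloud import aiplatform

# Create and submit a run from the locally downloaded template file.
job = aiplatform.PipelineJob(
    display_name="hello-world-from-download",
    template_path=filename)
job.submit()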

Use the Artifact Registry REST API

The following sections summarize how to use the Artifact Registry REST API to manage your pipeline templates in your Artifact Registry repository.

Upload a pipeline template using the Artifact Registry REST API

You can upload a pipeline template by creating an HTTP request using the parameter values described in this section, where:

  • PROJECT_ID is the Google Cloud project that this pipeline runs in.
  • REPO_ID is the ID of your Artifact Registry repository.

Example curl request

curl -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -F tags=v1,latest \
    -F content=@pipeline_spec.yaml \
    https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID

Constructing the upload request

The request is an HTTP or HTTPS multipart request. It must include the authentication token in the request header. For more information, see gcloud auth print-access-token.

The payload of the request is the contents of the pipeline_spec.yaml file (or .zip package). The recommended size limit is 10 MiB.

The package name is taken from the pipeline_spec.pipeline_info.name entry in the pipeline_spec.yaml file. The package name uniquely identifies the package and is immutable across versions. It can be between 4 and 128 characters long and must match the following regular expression: ^[a-z0-9][a-z0-9-]{3,127}$.

The package tags are a list of up to eight comma-separated tags. Each tag must match the following regular expression: ^[a-zA-Z0-9\-._~:@+]{1,128}$.
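
Before uploading, you can check a candidate package name and tag against these patterns locally. A small sketch using the regular expressions quoted above:

import re

# Patterns copied from the constraints described in this section.
PACKAGE_NAME_RE = re.compile(r"^[a-z0-9][a-z0-9-]{3,127}$")
TAG_RE = re.compile(r"^[a-zA-Z0-9\-._~:@+]{1,128}$")

assert PACKAGE_NAME_RE.match("hello-world")
assert TAG_RE.match("latest")
assert not PACKAGE_NAME_RE.match("ab")  # too short: fewer than 4 characters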

If a tag exists and points to a pipeline that's already been uploaded, the tag is updated to point to the pipeline that you're uploading. For example, if the latest tag points to a pipeline you've already uploaded, and you upload a new version with --tag=latest, the latest tag is removed from the previously uploaded pipeline and assigned to the new pipeline you're uploading.

If the pipeline you're uploading is identical to a pipeline you've previously uploaded, the upload succeeds. The uploaded pipeline's metadata, including its version tags, is updated to match the parameter values of your upload request.
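
If you prefer to issue the same multipart upload from Python rather than curl, the following hedged sketch uses the requests library and Application Default Credentials via google-auth (both extra dependencies); PROJECT_ID and REPO_ID are placeholders, as in the curl example above.

import google.auth
import google.auth.transport.requests
import requests

# Obtain an access token equivalent to `gcloud auth print-access-token`.
credentials, _ = google.auth.default()
credentials.refresh(google.auth.transport.requests.Request())

# Multipart form upload: the "tags" field and the pipeline spec file.
response = requests.post(
    "https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID",
    headers={"Authorization": f"Bearer {credentials.token}"},
    data={"tags": "v1,latest"},
    files={"content": open("pipeline_spec.yaml", "rb")})

print(response.status_code, response.text)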

Upload response

If the upload request succeeds, it returns an HTTP OK status. The body of the response is as follows:

{packageName}/{versionName=sha256:abcdef123456...}

where versionName is the sha256 digest of pipeline_spec.yaml formatted as a hex string.
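
Because the response body is a single packageName/versionName string, splitting on the first / gives you both identifiers. A trivial sketch with an example-shaped value (not a real digest):

# Example shape only; the digest below is not a real value.
body = "hello-world/sha256:abcdef123456"
package_name, version_name = body.split("/", 1)
print(package_name)   # hello-world
print(version_name)   # sha256:abcdef123456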

Download a pipeline template using the Artifact Registry REST API

You can download a pipeline template by creating an HTTP request using the parameter values described in this section, where:

  • PROJECT_ID is the Google Cloud project that this pipeline runs in.
  • REPO_ID is the ID of your Artifact Registry repository.
  • PACKAGE_ID is the package ID of your uploaded template.
  • TAG is the version tag.
  • VERSION is the template version in the format of sha256:abcdef123456....

For a standard Artifact Registry download, form the download link as follows:

url = https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID/PACKAGE_ID/VERSION
url = https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID/PACKAGE_ID/TAG

Example curl requests

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID/PACKAGE_ID/VERSION

You can replace VERSION with TAG and download the same template, as shown in the following example:

curl -X GET \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID/PACKAGE_ID/TAG

Download response

If the download request succeeds, it returns an HTTP OK status. The body of the response is the contents of the pipeline_spec.yaml file.
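
A Python counterpart to the curl download, again as a hedged sketch using requests and google-auth; PROJECT_ID, REPO_ID, PACKAGE_ID, and TAG are placeholders, as elsewhere in this section.

import google.auth
import google.auth.transport.requests
import requests

credentials, _ = google.auth.default()
credentials.refresh(google.auth.transport.requests.Request())

# Fetch the template by tag and save the returned pipeline spec locally.
response = requests.get(
    "https://us-central1-kfp.pkg.dev/PROJECT_ID/REPO_ID/PACKAGE_ID/TAG",
    headers={"Authorization": f"Bearer {credentials.token}"})
response.raise_for_status()

with open("downloaded_pipeline_spec.yaml", "w") as f:
    f.write(response.text)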
