REST Resource: projects.locations.pipelineJobs
Resource: PipelineJob
An instance of a machine learning PipelineJob.
`name` (string): Output only. The resource name of the PipelineJob.
`displayName` (string): The display name of the Pipeline. The name can be up to 128 characters long and can consist of any UTF-8 characters.
`createTime` (string, Timestamp format): Output only. Pipeline creation time.
Uses RFC 3339, where generated output is always Z-normalized and uses 0, 3, 6, or 9 fractional digits. Offsets other than "Z" are also accepted. Examples: "2014-10-02T15:01:23Z", "2014-10-02T15:01:23.045123456Z", or "2014-10-02T15:01:23+05:30".
`startTime` (string, Timestamp format): Output only. Pipeline start time.
Uses the same RFC 3339 timestamp format as createTime.
`endTime` (string, Timestamp format): Output only. Pipeline end time.
Uses the same RFC 3339 timestamp format as createTime.
`updateTime` (string, Timestamp format): Output only. Timestamp when this PipelineJob was most recently updated.
Uses the same RFC 3339 timestamp format as createTime.
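The timestamp fields above share the RFC 3339 wire format. As a minimal sketch of consuming them in Python (standard library only; note that `datetime` cannot hold the full 9-digit nanosecond precision, so the fraction is truncated to microseconds):

```python
import re
from datetime import datetime, timedelta, timezone

def parse_rfc3339(ts: str) -> datetime:
    """Parse an RFC 3339 timestamp such as createTime or startTime."""
    # fromisoformat() on older Python versions rejects a trailing "Z",
    # so normalize it to an explicit UTC offset first.
    ts = ts.replace("Z", "+00:00")
    # datetime only resolves microseconds; drop digits beyond six.
    ts = re.sub(r"\.(\d{6})\d+", r".\1", ts)
    return datetime.fromisoformat(ts)

# The three example forms from the field description all parse:
t_utc = parse_rfc3339("2014-10-02T15:01:23Z")
t_nano = parse_rfc3339("2014-10-02T15:01:23.045123456Z")
t_offset = parse_rfc3339("2014-10-02T15:01:23+05:30")
```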
`pipelineSpec` (object, Struct format): A compiled definition of a pipeline, represented as a JSON object. Defines the structure of the pipeline, including its components, tasks, and parameters. This specification is generated by compiling a pipeline function defined in Python using the Kubeflow Pipelines SDK.
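For orientation, the sketch below shows a rough, trimmed-down top-level shape of such a compiled spec as a Python dict. The field names reflect the Kubeflow Pipelines IR schema, but the pipeline, component, and image names are hypothetical, and real compiled specs contain many more fields:

```python
# Hypothetical, heavily trimmed shape of a compiled pipelineSpec.
# All concrete names below ("demo-pipeline", "comp-train", the image)
# are illustrative, not taken from a real pipeline.
pipeline_spec = {
    "schemaVersion": "2.1.0",
    "pipelineInfo": {"name": "demo-pipeline"},
    "root": {
        "dag": {"tasks": {"train": {"componentRef": {"name": "comp-train"}}}},
    },
    "components": {"comp-train": {"executorLabel": "exec-train"}},
    "deploymentSpec": {
        "executors": {"exec-train": {"container": {"image": "gcr.io/demo/train"}}}
    },
}

# The DAG structure can be walked, e.g. to list task names:
task_names = sorted(pipeline_spec["root"]["dag"]["tasks"])
```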
`state` (enum (PipelineState)): Output only. The detailed state of the job.
`jobDetail` (object (PipelineJobDetail)): Output only. The details of the pipeline run. Not available in the list view.
`error` (object (Status)): Output only. The error that occurred during pipeline execution. Only populated when the pipeline's state is FAILED or CANCELLED.
`labels` (map (key: string, value: string)): The labels with user-defined metadata to organize PipelineJob.
Label keys and values can be no longer than 64 characters (Unicode code points) and can only contain lowercase letters, numeric characters, underscores, and dashes. International characters are allowed.
See https://goo.gl/xmQnxf for more information and examples of labels.
Note that some label keys are reserved for Vertex AI Pipelines: any user-set value for `vertex-ai-pipelines-run-billing-id` will be overridden.
`runtimeConfig` (object (RuntimeConfig)): Runtime config of the pipeline.
`encryptionSpec` (object (EncryptionSpec)): Customer-managed encryption key spec for a PipelineJob. If set, this PipelineJob and all of its sub-resources will be secured by this key.
`serviceAccount` (string): The service account that the pipeline workload runs as. If not specified, the Compute Engine default service account in the project is used. See https://cloud.google.com/compute/docs/access/service-accounts#default_service_account
Users starting the pipeline must have the iam.serviceAccounts.actAs permission on this service account.
`network` (string): The full name of the Compute Engine network to which the Pipeline Job's workload should be peered. For example, projects/12345/global/networks/myVPC. The format is projects/{project}/global/networks/{network}, where {project} is a project number, as in 12345, and {network} is a network name.
Private services access must already be configured for the network. The pipeline job applies the network configuration to the Google Cloud resources being launched, where applicable, such as Vertex AI Training or Dataflow jobs. If left unspecified, the workload is not peered with any network.
`reservedIpRanges[]` (string): A list of names of reserved IP ranges under the VPC network that can be used for this Pipeline Job's workload.
If set, the Pipeline Job's workload is deployed within the provided IP ranges. Otherwise, the job is deployed to any IP ranges under the provided VPC network.
Example: ['vertex-ai-ip-range'].
`pscInterfaceConfig` (object (PscInterfaceConfig)): Optional. Configuration for PSC-I for PipelineJob.
`templateUri` (string): A template URI from which PipelineJob.pipeline_spec, if empty, will be downloaded. Currently, only URIs from the Vertex Template Registry & Gallery are supported. See https://cloud.google.com/vertex-ai/docs/pipelines/create-pipeline-template.
`templateMetadata` (object (PipelineTemplateMetadata)): Output only. Pipeline template metadata. Fields are populated if PipelineJob.template_uri is from a supported template registry.
`scheduleName` (string): Output only. The schedule resource name. Only returned if the Pipeline is created by the Schedule API.
`preflightValidations` (boolean): Optional. Whether to do component-level validations before job creation.
JSON representation

```json
{
  "name": string,
  "displayName": string,
  "createTime": string,
  "startTime": string,
  "endTime": string,
  "updateTime": string,
  "pipelineSpec": { object },
  "state": enum (PipelineState),
  "jobDetail": { object (PipelineJobDetail) },
  "error": { object (Status) },
  "labels": { string: string, ... },
  "runtimeConfig": { object (RuntimeConfig) },
  "encryptionSpec": { object (EncryptionSpec) },
  "serviceAccount": string,
  "network": string,
  "reservedIpRanges": [ string ],
  "pscInterfaceConfig": { object (PscInterfaceConfig) },
  "templateUri": string,
  "templateMetadata": { object (PipelineTemplateMetadata) },
  "scheduleName": string,
  "preflightValidations": boolean
}
```
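As a hedged illustration of how these fields fit together, a minimal request body for creating a PipelineJob might look like the fragment below. All values (display name, bucket, parameter, service account, label) are hypothetical placeholders, and a real request must also supply either pipelineSpec or templateUri:

```json
{
  "displayName": "demo-run",
  "runtimeConfig": {
    "gcsOutputDirectory": "gs://example-bucket/pipeline-output",
    "parameterValues": { "learning_rate": 0.01 }
  },
  "serviceAccount": "pipelines-sa@example-project.iam.gserviceaccount.com",
  "labels": { "team": "demo" }
}
```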
PipelineJobDetail
The runtime detail of a PipelineJob.
`pipelineContext` (object (Context)): Output only. The context of the pipeline.
`pipelineRunContext` (object (Context)): Output only. The context of the current pipeline run.
`taskDetails[]` (object (PipelineTaskDetail)): Output only. The runtime details of the tasks under the pipeline.
JSON representation

```json
{
  "pipelineContext": { object (Context) },
  "pipelineRunContext": { object (Context) },
  "taskDetails": [ { object (PipelineTaskDetail) } ]
}
```
PipelineTaskDetail
The runtime detail of a task execution.
`taskId` (string, int64 format): Output only. The system-generated ID of the task.
`parentTaskId` (string, int64 format): Output only. The ID of the parent task if the task is within a component scope. Empty if the task is at the root level.
`taskName` (string): Output only. The user-specified name of the task, as defined in pipelineSpec.
`createTime` (string, Timestamp format): Output only. Task creation time.
Uses the same RFC 3339 timestamp format as PipelineJob.createTime.
`startTime` (string, Timestamp format): Output only. Task start time.
Uses the same RFC 3339 timestamp format as PipelineJob.createTime.
`endTime` (string, Timestamp format): Output only. Task end time.
Uses the same RFC 3339 timestamp format as PipelineJob.createTime.
`executorDetail` (object (PipelineTaskExecutorDetail)): Output only. The detailed execution info.
`state` (enum (State)): Output only. State of the task.
`execution` (object (Execution)): Output only. The execution metadata of the task.
`error` (object (Status)): Output only. The error that occurred during task execution. Only populated when the task's state is FAILED or CANCELLED.
`pipelineTaskStatus[]` (object (PipelineTaskStatus)): Output only. A list of task statuses. This field keeps a record of the task status as it evolves over time.
`inputs` (map (key: string, value: object (ArtifactList))): Output only. The runtime input artifacts of the task.
`outputs` (map (key: string, value: object (ArtifactList))): Output only. The runtime output artifacts of the task.
`taskUniqueName` (string): Output only. The unique name of a task. This field is used when rerunning a pipeline job; the Console UI and the Vertex AI SDK support triggering pipeline job reruns. The name is constructed by concatenating the names of all parent tasks with the task name. For example, if a task named "child_task" has a parent task named "parent_task_1", and "parent_task_1" has a parent task named "parent_task_2", the task unique name is "parent_task_2.parent_task_1.child_task".
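Assuming the dotted-name convention described above, the construction can be sketched as:

```python
def task_unique_name(ancestor_names, task_name):
    """Build a taskUniqueName by joining ancestor task names (outermost
    first) with the task's own name, using "." as the separator."""
    return ".".join([*ancestor_names, task_name])

# The example from the field description:
name = task_unique_name(["parent_task_2", "parent_task_1"], "child_task")
# name == "parent_task_2.parent_task_1.child_task"
```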
JSON representation

```json
{
  "taskId": string,
  "parentTaskId": string,
  "taskName": string,
  "createTime": string,
  "startTime": string,
  "endTime": string,
  "executorDetail": { object (PipelineTaskExecutorDetail) },
  "state": enum (State),
  "execution": { object (Execution) },
  "error": { object (Status) },
  "pipelineTaskStatus": [ { object (PipelineTaskStatus) } ],
  "inputs": { string: { object (ArtifactList) }, ... },
  "outputs": { string: { object (ArtifactList) }, ... },
  "taskUniqueName": string
}
```
PipelineTaskExecutorDetail
The runtime detail of a pipeline executor.
details (Union type): details can be only one of the following:
`containerDetail` (object (ContainerDetail)): Output only. The detailed info for a container executor.
`customJobDetail` (object (CustomJobDetail)): Output only. The detailed info for a custom job executor.
JSON representation

```json
{
  // Union field details can be only one of the following:
  "containerDetail": { object (ContainerDetail) },
  "customJobDetail": { object (CustomJobDetail) }
  // End of list of possible types for union field details.
}
```
ContainerDetail
The detail of a container execution. It contains the job names of the lifecycle of a container execution.
`mainJob` (string): Output only. The name of the CustomJob for the main container execution.
`preCachingCheckJob` (string): Output only. The name of the CustomJob for the pre-caching-check container execution. This job is available if PipelineJob.pipeline_spec specifies the pre_caching_check hook in the lifecycle events.
`failedMainJobs[]` (string): Output only. The names of the previously failed CustomJob runs for the main container executions. The list includes all attempts in chronological order.
`failedPreCachingCheckJobs[]` (string): Output only. The names of the previously failed CustomJob runs for the pre-caching-check container executions. These jobs are available if PipelineJob.pipeline_spec specifies the pre_caching_check hook in the lifecycle events. The list includes all attempts in chronological order.
JSON representation

```json
{
  "mainJob": string,
  "preCachingCheckJob": string,
  "failedMainJobs": [ string ],
  "failedPreCachingCheckJobs": [ string ]
}
```
CustomJobDetail
The detailed info for a custom job executor.
`job` (string): Output only. The name of the CustomJob.
`failedJobs[]` (string): Output only. The names of the previously failed CustomJob runs. The list includes all attempts in chronological order.
JSON representation

```json
{
  "job": string,
  "failedJobs": [ string ]
}
```
State
Specifies the state of a task execution.
| Enums | |
|---|---|
| STATE_UNSPECIFIED | Unspecified. |
| PENDING | Specifies pending state for the task. |
| RUNNING | Specifies task is being executed. |
| SUCCEEDED | Specifies task completed successfully. |
| CANCEL_PENDING | Specifies task cancellation is pending. |
| CANCELLING | Specifies task is being cancelled. |
| CANCELLED | Specifies task was cancelled. |
| FAILED | Specifies task failed. |
| SKIPPED | Specifies task was skipped due to cache hit. |
| NOT_TRIGGERED | Specifies that the task was not triggered because the task's trigger policy was not satisfied. The trigger policy is specified in the condition field of PipelineJob.pipeline_spec. |
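When polling taskDetails, it helps to know whether a task has reached a final state. The grouping below is an assumption inferred from the enum descriptions above, not an official classification:

```python
# Assumed classification: states from which a task will not transition
# further, inferred from the enum descriptions (not an official list).
TERMINAL_TASK_STATES = frozenset(
    {"SUCCEEDED", "CANCELLED", "FAILED", "SKIPPED", "NOT_TRIGGERED"}
)

def is_terminal(state: str) -> bool:
    """Return True if the task state is assumed final (no further updates)."""
    return state in TERMINAL_TASK_STATES
```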
PipelineTaskStatus
A single record of the task status.
`updateTime` (string, Timestamp format): Output only. Update time of this status.
Uses the same RFC 3339 timestamp format as PipelineJob.createTime.
`state` (enum (State)): Output only. The state of the task.
`error` (object (Status)): Output only. The error that occurred during the state. May be set when the state is one of the non-final states (PENDING/RUNNING/CANCELLING) or the FAILED state. If the state is FAILED, the error here is final and will not be retried. If the state is a non-final state, the error indicates a system error that is being retried.
ArtifactList
A list of artifact metadata.
`artifacts[]` (object (Artifact)): Output only. A list of artifact metadata.
JSON representation

```json
{
  "artifacts": [ { object (Artifact) } ]
}
```
RuntimeConfig
The runtime config of a PipelineJob.
`parameters` (deprecated) (map (key: string, value: object (Value))): Deprecated. Use RuntimeConfig.parameter_values instead. The runtime parameters of the PipelineJob. The parameters are passed into PipelineJob.pipeline_spec to replace the placeholders at runtime. This field is used by pipelines built using PipelineJob.pipeline_spec.schema_version 2.0.0 or lower, such as pipelines built using Kubeflow Pipelines SDK 1.8 or lower.
`gcsOutputDirectory` (string): Required. A path in a Cloud Storage bucket, which is treated as the root output directory of the pipeline. It is used by the system to generate the paths of output artifacts. The artifact paths are generated with a sub-path pattern {job_id}/{taskId}/{outputKey} under the specified output directory. The service account specified for this pipeline must have the storage.objects.get and storage.objects.create permissions for this bucket.
`parameterValues` (map (key: string, value: value (Value format))): The runtime parameters of the PipelineJob. The parameters are passed into PipelineJob.pipeline_spec to replace the placeholders at runtime. This field is used by pipelines built using PipelineJob.pipeline_spec.schema_version 2.1.0, such as pipelines built using Kubeflow Pipelines SDK 1.9 or higher and the v2 DSL.
`failurePolicy` (enum (PipelineFailurePolicy)): Represents the failure policy of a pipeline. Currently, the default behavior is that the pipeline continues to run until no more tasks can be executed, also known as PIPELINE_FAILURE_POLICY_FAIL_SLOW. If a pipeline is set to PIPELINE_FAILURE_POLICY_FAIL_FAST, it stops scheduling any new tasks when a task has failed; tasks that are already scheduled continue to completion.
`inputArtifacts` (map (key: string, value: object (InputArtifact))): The runtime artifacts of the PipelineJob. The key is the input artifact name and the value is one of the InputArtifact.
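Given the documented sub-path pattern {job_id}/{taskId}/{outputKey}, an output artifact's location under gcsOutputDirectory can be sketched as follows (the bucket name and IDs are hypothetical):

```python
def artifact_output_path(gcs_output_directory: str, job_id: str,
                         task_id: str, output_key: str) -> str:
    """Compose the documented {job_id}/{taskId}/{outputKey} sub-path
    under the pipeline's root output directory."""
    root = gcs_output_directory.rstrip("/")
    return f"{root}/{job_id}/{task_id}/{output_key}"

# Hypothetical values for illustration:
path = artifact_output_path(
    "gs://example-bucket/pipeline-output", "job-123", "456", "model"
)
# path == "gs://example-bucket/pipeline-output/job-123/456/model"
```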
JSON representation

```json
{
  "parameters": { string: { object (Value) }, ... },
  "gcsOutputDirectory": string,
  "parameterValues": { string: value, ... },
  "failurePolicy": enum (PipelineFailurePolicy),
  "inputArtifacts": { string: { object (InputArtifact) }, ... }
}
```
Value
Value is the value of the field.
value (Union type): value can be only one of the following:
`intValue` (string, int64 format): An integer value.
`doubleValue` (number): A double value.
`stringValue` (string): A string value.
JSON representation

```json
{
  // Union field value can be only one of the following:
  "intValue": string,
  "doubleValue": number,
  "stringValue": string
  // End of list of possible types for union field value.
}
```
PipelineFailurePolicy
Represents the failure policy of a pipeline. Currently, the default behavior is that the pipeline continues to run until no more tasks can be executed (PIPELINE_FAILURE_POLICY_FAIL_SLOW). If a pipeline is set to PIPELINE_FAILURE_POLICY_FAIL_FAST, it stops scheduling any new tasks when a task has failed; tasks that are already scheduled continue to completion.
| Enums | |
|---|---|
| PIPELINE_FAILURE_POLICY_UNSPECIFIED | Default value; follows fail-slow behavior. |
| PIPELINE_FAILURE_POLICY_FAIL_SLOW | Indicates that the pipeline should continue to run until all possible tasks have been scheduled and completed. |
| PIPELINE_FAILURE_POLICY_FAIL_FAST | Indicates that the pipeline should stop scheduling new tasks after a task has failed. |
InputArtifact
The type of an input artifact.
kind (Union type): kind can be only one of the following:
`artifactId` (string): Artifact resource ID from MLMD, which is the last portion of an artifact resource name: projects/{project}/locations/{location}/metadataStores/default/artifacts/{artifactId}. The artifact must stay within the same project, location, and default metadata store as the pipeline.
JSON representation

```json
{
  // Union field kind can be only one of the following:
  "artifactId": string
  // End of list of possible types for union field kind.
}
```
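Since artifactId is defined as the last portion of the MLMD artifact resource name, it can be extracted as sketched below (the resource name used here is a hypothetical example):

```python
def artifact_id_from_resource_name(resource_name: str) -> str:
    """Return the {artifactId} segment of a name of the form
    projects/{project}/locations/{location}/metadataStores/default/artifacts/{artifactId}.
    """
    prefix, sep, artifact_id = resource_name.rpartition("/artifacts/")
    if not sep or not artifact_id:
        raise ValueError(f"not an artifact resource name: {resource_name!r}")
    return artifact_id

# Hypothetical resource name:
aid = artifact_id_from_resource_name(
    "projects/12345/locations/us-central1/metadataStores/default/artifacts/987"
)
# aid == "987"
```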
PipelineTemplateMetadata
Pipeline template metadata if PipelineJob.template_uri is from a supported template registry. Currently, the only supported registry is Artifact Registry.
`version` (string): The versionName in Artifact Registry.
Always present in the output if PipelineJob.template_uri is from a supported template registry.
Format is "sha256:abcdef123456...".
JSON representation

```json
{
  "version": string
}
```
| Methods | |
|---|---|
| batchCancel | Batch cancel PipelineJobs. |
| batchDelete | Batch deletes PipelineJobs. The Operation is atomic. |
| cancel | Cancels a PipelineJob. |
| create | Creates a PipelineJob. |
| delete | Deletes a PipelineJob. |
| get | Gets a PipelineJob. |
| list | Lists PipelineJobs in a Location. |
Last updated 2026-01-16 UTC.