gcloud dataflow flex-template run
- NAME
- gcloud dataflow flex-template run - runs a job from the specified path
- SYNOPSIS
gcloud dataflow flex-template run JOB_NAME --template-file-gcs-location=TEMPLATE_FILE_GCS_LOCATION [--additional-experiments=[ADDITIONAL_EXPERIMENTS,…]] [--additional-pipeline-options=[ADDITIONAL_PIPELINE_OPTIONS,…]] [--additional-user-labels=[ADDITIONAL_USER_LABELS,…]] [--dataflow-kms-key=DATAFLOW_KMS_KEY] [--disable-public-ips] [--enable-streaming-engine] [--flexrs-goal=FLEXRS_GOAL] [--launcher-machine-type=LAUNCHER_MACHINE_TYPE] [--max-workers=MAX_WORKERS] [--network=NETWORK] [--num-workers=NUM_WORKERS] [--parameters=[PARAMETERS,…]] [--region=REGION_ID] [--service-account-email=SERVICE_ACCOUNT_EMAIL] [--staging-location=STAGING_LOCATION] [--subnetwork=SUBNETWORK] [--temp-location=TEMP_LOCATION] [--worker-machine-type=WORKER_MACHINE_TYPE] [[--[no-]update : --transform-name-mappings=[TRANSFORM_NAME_MAPPINGS,…]]] [--worker-region=WORKER_REGION | --worker-zone=WORKER_ZONE] [GCLOUD_WIDE_FLAG …]
- DESCRIPTION
- Runs a job from the specified flex template Google Cloud Storage path.
- EXAMPLES
- To run a job from the flex template, run:
gcloud dataflow flex-template run my-job --template-file-gcs-location=gs://flex-template-path --region=europe-west1 --parameters=input="gs://input",output="gs://output-path" --max-workers=5
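- To run a job with a user-managed worker service account, a custom subnetwork, and public IP addresses disabled, a command along these lines can be used (the service account, subnetwork, project, and bucket paths shown are placeholders):
gcloud dataflow flex-template run my-private-job --template-file-gcs-location=gs://flex-template-path --region=europe-west1 --service-account-email=my-worker-sa@my-project.iam.gserviceaccount.com --subnetwork=regions/europe-west1/subnetworks/my-subnet --disable-public-ips --parameters=input="gs://input",output="gs://output-path"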
- POSITIONAL ARGUMENTS
JOB_NAME - Unique name to assign to the job.
- REQUIRED FLAGS
--template-file-gcs-location=TEMPLATE_FILE_GCS_LOCATION - Google Cloud Storage location of the flex template to run. (Must be a URL beginning with 'gs://'.)
- OPTIONAL FLAGS
--additional-experiments=[ADDITIONAL_EXPERIMENTS,…] - Additional experiments to pass to the job. Example: --additional-experiments=experiment1,experiment2=value2
--additional-pipeline-options=[ADDITIONAL_PIPELINE_OPTIONS,…] - Additional pipeline options to pass to the job. Example: --additional-pipeline-options=option1=value1,option2=value2
--additional-user-labels=[ADDITIONAL_USER_LABELS,…] - Additional user labels to pass to the job. Example: --additional-user-labels='key1=value1,key2=value2'
--dataflow-kms-key=DATAFLOW_KMS_KEY - Cloud KMS key to protect the job resources.
--disable-public-ips - Cloud Dataflow workers must not use public IP addresses. Overrides the default dataflow/disable_public_ips property value for this command invocation.
--enable-streaming-engine - Enable Streaming Engine for the streaming job. Overrides the default dataflow/enable_streaming_engine property value for this command invocation.
--flexrs-goal=FLEXRS_GOAL - FlexRS goal for the flex template job. FLEXRS_GOAL must be one of: COST_OPTIMIZED, SPEED_OPTIMIZED.
--launcher-machine-type=LAUNCHER_MACHINE_TYPE - The machine type to use for launching the job. The default is n1-standard-1.
--max-workers=MAX_WORKERS - Maximum number of workers to run.
--network=NETWORK - Compute Engine network for launching instances to run your pipeline.
--num-workers=NUM_WORKERS - Initial number of workers to use.
--parameters=[PARAMETERS,…] - Parameters to pass to the job.
--region=REGION_ID - Region ID of the job's regional endpoint. Defaults to 'us-central1'.
--service-account-email=SERVICE_ACCOUNT_EMAIL - Service account to run the workers as.
--staging-location=STAGING_LOCATION - Default Google Cloud Storage location to stage local files. (Must be a URL beginning with 'gs://'.)
--subnetwork=SUBNETWORK - Compute Engine subnetwork for launching instances to run your pipeline.
--temp-location=TEMP_LOCATION - Default Google Cloud Storage location to stage temporary files. If not set, defaults to the value for --staging-location. (Must be a URL beginning with 'gs://'.)
--worker-machine-type=WORKER_MACHINE_TYPE - Type of machine to use for workers. Defaults to server-specified.
--[no-]update - Set this to true for streaming update jobs. Use --update to enable and --no-update to disable.
--transform-name-mappings=[TRANSFORM_NAME_MAPPINGS,…] - Transform name mappings for the streaming update job. See the example after this flag list.
- At most one of these can be specified:
--worker-region=WORKER_REGION - Region to run the workers in.
--worker-zone=WORKER_ZONE - Zone to run the workers in.
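- For example, a streaming update of a running job could look like the following sketch. It assumes the mapping takes oldName=newName pairs; the transform names and template path are placeholders, and JOB_NAME would normally match the running job being updated:
gcloud dataflow flex-template run my-job --template-file-gcs-location=gs://flex-template-path --region=europe-west1 --update --transform-name-mappings=oldTransform=newTransform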
- GCLOUD WIDE FLAGS
- These flags are available to all commands:
--access-token-file, --account, --billing-project, --configuration, --flags-file, --flatten, --format, --help, --impersonate-service-account, --log-http, --project, --quiet, --trace-token, --user-output-enabled, --verbosity. Run $ gcloud help for details.
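- The gcloud-wide --flags-file flag can keep long invocations readable by moving flag values into a YAML file. A minimal sketch, assuming a file named flags.yaml whose keys are flag names from this page:
--template-file-gcs-location: gs://flex-template-path
--region: europe-west1
--max-workers: 5
- The job can then be launched with:
gcloud dataflow flex-template run my-job --flags-file=flags.yaml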
- NOTES
- This variant is also available:
gcloud beta dataflow flex-template run