Dataflow support for TPUs

Note: The Dataflow TPU offering is generally available with an allowlist. To get access to this feature, reach out to your account team.

Google Cloud Platform TPUs are custom-designed AI accelerators created by Google that are optimized for training and using large AI models. They are designed to scale cost-efficiently for a wide range of AI workloads and provide versatility to accelerate inference workloads on AI frameworks, including PyTorch, JAX, and TensorFlow. For more details about TPUs, see Introduction to Google Cloud Platform TPU.

Prerequisites for using TPUs in Dataflow

  • Your Google Cloud project must be approved to use this GA offering.

Limitations

This offering is subject to the following limitations:

  • Only single-host TPU accelerators are supported: The Dataflow TPU offering supports only single-host TPU configurations, where each Dataflow worker manages one or more TPU devices that are not interconnected with TPUs managed by other workers.
  • Only homogeneous TPU worker pools are supported: Features like Dataflow right fitting and Dataflow Prime don't support TPU workloads.

Pricing

Dataflow jobs that use TPUs are billed for worker TPU chip-hours consumed and are not billed for worker CPU and memory. For more information, see the Dataflow pricing page.

Availability

The following TPU accelerators and processing regions are available.

Supported TPU accelerators

The supported TPU accelerator combinations are identified by the tuple (TPU type, TPU topology).

  • TPU type refers to the model of the TPU device.
  • TPU topology refers to the number and physical arrangement of the TPU chips in a slice.

To configure the type and topology of TPUs for Dataflow workers, use the worker_accelerator pipeline option, formatted as type:TPU_TYPE;topology:TPU_TOPOLOGY.
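As an illustration, the option value can be assembled with a small helper. The function name below is an assumption for this sketch, not part of the Dataflow SDK:

```python
def tpu_worker_accelerator(tpu_type: str, topology: str) -> str:
    """Build a worker_accelerator option value in the documented
    type:TPU_TYPE;topology:TPU_TOPOLOGY format.

    Helper name is illustrative, not a Dataflow API.
    """
    return f"type:{tpu_type};topology:{topology}"

# → "type:tpu-v5-lite-podslice;topology:2x2"
print(tpu_worker_accelerator("tpu-v5-lite-podslice", "2x2"))
```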

The following TPU configurations are supported with Dataflow:

TPU type              | Topology | Required worker_machine_type
tpu-v5-lite-podslice  | 1x1      | ct5lp-hightpu-1t
tpu-v5-lite-podslice  | 2x2      | ct5lp-hightpu-4t
tpu-v5-lite-podslice  | 2x4      | ct5lp-hightpu-8t
tpu-v6e-slice         | 1x1      | ct6e-standard-1t
tpu-v6e-slice         | 2x2      | ct6e-standard-4t
tpu-v6e-slice         | 2x4      | ct6e-standard-8t
tpu-v5p-slice         | 2x2x1    | ct5p-hightpu-4t
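For example, a pipeline targeting the first table row could be launched with arguments along these lines. This is a sketch: PROJECT_ID and the region are placeholders, and it assumes the worker_accelerator option is passed as a Dataflow service option, as is done for GPU accelerators:

```python
def tpu_dataflow_args(project: str, region: str,
                      tpu_type: str, topology: str,
                      machine_type: str) -> list[str]:
    # Assemble Dataflow pipeline arguments for a single-host TPU worker pool.
    # The worker_accelerator value pairs a supported TPU type and topology
    # with the required worker machine type from the table above.
    return [
        "--runner=DataflowRunner",
        f"--project={project}",
        f"--region={region}",
        f"--worker_machine_type={machine_type}",
        "--dataflow_service_options="
        f"worker_accelerator=type:{tpu_type};topology:{topology}",
    ]

# Placeholder project and region; substitute your own values.
args = tpu_dataflow_args("PROJECT_ID", "us-west1",
                         "tpu-v5-lite-podslice", "1x1",
                         "ct5lp-hightpu-1t")
```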

Regions

For information about available regions and zones for TPUs, see TPU regions and zones in the Cloud TPU documentation.


Last updated 2026-02-19 UTC.