Dataproc Identity and Access Management roles and permissions
Identity and Access Management (IAM) lets you control user and group access to project resources. This document focuses on the IAM permissions relevant to Dataproc and the IAM roles that grant those permissions.
Dataproc permissions
Security requirement beginning August 3, 2020: Dataproc users are required to have service account ActAs permission to deploy Dataproc resources, for example, to create clusters and instantiate workflows. See Roles for service account authentication for detailed information about service account permissions. Opt-in for existing Dataproc users: Existing Dataproc users as of August 3, 2020 can opt in to this security requirement (see Securing Dataproc, Dataflow, and Cloud Data Fusion).
Dataproc permissions allow users, including service accounts, to perform actions on Dataproc clusters, jobs, operations, and workflow templates. For example, the `dataproc.clusters.create` permission allows a user to create Dataproc clusters in a project. Typically, you don't grant permissions; instead, you grant roles, which include one or more permissions.
The following tables list the permissions necessary to call Dataproc APIs (methods). The tables are organized according to the APIs associated with each Dataproc resource (clusters, jobs, operations, and workflow templates).
Permission scope: The scope of the Dataproc permissions listed in the following tables is the containing Google Cloud project (`cloud-platform` scope). See Service account permissions.
Examples:
- `dataproc.clusters.create` permits the creation of Dataproc clusters in the containing project.
- `dataproc.jobs.create` permits the submission of Dataproc jobs to Dataproc clusters in the containing project.
- `dataproc.clusters.list` permits the listing of details of Dataproc clusters in the containing project.
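Since you typically grant roles rather than individual permissions, it can help to inspect which permissions a predefined role bundles. A minimal sketch using the gcloud CLI (the output includes an includedPermissions list):

```bash
# Show the full definition of the predefined Dataproc Editor role,
# including every permission it bundles.
gcloud iam roles describe roles/dataproc.editor
```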
Clusters methods required permissions
| Method | Required permissions |
|---|---|
| projects.regions.clusters.create 1, 2 | dataproc.clusters.create |
| projects.regions.clusters.get | dataproc.clusters.get |
| projects.regions.clusters.list | dataproc.clusters.list |
| projects.regions.clusters.patch 1, 2, 3 | dataproc.clusters.update |
| projects.regions.clusters.delete 1 | dataproc.clusters.delete |
| projects.regions.clusters.start | dataproc.clusters.start |
| projects.regions.clusters.stop | dataproc.clusters.stop |
| projects.regions.clusters.getIamPolicy | dataproc.clusters.getIamPolicy |
| projects.regions.clusters.setIamPolicy | dataproc.clusters.setIamPolicy |
Notes:
1. The `dataproc.operations.get` permission is also required to get status updates from the Google Cloud CLI (the custom role sketch after these notes includes this permission).
2. The `dataproc.clusters.get` permission is also required to get the result of the operation from the Google Cloud CLI.
3. The `dataproc.autoscalingPolicies.use` permission is also required to enable an autoscaling policy on a cluster.
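If the predefined roles are broader than you want, the table above maps directly onto a custom role. The following is a minimal sketch, assuming a hypothetical project ID (example-project) and role ID (dataprocClusterOperator):

```bash
# Create a custom role that permits creating and inspecting clusters.
# dataproc.operations.get is included so the gcloud CLI can poll the
# long-running create operation for status updates (see note 1).
gcloud iam roles create dataprocClusterOperator \
    --project=example-project \
    --title="Dataproc cluster operator" \
    --permissions=dataproc.clusters.create,dataproc.clusters.get,dataproc.clusters.list,dataproc.operations.get
```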
Jobs methods required permissions
| Method | Required permissions |
|---|---|
| projects.regions.jobs.submit 1, 2 | dataproc.jobs.create, dataproc.clusters.use |
| projects.regions.jobs.get | dataproc.jobs.get |
| projects.regions.jobs.list | dataproc.jobs.list |
| projects.regions.jobs.cancel 1 | dataproc.jobs.cancel |
| projects.regions.jobs.patch 1 | dataproc.jobs.update |
| projects.regions.jobs.delete 1 | dataproc.jobs.delete |
| projects.regions.jobs.getIamPolicy | dataproc.jobs.getIamPolicy |
| projects.regions.jobs.setIamPolicy | dataproc.jobs.setIamPolicy |
Notes:
1. The Google Cloud CLI also requires the `dataproc.jobs.get` permission for the `jobs submit`, `jobs wait`, `jobs update`, `jobs delete`, and `jobs kill` commands.
2. The gcloud CLI also requires the `dataproc.clusters.get` permission to submit jobs. For an example of setting the permissions necessary for a user to run `gcloud dataproc jobs submit` on a cluster using Dataproc Granular IAM, see Submitting Jobs with Granular IAM. A sample submit command follows these notes.
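To make the permission set concrete, here is a sketch of a job submission with hypothetical cluster and region values; a successful run exercises `dataproc.jobs.create` and `dataproc.clusters.use`, and, per the notes above, the gcloud CLI also uses `dataproc.jobs.get` and `dataproc.clusters.get` along the way:

```bash
# Submit the SparkPi example; the jar path below is the standard location
# on Dataproc images. The cluster and region names are placeholders.
gcloud dataproc jobs submit spark \
    --cluster=example-cluster \
    --region=us-central1 \
    --class=org.apache.spark.examples.SparkPi \
    --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
    -- 1000
```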
Operations methods required permissions
| Method | Required permissions |
|---|---|
| projects.regions.operations.get | dataproc.operations.get |
| projects.regions.operations.list | dataproc.operations.list |
| projects.regions.operations.cancel | dataproc.operations.cancel |
| projects.regions.operations.delete | dataproc.operations.delete |
| projects.regions.operations.getIamPolicy | dataproc.operations.getIamPolicy |
| projects.regions.operations.setIamPolicy | dataproc.operations.setIamPolicy |
Workflow templates methods required permissions
| Method | Required permissions |
|---|---|
| projects.regions.workflowTemplates.instantiate | dataproc.workflowTemplates.instantiate |
| projects.regions.workflowTemplates.instantiateInline | dataproc.workflowTemplates.instantiateInline |
| projects.regions.workflowTemplates.create | dataproc.workflowTemplates.create |
| projects.regions.workflowTemplates.get | dataproc.workflowTemplates.get |
| projects.regions.workflowTemplates.list | dataproc.workflowTemplates.list |
| projects.regions.workflowTemplates.update | dataproc.workflowTemplates.update |
| projects.regions.workflowTemplates.delete | dataproc.workflowTemplates.delete |
| projects.regions.workflowTemplates.getIamPolicy | dataproc.workflowTemplates.getIamPolicy |
| projects.regions.workflowTemplates.setIamPolicy | dataproc.workflowTemplates.setIamPolicy |
Notes:
- Workflow template permissions are independent of cluster and job permissions. A user without `create cluster` or `submit job` permissions can create and instantiate a workflow template.
- The Google Cloud CLI additionally requires the `dataproc.operations.get` permission to poll for workflow completion (a sample command follows these notes).
- The `dataproc.operations.cancel` permission is required to cancel a running workflow.
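For illustration, a template instantiation with hypothetical template and region names; the call itself needs `dataproc.workflowTemplates.instantiate`, and the gcloud CLI then needs `dataproc.operations.get` to poll the resulting operation:

```bash
# Instantiate an existing workflow template and wait for completion.
gcloud dataproc workflow-templates instantiate example-template \
    --region=us-central1
```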
Autoscaling policies methods required permissions
| Method | Required permissions |
|---|---|
| projects.regions.autoscalingPolicies.create | dataproc.autoscalingPolicies.create |
| projects.regions.autoscalingPolicies.get | dataproc.autoscalingPolicies.get |
| projects.regions.autoscalingPolicies.list | dataproc.autoscalingPolicies.list |
| projects.regions.autoscalingPolicies.update | dataproc.autoscalingPolicies.update |
| projects.regions.autoscalingPolicies.delete | dataproc.autoscalingPolicies.delete |
| projects.regions.autoscalingPolicies.getIamPolicy | dataproc.autoscalingPolicies.getIamPolicy |
| projects.regions.autoscalingPolicies.setIamPolicy | dataproc.autoscalingPolicies.setIamPolicy |
Notes:
- The `dataproc.autoscalingPolicies.use` permission is required to enable an autoscaling policy on a cluster with a `clusters.patch` method request, as shown in the sketch after this note.
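A sketch of such a `clusters.patch` request via the gcloud CLI, with hypothetical cluster and policy names; the caller needs `dataproc.clusters.update` plus `dataproc.autoscalingPolicies.use`:

```bash
# Attach an autoscaling policy to an existing cluster
# (this issues a clusters.patch request under the hood).
gcloud dataproc clusters update example-cluster \
    --region=us-central1 \
    --autoscaling-policy=example-policy
```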
Node groups methods required permissions
| Method | Required permissions |
|---|---|
| projects.regions.nodeGroups.create | dataproc.nodeGroups.create |
| projects.regions.nodeGroups.get | dataproc.nodeGroups.get |
| projects.regions.nodeGroups.resize | dataproc.nodeGroups.update |
Dataproc roles
Dataproc IAM roles are bundles of one or more permissions. You grant roles to users or groups to allow them to perform actions on the Dataproc resources in a project. For example, the Dataproc Viewer role contains `get` and `list` permissions, which allow a user to get and list Dataproc clusters, jobs, and operations in a project.
The following table lists roles that contain the permissions required to create and manage Dataproc clusters.
| Grant role to | Roles |
|---|---|
| User | Grant users the Dataproc Editor role on the project and the Service Account User role on the Dataproc VM service account, which provides the service account ActAs permission (see the sketch after this table). |
| Service account | Grant the Dataproc VM service account the Dataproc Worker role. |
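A sketch of both grants with hypothetical principal, project, and service account values; the second command grants the Service Account User role on the VM service account itself, satisfying the ActAs requirement described at the top of this page:

```bash
# Grant a user the Dataproc Editor role on the project.
gcloud projects add-iam-policy-binding example-project \
    --member="user:alice@example.com" \
    --role="roles/dataproc.editor"

# Grant the same user the Service Account User role on the
# Dataproc VM service account (provides ActAs permission).
gcloud iam service-accounts add-iam-policy-binding \
    vm-sa@example-project.iam.gserviceaccount.com \
    --member="user:alice@example.com" \
    --role="roles/iam.serviceAccountUser"
```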
Note the following:
You might need to grant the Dataproc VM service account additional predefined or custom roles that contain the permissions necessary for other operations, such as reading and writing data from and to Cloud Storage, BigQuery, Cloud Logging, and other Google Cloud resources.
In some projects, the Dataproc VM service account may have been automatically granted the project Editor role, which includes the Dataproc Worker role permissions plus additional permissions not needed for Dataproc data plane operations. To follow the security best practice principle of least privilege, replace the Editor role with the Dataproc Worker role (see View VM service account roles), as sketched below.
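A minimal sketch of that replacement with hypothetical project and service account values (the default Compute Engine service account shown uses your project number):

```bash
PROJECT_ID=example-project
# The default Compute Engine service account; PROJECT_NUMBER is a placeholder.
VM_SA="PROJECT_NUMBER-compute@developer.gserviceaccount.com"

# Grant the least-privilege Dataproc Worker role first ...
gcloud projects add-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:${VM_SA}" \
    --role="roles/dataproc.worker"

# ... then remove the broad Editor role.
gcloud projects remove-iam-policy-binding ${PROJECT_ID} \
    --member="serviceAccount:${VM_SA}" \
    --role="roles/editor"
```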
Do you need to grant roles?
Depending on your organization policy, a required role may already have been granted.
Check roles granted to users
To see if a user has been granted a role, follow the instructions in Manage access to projects, folders, and organizations > View current access.
Check roles granted to service accounts
To see if a service account has been granted a role, see View and manage IAM service account roles.
Check roles granted on a service account
To see if a user has been granted a role on a service account, follow the instructions in Manage access to service accounts > View current access.
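Alongside the console instructions above, the gcloud CLI can answer the same questions. A sketch with hypothetical project, user, and service account values:

```bash
# List the project-level roles granted to a specific user.
gcloud projects get-iam-policy example-project \
    --flatten="bindings[].members" \
    --filter="bindings.members:user:alice@example.com" \
    --format="value(bindings.role)"

# List the roles granted on a specific service account.
gcloud iam service-accounts get-iam-policy \
    vm-sa@example-project.iam.gserviceaccount.com
```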
Look up Dataproc roles and permissions
You can use the following table to look up Dataproc roles and their descriptions.
| Role | Description |
|---|---|
| Dataproc Administrator (roles/dataproc.admin) | Full control of Dataproc resources. |
| Dataproc Editor (roles/dataproc.editor) | Provides the permissions necessary for viewing the resources required to manage Dataproc, including machine types, networks, projects, and zones. |
| Dataproc Hub Agent (roles/dataproc.hubAgent) | Allows management of Dataproc resources. Intended for service accounts running Dataproc Hub instances. |
| Dataproc Serverless Editor | Permissions needed to run serverless sessions and batches as a user. |
| Dataproc Serverless Node | Node access to Dataproc Serverless sessions and batches. Intended for service accounts. |
| Dataproc Serverless Viewer | Permissions needed to view serverless sessions and batches. |
| Dataproc Service Agent (roles/dataproc.serviceAgent) | Gives the Dataproc service account access to service accounts, compute resources, storage resources, and Kubernetes resources. Warning: Do not grant service agent roles to any principals except service agents. |
| Dataproc Viewer (roles/dataproc.viewer) | Provides read-only access to Dataproc resources. |
| Dataproc Worker (roles/dataproc.worker) | Provides worker access to Dataproc resources. Intended for service accounts. |
Notes:
- `compute` permissions are needed or recommended to create and view Dataproc clusters when using the Google Cloud console or the Google Cloud CLI.
- To allow a user to upload files, grant the Storage Object Creator role. To allow a user to view job output, grant the Storage Object Viewer role (see the bucket-level grant sketch after these notes).
- A user must have the `monitoring.timeSeries.list` permission to view graphs on the Google Cloud console→Dataproc→Cluster details Overview tab.
- A user must have the `compute.instances.list` permission to view instance status and the master instance SSH menu on the Google Cloud console→Dataproc→Cluster details VM Instances tab. For information on Compute Engine roles, see Compute Engine→Available IAM roles.
- To create a cluster with a user-specified service account, the specified service account must have all permissions granted by the Dataproc Worker role, which include access to the Dataproc staging and temp buckets. Additional roles may be required depending on configured features. See Create a cluster with a custom VM service account for more information.
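A sketch of the bucket-level grant for job output, with hypothetical bucket and user values:

```bash
# Let a user read job output from the Dataproc staging bucket.
gcloud storage buckets add-iam-policy-binding gs://example-staging-bucket \
    --member="user:alice@example.com" \
    --role="roles/storage.objectViewer"
```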
Project roles
You can also set permissions at the project level by using the IAM project roles. The following table lists permissions associated with IAM project roles:
| Project Role | Permissions |
|---|---|
| Project Viewer | All project permissions for read-only actions that preserve state (get, list) |
| Project Editor | All Project Viewer permissions plus all project permissions for actions that modify state (create, delete, update, use, cancel, stop, start) |
| Project Owner | All Project Editor permissions plus permissions to manage access control for the project (get/set IamPolicy) and to set up project billing |
IAM roles and Dataproc operations summary
The following table lists Dataproc operations associated withproject and Dataproc roles.
| Operation | Project Editor | Project Viewer | Dataproc Admin | Dataproc Editor | Dataproc Viewer |
|---|---|---|---|---|---|
| Get/Set Dataproc IAM permissions | No | No | Yes | No | No |
| Create cluster | Yes | No | Yes | Yes | No |
| List clusters | Yes | Yes | Yes | Yes | Yes |
| Get cluster details | Yes | Yes | Yes 1, 2 | Yes 1, 2 | Yes 1, 2 |
| Update cluster | Yes | No | Yes | Yes | No |
| Delete cluster | Yes | No | Yes | Yes | No |
| Start/Stop cluster | Yes | No | Yes | Yes | No |
| Submit job | Yes | No | Yes 3 | Yes 3 | No |
| List jobs | Yes | Yes | Yes | Yes | Yes |
| Get job details | Yes | Yes | Yes 4 | Yes 4 | Yes 4 |
| Cancel job | Yes | No | Yes | Yes | No |
| Delete job | Yes | No | Yes | Yes | No |
| List operations | Yes | Yes | Yes | Yes | Yes |
| Get operation details | Yes | Yes | Yes | Yes | Yes |
| Delete operation | Yes | No | Yes | Yes | No |
Notes:
1. The performance graph is not available unless the user also has a role with the `monitoring.timeSeries.list` permission.
2. The list of VMs in the cluster will not include status information or an SSH link for the master instance unless the user also has a role with the `compute.instances.list` permission.
3. Jobs that upload files require the user to have the Storage Object Creator role or write access to the Dataproc staging bucket.
4. Job output is not available unless the user also has the Storage Object Viewer role or has been granted read access to the staging bucket for the project.
Dataproc VM access scopes
VM access scopes and IAM roles work together to limit VM access to Google Cloud APIs. For example, if cluster VMs are granted only the https://www.googleapis.com/auth/devstorage.full_control (storage-full) scope, applications running on cluster VMs can call Cloud Storage APIs, but they are not able to make requests to BigQuery, even if they are running as a VM service account that has been granted a BigQuery role with broad permissions.
A best practice is to grant the broad `cloud-platform` scope (https://www.googleapis.com/auth/cloud-platform) to VMs, and then limit VM access by granting specific IAM roles to the VM service account (see Scopes best practice).
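A sketch of that pattern with hypothetical cluster, project, and service account names: create the cluster with the broad scope, then constrain what the VM service account can actually do through IAM:

```bash
# Create a cluster whose VMs carry the broad cloud-platform scope.
gcloud dataproc clusters create example-cluster \
    --region=us-central1 \
    --scopes=https://www.googleapis.com/auth/cloud-platform

# Limit effective access through IAM instead of scopes, for example
# by granting the VM service account only read access to BigQuery data.
gcloud projects add-iam-policy-binding example-project \
    --member="serviceAccount:vm-sa@example-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"
```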
The `cloud-platform` scope is applied by default to Dataproc cluster VMs created with Dataproc image version 2.1 and higher.

Default Dataproc VM scopes: If scopes are not specified when a cluster is created (see `gcloud dataproc clusters create --scopes`), Dataproc VMs have the following default set of scopes:

- https://www.googleapis.com/auth/cloud-platform (clusters created with image version 2.1+)
- https://www.googleapis.com/auth/bigquery
- https://www.googleapis.com/auth/bigtable.admin.table
- https://www.googleapis.com/auth/bigtable.data
- https://www.googleapis.com/auth/cloud.useraccounts.readonly
- https://www.googleapis.com/auth/devstorage.full_control
- https://www.googleapis.com/auth/devstorage.read_write
- https://www.googleapis.com/auth/logging.write

If you specify scopes when creating a cluster, cluster VMs will have the scopes you specify and the following minimum set of required scopes (even if you don't specify them):

- https://www.googleapis.com/auth/cloud-platform (clusters created with image version 2.1+)
- https://www.googleapis.com/auth/cloud.useraccounts.readonly
- https://www.googleapis.com/auth/devstorage.read_write
- https://www.googleapis.com/auth/logging.write

IAM allow policy management
You grant IAM roles to principals using allow policies.You can get and set allow policies using the Google Cloud console,the IAM API, or the Google Cloud CLI.
- For the Google Cloud console, see Access control using the Google Cloud console.
- For the API, see Access control using the API.
- For the Google Cloud CLI, see Access control using the Google Cloud CLI (a sketch follows this list).
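For the gcloud CLI path, a sketch of reading and writing a cluster-level allow policy with hypothetical names (policy.yaml is a local file you edit between the two commands):

```bash
# Read the current allow policy on a cluster into a local file.
gcloud dataproc clusters get-iam-policy example-cluster \
    --region=us-central1 > policy.yaml

# After editing policy.yaml, write it back to the cluster.
gcloud dataproc clusters set-iam-policy example-cluster policy.yaml \
    --region=us-central1
```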
What's next
- Learn about Dataproc principals and roles
- Learn about Dataproc Granular IAM
- Learn more about IAM.
- Learn about Service accounts in Dataproc