Submit a Spark job by using a template
This page shows you how to use a Google APIs Explorer template to run a simple Spark job on an existing Dataproc cluster.
For other ways to submit a job to a Dataproc cluster, see the Dataproc documentation.
Before you begin
Before you can run a Dataproc job, you must create a cluster of one or more virtual machines (VMs) to run it on. You can use the APIs Explorer, the Google Cloud console, the gcloud CLI, or the Quickstarts using Cloud Client Libraries to create a cluster.
Submit a job
To submit a sample Apache Spark job that calculates a rough value for pi, fill in and execute the Google APIs Explorer Try this API template.
Note: The region, clusterName, and job parameter values are filled in for you. Confirm or replace the region and clusterName parameter values to match your cluster's region and name. The job parameter values are required to run a Spark job that is pre-installed on the Dataproc cluster's master node.
Request parameters:
Request body:
- job.placement.clusterName: The name of the cluster where the job will run (confirm or replace "example-cluster").
- job.sparkJob.args: "1000", the number of job tasks.
- job.sparkJob.jarFileUris: "file:///usr/lib/spark/examples/jars/spark-examples.jar". This is the local file path on the Dataproc cluster's master node where the jar that contains the Spark Scala job code is installed.
- job.sparkJob.mainClass: "org.apache.spark.examples.SparkPi". This is the class containing the main method of the job's pi calculation Scala application. A complete request body assembled from these values is sketched after this list.
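For reference, the request body that the template assembles from the values above looks roughly like the following sketch. Replace "example-cluster" if your cluster has a different name:

```json
{
  "job": {
    "placement": {
      "clusterName": "example-cluster"
    },
    "sparkJob": {
      "mainClass": "org.apache.spark.examples.SparkPi",
      "jarFileUris": [
        "file:///usr/lib/spark/examples/jars/spark-examples.jar"
      ],
      "args": [
        "1000"
      ]
    }
  }
}
```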
Click EXECUTE. The first time you run the API template, you may be asked to choose and sign in to your Google account, then authorize the Google APIs Explorer to access your account. If the request is successful, the JSON response shows that the job submission request is pending.
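As a rough illustration, a successful response resembles the following abridged Job resource. The project and job IDs shown here are placeholders, and the exact set of fields returned can vary:

```json
{
  "reference": {
    "projectId": "your-project-id",
    "jobId": "your-job-id"
  },
  "placement": {
    "clusterName": "example-cluster"
  },
  "status": {
    "state": "PENDING"
  }
}
```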
To view job output, open the Dataproc Jobs page in the Google Cloud console, then click the top (most recent) Job ID. Set LINE WRAP to ON to bring lines that exceed the right margin into view.
...Pi is roughly 3.141804711418047...
Clean up
To avoid incurring charges to your Google Cloud account for the resources used on this page, follow these steps.
- If you don't need the cluster to explore the other quickstarts or to run other jobs, use the APIs Explorer, the Google Cloud console, the gcloud CLI, or the Cloud Client Libraries to delete the cluster.
What's next
- Learn how to update a Dataproc cluster by using a template.