Troubleshoot batch and session connectivity

This page provides guidance on diagnosing and resolving common network connectivity issues for Serverless for Apache Spark batch workloads and interactive sessions. These issues can prevent your workloads from accessing required data sources, external services, or Google Cloud APIs.

Common symptoms and error messages

When Serverless for Apache Spark encounters connectivity problems, you might see errors such as:

  • Unable to connect to service_name.googleapis.com
  • Could not reach required Google APIs
  • Connection refused
  • Host unreachable
  • Operation timed out
  • Permission denied (often network-related when API calls are blocked)

You might also encounter errors related to accessing data in Cloud Storage, BigQuery, or other databases.

Common causes and troubleshooting tips

This section lists common causes of Serverless for Apache Spark connectivity issues and provides troubleshooting tips to help you resolve them.

Network configuration

Network misconfigurations are a frequent cause of connectivity failures. Serverless for Apache Spark workloads and sessions run on VMs with internal IP addresses, with Private Google Access (PGA) automatically enabled on the workload or session subnet to provide access to Google APIs and services. For more information, see Serverless for Apache Spark network configuration.

  • Access options:

    • Private Service Connect (PSC): You can create private endpoints within your VPC network to access specific Google APIs.

      • In the Google Cloud console, go to Private Service Connect > Endpoints. Connect endpoints, or confirm that endpoints are connected, for all required APIs, such as storage.googleapis.com and dataproc.googleapis.com, and that they connect to the batch workload or session Virtual Private Cloud network.
    • Cloud NAT: If your workload needs to access the public internet, you can configure Cloud NAT for your batch workload or session subnet:

      • In the Google Cloud console, go to the Cloud NAT page. Configure a gateway, or confirm that a gateway is configured, for the batch workload or session VPC network, region, and subnet. Also make sure firewall rules allow egress to 0.0.0.0/0. For more information, see Set up Cloud NAT.
  • Firewall rules:

    • Egress firewall rules in your VPC network (or shared VPC network host project, if applicable) must not block outbound traffic to required destinations.
      • If applicable, egress rules must allow traffic to external services, such as public APIs and databases outside of Google Cloud. If your batch workload or session needs internet access, you can use a Cloud NAT gateway to provide subnet egress.
    • Although not a common cause of connectivity issues, overly restrictive ingress rules might inadvertently block necessary return traffic or internal communications.
  • DNS resolution:

    • DNS resolution must be configured within the VPC network. Workloads and sessions must be able to resolve hostnames for Google APIs, such as storage.googleapis.com or bigquery.googleapis.com, and for external services.
    • Custom DNS servers and Cloud DNS private zones must forward or resolvequeries for Google domains.
    • If you are using Private Service Connect for private access to Google APIs, DNS records for Google services must resolve to private IP addresses within your VPC network through the PSC endpoint.
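To interpret the addresses that DNS returns for Google API domains, you can use a small helper like the following. This is an illustrative bash sketch, not a Google tool: it assumes the documented Private Google Access VIP ranges (199.36.153.8/30 for private.googleapis.com, 199.36.153.4/30 for restricted.googleapis.com) and the RFC 1918 ranges a PSC endpoint address would typically fall in.

```shell
#!/usr/bin/env bash
# Illustrative helper: classify an IPv4 address to sanity-check what DNS
# returns for *.googleapis.com domains.
#   rfc1918        -> likely a PSC endpoint in your VPC network
#   private-vip    -> private.googleapis.com range (199.36.153.8/30)
#   restricted-vip -> restricted.googleapis.com range (199.36.153.4/30)
#   public         -> default public resolution (no private path in use)
classify_ip() {
  local ip=$1
  case "$ip" in
    10.*|192.168.*)                          echo "rfc1918" ;;
    172.1[6-9].*|172.2[0-9].*|172.3[0-1].*)  echo "rfc1918" ;;
    199.36.153.[4-7])                        echo "restricted-vip" ;;
    199.36.153.[8-9]|199.36.153.1[0-1])      echo "private-vip" ;;
    *)                                       echo "public" ;;
  esac
}
```

For example, from the test VM you could run `classify_ip "$(dig +short storage.googleapis.com | head -n1)"` and expect anything other than `public` when PGA or PSC is in effect.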

Troubleshooting tips:

  • Identify network and subnet configuration:

    • From Serverless for Apache Spark batch or session details, review the networkUri and subnetUri fields.
    • In the Google Cloud console, review the settings for the VPC network and subnet.
  • Test connectivity from a Proxy VM:

    • Launch a test Compute Engine VM in the batch or session subnet using the batch or session service account.
    • From the test VM, perform the following connectivity tests:
      • nslookup storage.googleapis.com to verify DNS resolution. Look up other Google API domains, such as bigquery.googleapis.com and dataproc.googleapis.com. With Private Google Access, which is automatically enabled on Serverless for Apache Spark subnets, or with Private Service Connect, the domains must resolve to private IP addresses.
      • curl -v https://storage.googleapis.com to verify HTTPS connectivity to Google APIs. Also try connecting to other Google services.
      • ping 8.8.8.8 to test internet connectivity if your batch or session requires it. Try curl -v https://example.com if Cloud NAT is configured.
    • Run Google Cloud Network Intelligence Center connectivity tests to diagnose network paths from your subnet to relevant endpoints, such as Google APIs and external IP addresses.
  • Review Cloud Logging for network errors:

    • Review Logging for your Serverless for Apache Spark workload or session. Look for ERROR or WARNING messages related to network timeouts, connection refusals, or API call failures. Filter by jsonPayload.component="driver" or jsonPayload.component="executor" for Spark-specific network issues.
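The filter described above can be composed in a shell variable before you run it. The sketch below assumes the resource type for serverless batch logs is cloud_dataproc_batch, and my-batch is a placeholder batch ID; adjust both for your environment.

```shell
#!/usr/bin/env bash
# Sketch: build a Cloud Logging filter for network-related Spark errors from
# a serverless batch. BATCH_ID is a placeholder; resource.type is assumed to
# be "cloud_dataproc_batch" for batch workloads.
BATCH_ID="my-batch"
FILTER='resource.type="cloud_dataproc_batch"'
FILTER+=" AND resource.labels.batch_id=\"${BATCH_ID}\""
FILTER+=' AND severity>=WARNING'
FILTER+=' AND (jsonPayload.component="driver" OR jsonPayload.component="executor")'
echo "$FILTER"
# Run the query with, for example:
#   gcloud logging read "$FILTER" --limit=50 --format=json
```

Keeping the filter in one variable makes it easy to tighten incrementally, for example by appending a textPayload match for "Connection refused".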

IAM permissions

Insufficient IAM permissions can prevent workloads or sessions from accessing resources, which can surface as network-like failures when API calls are denied.

The service account used by your batch workload or session must have the required roles:

  • Dataproc Worker role (roles/dataproc.worker).
  • Data access roles, such as roles/storage.objectViewer or roles/bigquery.dataViewer.
  • Logs Writer role (roles/logging.logWriter).
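To check quickly for gaps, you can diff the roles granted to the service account against the roles listed above. This is an illustrative helper, not part of any Google CLI; PROJECT and SA_EMAIL in the comment are placeholders, and the required-role list here covers only the always-required roles (add data access roles for the sources your workload reads).

```shell
#!/usr/bin/env bash
# Illustrative helper: report required roles missing from a service account.
# Feed it one granted role per line, for example from:
#   gcloud projects get-iam-policy PROJECT \
#     --flatten="bindings[].members" \
#     --filter="bindings.members:serviceAccount:SA_EMAIL" \
#     --format="value(bindings.role)"
required_roles="roles/dataproc.worker roles/logging.logWriter"

missing_roles() {             # stdin: granted roles, one per line
  local granted role
  granted=$(cat)
  for role in $required_roles; do
    # -x matches the whole line, so roles/logging.logWriter does not
    # accidentally match a longer role name.
    grep -qx "$role" <<<"$granted" || echo "$role"
  done
}
```

An empty result means the always-required roles are present; any printed role still needs to be granted.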

Troubleshooting tips:

  • Identify the service account used by your batch workload or session, then confirm its granted roles on the IAM page in the Google Cloud console or with gcloud projects get-iam-policy.

External service configuration

If your workload connects to databases or services outside of Google Cloud, verify their configuration:

  • Verify that the external service's firewall or security group allows inbound connections from your VPC network IP ranges: internal IP addresses if you connect through VPC Network Peering, Cloud VPN, or Cloud Interconnect, or Cloud NAT IP addresses if you connect over the public internet.
  • Review database credentials and connection strings. Check connection details, usernames, and passwords.
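Before debugging credentials, it can help to confirm plain TCP reachability from the test VM to the external service. The sketch below uses bash's /dev/tcp pseudo-device, so it needs no extra tools; db.example.com and 5432 in the usage comment are placeholders for your database host and port.

```shell
#!/usr/bin/env bash
# Quick TCP reachability probe using bash's /dev/tcp pseudo-device.
# Prints "reachable" if a TCP connection opens within 3 seconds,
# "unreachable" on refusal or timeout.
check_tcp() {
  local host=$1 port=$2
  if timeout 3 bash -c "exec 3<>/dev/tcp/${host}/${port}" 2>/dev/null; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}
# Example (placeholders):
#   check_tcp db.example.com 5432
```

"unreachable" points at firewalls, security groups, routing, or NAT rather than at credentials, which narrows the search considerably.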

What's next

Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2026-02-19 UTC.