Control Dataflow log ingestion
Exclusion filters let you control the volume of Dataflow logs ingested by Cloud Logging while still making verbose logging available for debugging. You can use exclusion filters to exclude matching log entries from being ingested by Cloud Logging or from being routed to the destination of the sink. Create exclusion filters by using the Logging query language, which lets you specify a subset of all log entries in your selected Google Cloud resource, such as a project or a folder.

By using exclusion filters, you can reduce the Cloud Logging costs incurred by Dataflow log ingestion. For more information about log ingestion pricing for Cloud Logging, see the Cloud Logging pricing summary. For more details about how exclusion filters work and their limitations, see Exclusion filters in the Cloud Logging documentation.

Dataflow jobs emit multiple log types. This page demonstrates how to filter Dataflow job logs and worker logs.
Create log exclusion filters
This example creates an exclusion filter on the `_Default` Cloud Logging sink. The filter excludes all `DEFAULT`, `DEBUG`, `INFO`, and `NOTICE` severity Dataflow logs from being ingested into Cloud Logging. `WARNING`, `ERROR`, `CRITICAL`, `ALERT`, and `EMERGENCY` severity logs are still captured. For more information about supported log levels, see LogSeverity.
Before you begin
- Sign in to your Google Cloud Platform account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
In the Google Cloud console, on the project selector page, select or create a Google Cloud project.
Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

Roles required to select or create a project
- Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
- Create a project: To create a project, you need the Project Creator role (`roles/resourcemanager.projectCreator`), which contains the `resourcemanager.projects.create` permission. Learn how to grant roles.
Verify that billing is enabled for your Google Cloud project.
Permissions
As you get started, ensure the following:
You have a Google Cloud project with logs that you can see in the Logs Explorer.

You have one of the following IAM roles for the source Google Cloud project from which you're routing logs.
- Owner (`roles/owner`)
- Logging Admin (`roles/logging.admin`)
- Logs Configuration Writer (`roles/logging.configWriter`)

The permissions contained in these roles let you create, delete, or modify sinks. For information on setting IAM roles, see the Logging Access control guide.
You have a resource in a supported destination or can create one.

You need to create the routing destination before the sink, through either the Google Cloud CLI, the Google Cloud console, or the Google Cloud APIs. You can create the destination in any Google Cloud project in any organization. Before you create the destination, make sure the service account from the sink has permissions to write to the destination.
Add an exclusion filter
The following steps demonstrate how to add a Cloud Logging exclusion filter to your Dataflow logs. This exclusion filter selects all Dataflow log entries with the severity `DEFAULT`, `DEBUG`, `INFO`, or `NOTICE` from jobs that have a Dataflow job name that does not end in the string `debug`. The filter excludes these logs from ingestion into the `_Default` Cloud Logging bucket.
In the Google Cloud console, go to the Logs Router page:
Find the row with the `_Default` sink, expand the Actions option, and then click Edit sink.

In Choose logs to filter out of sink, for Build an exclusion filter, click Add exclusion.
Enter a name for your exclusion filter.
In the Build an exclusion filter section, paste the following text into the box:
```
resource.type="dataflow_step" AND
labels."dataflow.googleapis.com/job_name" !~ ".*debug" AND
severity = (DEFAULT OR DEBUG OR INFO OR NOTICE)
```

- The first line selects all log entries generated by the Dataflow service.
- The second line selects all log entries where the `job_name` field does not end with the string `debug`.
- The third line selects all log entries with the severity `DEFAULT`, `DEBUG`, `INFO`, or `NOTICE`.
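To see how the job-name clause behaves, its RE2-style partial matching can be sketched with Python's `re` module (an illustration only, not the Cloud Logging implementation; `is_excluded` is a hypothetical helper):

```python
import re

# Mirrors the clause: labels."dataflow.googleapis.com/job_name" !~ ".*debug"
# Like RE2 partial matching, re.search is unanchored, so any job name
# containing "debug" matches the pattern and its logs are therefore kept.
JOB_NAME_PATTERN = re.compile(r".*debug")

def is_excluded(job_name: str) -> bool:
    """True if a job's low-severity logs would be dropped by the filter."""
    return JOB_NAME_PATTERN.search(job_name) is None

print(is_excluded("daily-etl"))        # True: filter matches, logs dropped
print(is_excluded("daily-etl-debug"))  # False: logs kept for debugging
```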
Click Update sink.
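If you prefer the command line, a similar exclusion can be added with the gcloud CLI. This is a sketch, assuming the gcloud CLI is installed and authenticated; the exclusion name `dataflow-low-severity` is a placeholder of your choosing:

```shell
# Add an exclusion to the _Default sink; matching entries are not ingested.
gcloud logging sinks update _Default \
  --add-exclusion=name=dataflow-low-severity,filter='resource.type="dataflow_step" AND labels."dataflow.googleapis.com/job_name"!~".*debug" AND severity=(DEFAULT OR DEBUG OR INFO OR NOTICE)'
```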
Test your exclusion filter
You can verify that the filter is working correctly by running a sample Dataflow job and then viewing the logs.
After your job starts running, to view job logs, complete the following steps:
In the Google Cloud console, go to the Dataflow Jobs page.
A list of Dataflow jobs appears along with their status.
Select a job.
On the Job details page, in the Logs panel, click Show.
Verify that no logs appear in the Job logs panel and that no `DEFAULT`, `DEBUG`, `INFO`, or `NOTICE` logs appear in the Worker logs panel.
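You can also check from the command line whether any low-severity Dataflow entries were still ingested (a sketch; gcloud CLI assumed):

```shell
# List remaining Dataflow log entries at or below NOTICE severity from the
# last day. With the exclusion filter active, this should return no entries
# for jobs whose names don't end in "debug".
gcloud logging read 'resource.type="dataflow_step" AND severity<=NOTICE' \
  --limit=10 --freshness=1d
```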
Bypass the exclusion filter
The Dataflow job name (`job_name`) provides a bypass mechanism for scenarios where the generated Dataflow logs need to be captured. You can use this bypass to rerun a failed job and capture all the log information.

The filter created in this scenario retains all log entries when the `job_name` field ends with the string `debug`. When you want to bypass the exclusion filter and display all logs for a Dataflow job, append `debug` to the job name. For example, to bypass the exclusion filter, you could use the job name `dataflow-job-debug`.
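For instance, launching a classic template job under a name that ends in `debug` keeps all of its logs. This is a sketch using the public Word Count template; `YOUR_BUCKET` is a placeholder you must replace:

```shell
# Run the Word Count template under a job name ending in "debug", so the
# exclusion filter does not match and all severities are ingested.
gcloud dataflow jobs run my-wordcount-debug \
  --gcs-location=gs://dataflow-templates/latest/Word_Count \
  --region=us-central1 \
  --parameters=inputFile=gs://dataflow-samples/shakespeare/kinglear.txt,output=gs://YOUR_BUCKET/output
```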
Compare log counts
If you want to compare the volume of logs ingested with and without the exclusion filter, run one job with `debug` appended to the job name and one without. Use the system-defined, logs-based metric Log bytes to view and compare the ingestion data. For more information about viewing ingestion data, see View ingestion data in Metrics Explorer.
Create an external destination
Optionally, after you create the exclusion filter, you can create an additional Cloud Logging sink. Use this sink to redirect the complete set of Dataflow logs into a supported external destination, such as BigQuery, Pub/Sub, or Splunk.
In this scenario, the external logs aren't stored in Logs Explorer but are available in the external destination. Using an external destination gives you more control over the costs incurred by storing logs in Logs Explorer.
For steps detailing how to control how Cloud Logging routes logs, see Configure and manage sinks. To capture all Dataflow logs in an external destination, in the Choose logs to include in sink panel, in the Build inclusion filter field, enter the following filter expression:

```
resource.type="dataflow_step"
```

To find log entries that you routed from Cloud Logging to supported destinations, see View logs in sink destinations.
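As a sketch, a sink that routes all Dataflow logs to a BigQuery dataset could be created with the gcloud CLI; `PROJECT_ID` and `DATASET_ID` are placeholders, and the dataset must already exist:

```shell
# Route every Dataflow log entry to an existing BigQuery dataset.
gcloud logging sinks create dataflow-all-logs \
  bigquery.googleapis.com/projects/PROJECT_ID/datasets/DATASET_ID \
  --log-filter='resource.type="dataflow_step"'
# Grant the sink's writer identity (printed by the previous command)
# write access on the dataset before logs will flow.
```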
Track Dataflow log messages by severity
Exclusion filters do not apply to user-defined logs-based metrics. These metrics count the number of log entries that match a given filter or record particular values within the matching log entries. To track counts of Dataflow log messages based on severity, you can create a logs-based metric for the Dataflow logs. The logs are tracked even when the log messages are excluded from ingestion.
You're billed for user-defined logs-based metrics. For pricing information, see Chargeable metrics.
To configure user-defined logs-based metrics, see Create a counter metric. To track the Dataflow logs, in the Filter selection section, in the Build filter box, enter the following text:
```
resource.type="dataflow_step"
```
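A similar counter metric can be created from the command line (a sketch; the metric name `dataflow-log-count` is a placeholder of your choosing):

```shell
# Create a user-defined counter metric over all Dataflow log entries.
gcloud logging metrics create dataflow-log-count \
  --description="Counts Dataflow log entries" \
  --log-filter='resource.type="dataflow_step"'
```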
Last updated 2026-02-19 UTC.