GCP Dataflow templates
In this tutorial, you’ll learn how to ship logs directly from the Google Cloud Console with the Dataflow template for analyzing GCP Audit Logs in the Elastic Stack.
You’ll learn how to:
- Export GCP audit logs through Pub/Sub topics and subscriptions (a CLI sketch follows this list).
- Ingest logs using Google Dataflow and view those logs in Kibana.
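As a preview of those steps, here is a minimal CLI sketch of the Pub/Sub setup, assuming illustrative names (my-project, gcp-audit-logs, gcp-audit-logs-sub, gcp-audit-sink) and a deliberately broad audit-log filter; adapt all of them to your environment.

```bash
# Create the Pub/Sub topic that will receive audit log entries.
gcloud pubsub topics create gcp-audit-logs --project=my-project

# Create the subscription the Dataflow job will read from.
gcloud pubsub subscriptions create gcp-audit-logs-sub \
  --topic=gcp-audit-logs --project=my-project

# Route audit logs into the topic with a Cloud Logging sink.
# The filter matches Cloud Audit Logs entries; tighten it as needed.
gcloud logging sinks create gcp-audit-sink \
  pubsub.googleapis.com/projects/my-project/topics/gcp-audit-logs \
  --log-filter='logName:"cloudaudit.googleapis.com"' \
  --project=my-project
```

The sink command prints a writer identity (a service account); grant it the Pub/Sub Publisher role on the topic so log entries can actually be delivered.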
Create an Elastic Cloud Hosted deployment or an Elastic Observability Serverless project. Both include an Elasticsearch cluster for storing and searching your data and Kibana for visualizing and managing your data.
This tutorial assumes the Elasticsearch cluster is already running.
For Elastic Cloud Hosted deployments, you need your Cloud ID and an API key.
To find the Cloud ID of your deployment, go to the deployment's Overview page.

For Elastic Observability Serverless projects, you need your Elasticsearch endpoint URL and an API key.
To find your endpoint URL, select Manage next to your project, then find the Elasticsearch endpoint under Application endpoints, cluster and component IDs. Alternatively, open your project, select the help icon, then select Connection details.
Use Kibana to create a Base64-encoded API key to authenticate on your deployment.
You can optionally restrict the privileges of your API key; otherwise, it grants a point-in-time snapshot of the permissions of the authenticated user. For this tutorial, the data is written to the logs-gcp.audit-default data stream.
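If you'd rather script this than use the Kibana UI, the request below is a sketch of the Elasticsearch create API key endpoint with a role restricted to that data stream. The endpoint URL, credentials, key name, and role name are placeholders; in recent Elasticsearch versions the response includes an encoded field, which is the Base64 value the Dataflow template expects.

```bash
# Placeholder endpoint and credentials; substitute your own.
curl -X POST "https://my-deployment.es.us-central1.gcp.cloud.es.io/_security/api_key" \
  -u elastic:changeme \
  -H "Content-Type: application/json" \
  -d '{
    "name": "gcp-dataflow-key",
    "role_descriptors": {
      "gcp_audit_writer": {
        "indices": [
          {
            "names": ["logs-gcp.audit-default"],
            "privileges": ["auto_configure", "create_doc"]
          }
        ]
      }
    }
  }'
# The response contains "id", "api_key", and (in recent versions) "encoded".
# "encoded" is Base64("id:api_key"), ready to use as the API key value.
```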
You'll start by installing the Elastic GCP integration to add pre-built dashboards, ingest node configurations, and other assets that help you get the most out of the GCP logs you ingest.
1. Find Integrations in the main menu or use the global search field.
2. Search for `gcp`.
3. Click the Elastic Google Cloud Platform (GCP) integration to see more details about it, then click Add Google Cloud Platform (GCP).
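Looking ahead to the ingestion step, the sketch below launches the Google-provided Pub/Sub to Elasticsearch Dataflow flex template from the CLI rather than the Cloud Console. Every name here is a placeholder, and the template path and parameter names (inputSubscription, connectionUrl, apiKey, dataset, namespace, errorOutputTopic) reflect the template's documentation at the time of writing; verify them in the Dataflow template UI before running.

```bash
# Placeholders throughout; confirm the template location and parameters
# for your region and template version in the Google Cloud Console.
gcloud dataflow flex-template run "gcp-audit-to-elastic" \
  --project=my-project \
  --region=us-central1 \
  --template-file-gcs-location="gs://dataflow-templates-us-central1/latest/flex/PubSub_to_Elasticsearch" \
  --parameters=inputSubscription=projects/my-project/subscriptions/gcp-audit-logs-sub \
  --parameters=connectionUrl="$ELASTIC_CLOUD_ID" \
  --parameters=apiKey="$ELASTIC_ENCODED_API_KEY" \
  --parameters=dataset=audit \
  --parameters=namespace=default \
  --parameters=errorOutputTopic=projects/my-project/topics/gcp-audit-errors
```

Per the template docs, connectionUrl accepts either an Elasticsearch URL or an Elastic Cloud ID, apiKey takes the Base64-encoded key created earlier, and dataset=audit with namespace=default routes events into the logs-gcp.audit-default data stream.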