
GCP Dataflow templates

In this tutorial, you’ll learn how to ship logs directly from the Google Cloud Console with the Dataflow template for analyzing GCP Audit Logs in the Elastic Stack.

You’ll learn how to:

  • Export GCP audit logs through Pub/Sub topics and subscriptions.
  • Ingest logs using Google Dataflow and view those logs in Kibana.

Create an Elastic Cloud Hosted deployment or Elastic Observability Serverless project. Both include an Elasticsearch cluster for storing and searching your data and Kibana for visualizing and managing your data.

This tutorial assumes the Elasticsearch cluster is already running.

Use Kibana to create a Base64-encoded API key to authenticate on your deployment.

Important

You can optionally restrict the privileges of your API key; otherwise it is a point-in-time snapshot of the permissions of the authenticated user. For this tutorial, the data is written to the logs-gcp.audit-default data stream.
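If you prefer to create the key programmatically rather than through the Kibana UI, the Elasticsearch create API key endpoint can return the Base64-encoded value directly. Below is a minimal Python sketch; the endpoint URL and credentials are hypothetical placeholders, and the role descriptor is one possible restricted privilege set for writing to the logs-gcp.audit-default data stream (your deployment may need broader privileges):

```python
import base64
import requests

# Hypothetical values: replace with your deployment's endpoint and credentials.
ES_URL = "https://my-deployment.es.us-central1.gcp.cloud.es.io"
AUTH = ("elastic", "<password>")  # any user with the manage_api_key privilege

# Create an API key restricted to writing the data stream used in this tutorial.
resp = requests.post(
    f"{ES_URL}/_security/api_key",
    auth=AUTH,
    json={
        "name": "gcp-dataflow-tutorial",
        "role_descriptors": {
            "gcp_audit_writer": {
                "index": [
                    {
                        "names": ["logs-gcp.audit-default"],
                        "privileges": ["auto_configure", "create_doc"],
                    }
                ]
            }
        },
    },
)
resp.raise_for_status()
key = resp.json()

# Recent Elasticsearch versions return the Base64 form as "encoded";
# older ones require encoding "id:api_key" yourself.
encoded = key.get("encoded") or base64.b64encode(
    f"{key['id']}:{key['api_key']}".encode()
).decode()
print(encoded)  # use this value wherever the tutorial asks for the API key
```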

Step 1: Install the GCP integration

You’ll start by installing the Elastic GCP integration to add pre-built dashboards, ingest node configurations, and other assets that help you get the most out of the GCP logs you ingest.

  1. Find Integrations in the main menu or use the global search field.

  2. Search for gcp.

    Kibana integrations

  3. Click the Elastic Google Cloud Platform (GCP) integration to see more details about it, then click Add Google Cloud Platform (GCP).

    GCP integration

  4. Click Save integration.
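The integration can also be installed without the UI through Kibana's Fleet API. A sketch, assuming a hypothetical Kibana URL and credentials with sufficient Fleet privileges; note that the exact path (and whether a package version must be appended) can vary by Kibana version:

```python
import requests

# Hypothetical values: replace with your Kibana endpoint and credentials.
KIBANA_URL = "https://my-deployment.kb.us-central1.gcp.cloud.es.io"
API_KEY = "<Base64-encoded API key with Fleet privileges>"

# Install the gcp package; Fleet requires the kbn-xsrf header on write requests.
resp = requests.post(
    f"{KIBANA_URL}/api/fleet/epm/packages/gcp",  # some versions expect /gcp/<version>
    headers={
        "Authorization": f"ApiKey {API_KEY}",
        "kbn-xsrf": "true",
    },
)
resp.raise_for_status()
print(resp.json())
```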

Step 2: Create a Pub/Sub topic and subscription

Before configuring the Dataflow template, create a Pub/Sub topic and subscription from your Google Cloud Console where you can send your logs from Google Operations Suite. There are three available filesets: audit, vpcflow, firewall. This tutorial covers the audit fileset.

  1. Go to the Logs Router page to configure GCP to export logs to a Pub/Sub topic. Use the search bar to find the page:

    Navigate to Logs Router page

  2. To set up the logs routing sink, click Create sink. Set sink name as monitor-gcp-audit-sink. Select the Cloud Pub/Sub topic as the sink service and Create new Cloud Pub/Sub topic named monitor-gcp-audit:

    Create Pub/Sub topic

  3. Finally, under Choose logs to include in sink, add logName:"cloudaudit.googleapis.com" (it includes all audit logs). Click Create sink. It will look something like the following:

    Create logs routing sink
  4. Now go to the Pub/Sub page to add a subscription to the topic you just created. Use the search bar to find the page:

    GCP Pub/Sub

  5. To add a subscription to the monitor-gcp-audit topic, click Create subscription:

    Create GCP Pub/Sub Subscription

  6. Set monitor-gcp-audit-sub as the Subscription ID and leave the Delivery type as Pull:

    GCP Pub/Sub Subscription ID

  7. Finally, scroll down and click Create.
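If you prefer to script these resources instead of clicking through the console, the following Python sketch creates the same topic, pull subscription, and routing sink with the google-cloud-pubsub and google-cloud-logging client libraries. The project ID is a hypothetical placeholder, and unlike the console flow you must grant the sink's writer identity permission to publish to the topic yourself:

```python
from google.cloud import logging_v2, pubsub_v1

PROJECT = "my-gcp-project"  # hypothetical project ID

# 1. Create the topic that audit logs will be routed to.
publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(PROJECT, "monitor-gcp-audit")
publisher.create_topic(request={"name": topic_path})

# 2. Create the pull subscription the Dataflow job will read from.
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(PROJECT, "monitor-gcp-audit-sub")
subscriber.create_subscription(request={"name": sub_path, "topic": topic_path})

# 3. Create the logs routing sink that sends audit logs to the topic.
client = logging_v2.Client(project=PROJECT)
sink = client.sink(
    "monitor-gcp-audit-sink",
    filter_='logName:"cloudaudit.googleapis.com"',
    destination=f"pubsub.googleapis.com/{topic_path}",
)
sink.create(unique_writer_identity=True)

# The console grants this automatically; here you must bind it yourself, e.g.:
#   gcloud pubsub topics add-iam-policy-binding monitor-gcp-audit \
#     --member="<writer identity printed below>" --role="roles/pubsub.publisher"
print(sink.writer_identity)
```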

Step 3: Configure the Google Dataflow template

After creating a Pub/Sub topic and subscription, go to the Dataflow Jobs page and configure your template to use them. Use the search bar to find the page:

GCP Dataflow Jobs

To create a job, click Create Job From Template. Set Job name as auditlogs-stream and select Pub/Sub to Elasticsearch from the Dataflow template dropdown menu:

GCP Dataflow Pub/Sub to Elasticsearch

Before running the job, fill in the required parameters:

GCP Dataflow Required Parameters

Note

For Cloud Pub/Sub subscription, use the subscription you created in the previous step. Use the values you obtained earlier for the following fields:

  • For Elastic Cloud Hosted deployments, your Cloud ID.
  • For Serverless projects, your Elasticsearch endpoint URL in the format https://hostname:port.
  • Your Base64-encoded API key.

If you don’t have an Error output topic, create one like you did in the previous step.

After filling in the required parameters, click Show Optional Parameters and add audit as the log type parameter.

GCP Dataflow Optional Parameters

When you are all set, click Run Job and wait for Dataflow to execute the template, which takes a few minutes.
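Running the job from the console is all this tutorial requires, but the same template can be launched programmatically. A minimal sketch using the google-cloud-dataflow-client library; the project, region, error topic, and template path are hypothetical placeholders, and the parameter names should be verified against the current Pub/Sub to Elasticsearch template documentation:

```python
from google.cloud import dataflow_v1beta3

PROJECT = "my-gcp-project"  # hypothetical
REGION = "us-central1"      # hypothetical

client = dataflow_v1beta3.FlexTemplatesServiceClient()
response = client.launch_flex_template(
    request=dataflow_v1beta3.LaunchFlexTemplateRequest(
        project_id=PROJECT,
        location=REGION,
        launch_parameter=dataflow_v1beta3.LaunchFlexTemplateParameter(
            job_name="auditlogs-stream",
            # Google-hosted template; the exact path may differ by region/version.
            container_spec_gcs_path=(
                f"gs://dataflow-templates-{REGION}/latest/flex/PubSub_to_Elasticsearch"
            ),
            parameters={
                "inputSubscription": f"projects/{PROJECT}/subscriptions/monitor-gcp-audit-sub",
                "connectionUrl": "<Cloud ID or Elasticsearch endpoint URL>",
                "apiKey": "<Base64-encoded API key>",
                "errorOutputTopic": f"projects/{PROJECT}/topics/auditlogs-errors",  # hypothetical topic
                "dataset": "audit",  # the log type optional parameter
            },
        ),
    )
)
print(response.job.id)
```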

Finally, navigate to Kibana to see your logs parsed and visualized in the [Logs GCP] Audit dashboard.

    GCP audit overview dashboard
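If you want to sanity-check ingestion outside the dashboard, you can also query the data stream directly. A sketch assuming credentials with read access (the write-only API key created earlier won't authorize searches):

```python
import requests

ES_URL = "https://my-deployment.es.us-central1.gcp.cloud.es.io"  # hypothetical

# Fetch the most recent document from the tutorial's data stream.
resp = requests.get(
    f"{ES_URL}/logs-gcp.audit-default/_search",
    params={"size": 1, "sort": "@timestamp:desc"},
    auth=("elastic", "<password>"),  # needs read access to the data stream
)
resp.raise_for_status()
hits = resp.json()["hits"]
print("total docs:", hits["total"]["value"])
if hits["hits"]:
    print("latest:", hits["hits"][0]["_source"]["@timestamp"])
```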

Besides collecting audit logs from your Google Cloud Platform, you can also use Dataflow integrations to ingest data directly into Elastic from Google BigQuery and Google Cloud Storage.
