Deploy log streaming from Google Cloud to Datadog

Last reviewed 2024-12-10 UTC

This document describes how you deploy a Cloud Logging log sink and a Dataflow pipeline to stream logs from Google Cloud to Datadog. It assumes that you're familiar with the reference architecture in Stream logs from Google Cloud to Datadog.

These instructions are intended for IT professionals who want to stream logs from Google Cloud to Datadog. Although it's not required, having experience with the following Google products is useful for deploying this architecture:

  • Dataflow pipelines
  • Pub/Sub
  • Cloud Logging
  • Identity and Access Management (IAM)
  • Cloud Storage

You must have a Datadog account to complete this deployment. However, you don't need any familiarity with Datadog Log Management.

Architecture

The following diagram shows the architecture that's described in this document. The diagram demonstrates how log files that are generated by Google Cloud are ingested by Datadog and shown to Datadog users.

Log file ingestion from Google Cloud to Datadog Log Management.

As shown in the preceding diagram, the following events occur:

  1. Cloud Logging collects log files from a Google Cloud project into a designated Cloud Logging log sink and then forwards them to a Pub/Sub topic.
  2. A Dataflow pipeline pulls the logs from the Pub/Sub topic, batches them, compresses them into a payload, and then delivers them to Datadog.
    1. If there's a delivery failure, a secondary Dataflow pipeline sends messages from a dead-letter topic back to the primary log-forwarding topic to be redelivered.
  3. The logs arrive in Datadog for further analysis and monitoring.

For more information, see the Architecture section of the reference architecture.

Objectives

  • Create the secure networking infrastructure.
  • Create the logging and Pub/Sub infrastructure.
  • Create the credentials and storage infrastructure.
  • Create the Dataflow infrastructure.
  • Validate that Datadog Log Explorer received logs.
  • Manage delivery errors.

Costs

In this document, you use the following billable components of Google Cloud:

To generate a cost estimate based on your projected usage, use the pricing calculator.

New Google Cloud users might be eligible for a free trial.

You also use the following billable components for Datadog:

Before you begin

  1. Sign in to your Google Cloud account. If you're new to Google Cloud, create an account to evaluate how our products perform in real-world scenarios. New customers also get $300 in free credits to run, test, and deploy workloads.
  2. In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

    Roles required to select or create a project

    • Select a project: Selecting a project doesn't require a specific IAM role—you can select any project that you've been granted a role on.
    • Create a project: To create a project, you need the Project Creator role (roles/resourcemanager.projectCreator), which contains the resourcemanager.projects.create permission. Learn how to grant roles.
    Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

    Go to project selector

  3. Verify that billing is enabled for your Google Cloud project.

  4. Enable the Cloud Monitoring, Secret Manager, Compute Engine, Pub/Sub, Logging, and Dataflow APIs.

    Roles required to enable APIs

    To enable APIs, you need the Service Usage Admin IAM role (roles/serviceusage.serviceUsageAdmin), which contains the serviceusage.services.enable permission. Learn how to grant roles.

    Enable the APIs

IAM role requirements

  • Make sure that you have the following role or roles on the project:
    • Compute > Compute Network Admin
    • Compute > Compute Security Admin
    • Dataflow > Dataflow Admin
    • Dataflow > Dataflow Worker
    • IAM > Project IAM Admin
    • IAM > Service Account Admin
    • IAM > Service Account User
    • Logging > Logs Configuration Writer
    • Logging > Logs Viewer
    • Pub/Sub > Pub/Sub Admin
    • Secret Manager > Secret Manager Admin
    • Storage > Storage Admin

    Check for the roles

    1. In the Google Cloud console, go to the IAM page.

      Go to IAM
    2. Select the project.
    3. In the Principal column, find all rows that identify you or a group that you're included in. To learn which groups you're included in, contact your administrator.

    4. For all rows that specify or include you, check the Role column to see whether the list of roles includes the required roles.

    Grant the roles

    1. In the Google Cloud console, go to the IAM page.

      Go to IAM
    2. Select the project.
    3. Click Grant access.
    4. In the New principals field, enter your user identifier. This is typically the email address for a Google Account.

    5. Click Select a role, then search for the role.
    6. To grant additional roles, click Add another role and add each additional role.
    7. Click Save.

Create network infrastructure

This section describes how to create your network infrastructure to support the deployment of a Cloud Logging log sink and a Dataflow pipeline to stream logs from Google Cloud to Datadog.

Create a Virtual Private Cloud (VPC) network and subnet

To host the Dataflow pipeline worker VMs, create a Virtual Private Cloud (VPC) network and subnet:

  1. In the Google Cloud console, go to the VPC networks page.

    Go to VPC networks

  2. Click Create VPC network.

  3. In the Name field, provide a name for the network.

  4. In the Subnets section, provide a name, region, and IP address range for the subnetwork. The size of the IP address range might vary based on your environment. A subnet mask of length /24 is sufficient for most use cases.

  5. In the Private Google Access section, select On.

  6. Click Done and then click Create.
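
If you prefer to script this step, the following is a minimal sketch that creates an equivalent network and subnet with the google-cloud-compute Python client. The project ID, region, resource names, and IP range are illustrative assumptions; adjust them for your environment.

```python
# Sketch: create the VPC network and subnet programmatically.
# Project ID, region, names, and the 10.0.0.0/24 range are assumptions.
from google.cloud import compute_v1

project = "my-project-id"          # assumed project ID
region = "us-central1"             # assumed region
network_name = "datadog-export-net"
subnet_name = "datadog-export-subnet"

# Create a custom-mode VPC network (no automatic subnets).
network = compute_v1.Network(name=network_name, auto_create_subnetworks=False)
compute_v1.NetworksClient().insert(
    project=project, network_resource=network
).result()

# Create a subnet with Private Google Access enabled for the Dataflow workers.
subnet = compute_v1.Subnetwork(
    name=subnet_name,
    ip_cidr_range="10.0.0.0/24",
    network=f"projects/{project}/global/networks/{network_name}",
    private_ip_google_access=True,
)
compute_v1.SubnetworksClient().insert(
    project=project, region=region, subnetwork_resource=subnet
).result()
```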

Create a VPC firewall rule

To restrict traffic to the Dataflow VMs, create a VPC firewall rule:

  1. In the Google Cloud console, go to the Create a firewall rule page.

    Go to Create a firewall rule

  2. In the Name field, provide a name for the rule.

  3. In the Description field, explain what the rule does.

  4. In the Network list, select the network for your Dataflow VMs.

  5. In the Priority field, specify the order in which this rule is applied. Set the Priority to 0.

    Rules with lower numbers get prioritized first. The default value for this field is 1,000.

  6. In the Direction of traffic section, select Ingress.

  7. In the Action on match section, select Allow.

Create targets, source tags, protocols, and ports

  1. In the Google Cloud console, go to the Create a firewall rule page.

    Go to Create a firewall rule

  2. Find the Targets list and select Specified target tags.

  3. In the Target tags field, enter dataflow.

  4. In the Source filter list, select Source tags.

  5. In the Source tags field, enter dataflow.

  6. In the Protocols and Ports section, complete the following tasks:

    1. Select Specified protocols and ports.
    2. Select the TCP checkbox.
    3. In the Ports field, enter 12345-12346.
  7. Click Create.
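
The same rule can be created programmatically. The following is a minimal sketch with the google-cloud-compute client; the rule name, project ID, and network name are assumptions, while the tag, ports, and priority mirror the console steps above.

```python
# Sketch: ingress rule allowing TCP 12345-12346 between VMs tagged "dataflow".
# Rule name, project ID, and network name are assumptions.
from google.cloud import compute_v1

project = "my-project-id"          # assumed project ID
network = "datadog-export-net"     # assumed network name

firewall = compute_v1.Firewall(
    name="allow-dataflow-internal",
    network=f"projects/{project}/global/networks/{network}",
    direction="INGRESS",
    priority=0,
    source_tags=["dataflow"],
    target_tags=["dataflow"],
    allowed=[compute_v1.Allowed(I_p_protocol="tcp", ports=["12345-12346"])],
)
compute_v1.FirewallsClient().insert(
    project=project, firewall_resource=firewall
).result()
```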

Create a Cloud NAT gateway

To help enable secure outbound connections between Google Cloud and Datadog, create a Cloud NAT gateway:

  1. In the Google Cloud console, go to the Cloud NAT page.

    Go to Cloud NAT

  2. On the Cloud NAT page, click Create Cloud NAT gateway.

  3. In the Gateway name field, provide a name for the gateway.

  4. In the NAT type section, select Public.

  5. In the Select Cloud Router section, in the Network list, select your network from the list of available networks.

  6. In the Region list, select the region that contains your Cloud Router.

  7. In the Cloud Router list, select or create a new router in the same network and region.

  8. In the Cloud NAT mapping section, in the Cloud NAT IP addresses list, select Automatic.

  9. Click Create.
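
For a scripted setup, a Cloud Router with an attached NAT configuration can be created as in the sketch below, using the google-cloud-compute client. The router and gateway names, project ID, region, and network name are assumptions; automatic IP allocation mirrors the console steps above.

```python
# Sketch: Cloud Router with a Cloud NAT config for all subnet ranges.
# Names, project ID, region, and network are assumptions.
from google.cloud import compute_v1

project = "my-project-id"          # assumed project ID
region = "us-central1"             # assumed region
network = "datadog-export-net"     # assumed network name

router = compute_v1.Router(
    name="datadog-export-router",
    network=f"projects/{project}/global/networks/{network}",
    nats=[
        compute_v1.RouterNat(
            name="datadog-export-nat",
            nat_ip_allocate_option="AUTO_ONLY",
            source_subnetwork_ip_ranges_to_nat="ALL_SUBNETWORKS_ALL_IP_RANGES",
        )
    ],
)
compute_v1.RoutersClient().insert(
    project=project, region=region, router_resource=router
).result()
```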

Create logging and Pub/Sub infrastructure

Create Pub/Sub topics and subscriptions to receive and forward your logs, and to handle any delivery failures.

  1. In the Google Cloud console, go to the Create a Pub/Sub topic page.

    Go to Create a Pub/Sub topic

  2. In the Topic ID field, provide a name for the topic.

    1. Leave the Add a default subscription checkbox selected.
  3. Click Create.

  4. To handle any log messages that are rejected by the Datadog API, create an additional topic and default subscription by repeating the steps in this procedure.

    The additional topic is used within the Datadog Dataflow template as part of the path configuration for the outputDeadletterTopic template parameter.
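
As an alternative to the console steps, the following is a minimal sketch that creates both topics, each with a default subscription, using the google-cloud-pubsub client. The project ID and topic names are assumptions.

```python
# Sketch: create the input and dead-letter topics plus default subscriptions.
# Project ID and topic names are assumptions.
from google.cloud import pubsub_v1

project = "my-project-id"  # assumed project ID
publisher = pubsub_v1.PublisherClient()
subscriber = pubsub_v1.SubscriberClient()

for topic_id in ("datadog-export-input", "datadog-export-deadletter"):
    topic_path = publisher.topic_path(project, topic_id)
    publisher.create_topic(request={"name": topic_path})

    # Equivalent of leaving the "Add a default subscription" checkbox selected.
    sub_path = subscriber.subscription_path(project, f"{topic_id}-sub")
    subscriber.create_subscription(
        request={"name": sub_path, "topic": topic_path}
    )
```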

Route the logs to Pub/Sub

This deployment describes how to create a project-level Cloud Logging log sink. However, you can also create an organization-level aggregated sink that combines logs from multiple projects by setting the includeChildren parameter on the organization-level sink. To create the project-level sink:

  1. In the Google Cloud console, go to the Create logs routing sink page.

    Go to Create logs routing sink

  2. In the Sink details section, in the Sink name field, enter a name.

  3. Optional: In the Sink description field, explain the purpose of the log sink.

  4. Click Next.

  5. In the Sink destination section, in the Select sink service list, select Cloud Pub/Sub topic.

  6. In the Select a Cloud Pub/Sub topic list, select the input topic that you just created.

  7. Click Next.

  8. Optional: In the Choose logs to include in sink section, in the Build inclusion filter field, specify which logs to include in the sink by entering your logging queries.

    For example, to include only 10% of the logs with a severity level of INFO, create an inclusion filter with severity=INFO AND sample(insertId, 0.1).

    For more information, see Logging query language.

    Note: If you don't set a filter, all logs for all resources in your project, including audit logs, are routed to the destination that you create in this section.
  9. Click Next.

  10. Optional: In the Choose logs to filter out of sink (optional) section, create logging queries to specify which logs to exclude from the sink:

    1. To build an exclusion filter, click Add exclusion.
    2. In the Exclusion filter name field, enter a name.
    3. In the Build an exclusion filter field, enter a filter expression that matches the log entries that you want to exclude. You can also use the sample function to select a portion of the log entries to exclude.

      To create the sink with your new exclusion filter turned off, click Disable after you enter the expression. You can update the sink later to enable the filter.

  11. Click Create sink.
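
The sink can also be created from a script. The following is a minimal sketch with the google-cloud-logging client; the sink name, filter, and topic name are assumptions. Requesting a unique writer identity returns the service account that you grant Pub/Sub Publisher access to in the next sections.

```python
# Sketch: create a project-level sink that routes logs to the input topic.
# Sink name, inclusion filter, and topic name are assumptions.
from google.cloud import logging

project = "my-project-id"  # assumed project ID
client = logging.Client(project=project)

sink = client.sink(
    "datadog-export-sink",
    filter_="severity>=INFO",  # optional inclusion filter (assumption)
    destination=f"pubsub.googleapis.com/projects/{project}/topics/datadog-export-input",
)
sink.create(unique_writer_identity=True)

# The writer identity is the service account that needs Pub/Sub Publisher
# access on the input topic.
print("Writer identity:", sink.writer_identity)
```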

Identify writer-identity values

  1. In the Google Cloud console, go to the Log Router page.

    Go to Log Router

  2. In the Log Router Sinks section, find your log sink and then click More actions.

  3. Click View sink details.

  4. In the Writer identity row, next to serviceAccount, copy the service account ID. You use the copied service account ID value in the next section.

Add a principal value

  1. Go to the Pub/Sub Topics page.

    Go to Pub/Sub Topics

  2. Select your input topic.

  3. Click Show info panel.

  4. On the Info Panel, in the Permissions tab, click Add principal.

  5. In the Add principals section, in the New principals field, paste the Writer identity service account ID that you copied in the previous section.

  6. In the Assign roles section, in the Select a role list, point to Pub/Sub and click Pub/Sub Publisher.

  7. Click Save.
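
The same grant can be applied from a script, as in the sketch below, which uses the Pub/Sub client's IAM methods. The project ID, topic name, and writer identity value are placeholders; use the writer identity that you copied from the sink details.

```python
# Sketch: grant the sink's writer identity the Pub/Sub Publisher role on the
# input topic. Project, topic, and writer identity values are placeholders.
from google.cloud import pubsub_v1

project = "my-project-id"  # assumed project ID
writer_identity = "serviceAccount:SINK_WRITER_IDENTITY"  # paste the value from the sink details

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path(project, "datadog-export-input")

policy = publisher.get_iam_policy(request={"resource": topic_path})
policy.bindings.add(role="roles/pubsub.publisher", members=[writer_identity])
publisher.set_iam_policy(request={"resource": topic_path, "policy": policy})
```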

Create credentials and storage infrastructure

To store your Datadog API key value, create a secret in Secret Manager. This API key is used by the Dataflow pipeline to forward logs to Datadog.

  1. In the Google Cloud console, go to the Create secret page.

    Go to Create secret

  2. In the Name field, provide a name for your secret—for example, my_secret. A secret name can contain uppercase and lowercase letters, numerals, hyphens, and underscores. The maximum allowed length for a name is 255 characters.

  3. In the Secret value section, in the Secret value field, paste your Datadog API key value.

    You can find the Datadog API key value on the Datadog Organization Settings page.

  4. Click Create secret.
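
If you prefer to create the secret programmatically, the following is a minimal sketch with the google-cloud-secret-manager client. The project ID, secret ID, and API key value are placeholders.

```python
# Sketch: store the Datadog API key in Secret Manager.
# Project ID, secret ID, and key value are placeholders.
from google.cloud import secretmanager

project = "my-project-id"        # assumed project ID
secret_id = "datadog-api-key"    # assumed secret name
datadog_api_key = "<YOUR_DATADOG_API_KEY>"

client = secretmanager.SecretManagerServiceClient()

# Create the secret container with automatic replication.
secret = client.create_secret(
    request={
        "parent": f"projects/{project}",
        "secret_id": secret_id,
        "secret": {"replication": {"automatic": {}}},
    }
)

# Add the API key as the first secret version.
version = client.add_secret_version(
    request={
        "parent": secret.name,
        "payload": {"data": datadog_api_key.encode("utf-8")},
    }
)
print("Secret version resource name:", version.name)
```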

Create storage infrastructure

To stage temporary files for the Dataflow pipeline, create a Cloud Storage bucket with Uniform bucket-level access enabled:

  1. In the Google Cloud console, go to the Create a bucket page.

    Go to Create a bucket

  2. In the Get Started section, enter a globally unique, permanent name for the bucket.

  3. Click Continue.

  4. In the Choose where to store your data section, select Region, select a region for your bucket, and then click Continue.

  5. In the Choose a storage class for your data section, select Standard, and then click Continue.

  6. In the Choose how to control access to objects section, find the Access control section, select Uniform, and then click Continue.

  7. Optional: In the Choose how to protect object data section, configure additional security settings.

  8. Click Create. If prompted, leave the Enforce public access prevention on this bucket item selected.
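
The same bucket can be created from a script, as in the sketch below, using the google-cloud-storage client. The bucket name, project ID, and location are assumptions.

```python
# Sketch: staging bucket with uniform bucket-level access and public access
# prevention enforced. Bucket name, project ID, and location are assumptions.
from google.cloud import storage

client = storage.Client(project="my-project-id")  # assumed project ID

bucket = client.bucket("my-dataflow-staging-bucket")  # must be globally unique
bucket.storage_class = "STANDARD"
bucket.iam_configuration.uniform_bucket_level_access_enabled = True
bucket.iam_configuration.public_access_prevention = "enforced"

client.create_bucket(bucket, location="us-central1")
```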

Create Dataflow infrastructure

In this section, you create a custom Dataflow worker service account. This account should follow the principle of least privilege.

The default behavior for Dataflow pipeline workers is to use your project's Compute Engine default service account, which grants permissions to all resources in the project. If you are forwarding logs from a production environment, create a custom worker service account with only the necessary roles and permissions. Assign this service account to your Dataflow pipeline workers.

The following IAM roles are required for the Dataflow worker service account that you create in this section. The service account uses these IAM roles to interact with your Google Cloud resources and to forward your logs to Datadog through Dataflow.

  • Dataflow Admin and Dataflow Worker: Allow creating, running, and examining Dataflow jobs. For more information, see Roles in the Dataflow access control documentation.
  • Pub/Sub Publisher, Pub/Sub Subscriber, and Pub/Sub Viewer: Allow viewing subscriptions and topics, consuming messages from a subscription, and publishing messages to a topic. For more information, see Roles in the Pub/Sub access control documentation.
  • Secret Manager Secret Accessor: Allows accessing the payload of secrets. For more information, see Access control with IAM.
  • Storage Object Admin: Allows listing, creating, viewing, and deleting objects. For more information, see IAM roles for Cloud Storage.

Create a Dataflow worker service account

  1. In the Google Cloud console, go to the Service Accounts page.

    Go to Service Accounts

  2. In the Select a recent project section, select your project.

  3. On the Service Accounts page, click Create service account.

  4. In the Service account details section, in the Service account name field, enter a name.

  5. Click Create and continue.

  6. In the Grant this service account access to project section, add the following project-level roles to the service account:

    • Dataflow Admin
    • Dataflow Worker
  7. Click Done. The Service Accounts page appears.

  8. On the Service Accounts page, click your service account.

  9. In the Service account details section, copy the Email value. You use this value in the next section. The system uses the value to configure access to your Google Cloud resources, so that the service account can interact with them.

Provide access to the Dataflow worker service account

To let the Dataflow worker service account view and consume messages from the Pub/Sub input subscription, grant it access to the subscription:

  1. In the Google Cloud console, go to the Pub/Sub Subscriptions page.

    Go to Pub/Sub Subscriptions

  2. Select the checkbox next to your input subscription.

  3. Click Show info panel.

  4. In the Permissions tab, click Add principal.

  5. In the Add principals section, in the New principals field, paste the email of the service account that you created earlier.

  6. In the Assign roles section, assign the following resource-level roles to the service account:

    • Pub/Sub Subscriber
    • Pub/Sub Viewer
  7. Click Save.

Handle failed messages

To handle failed messages, you configure the Dataflow worker service account to send any failed messages to a dead-letter topic. To send the messages back to the primary input topic after any issues are resolved, the service account needs to view and consume messages from the dead-letter subscription.

Grant access to the service account

  1. In the Google Cloud console, go to the Pub/Sub Topics page.

    Go to Pub/Sub Topics

  2. Select the checkbox next to your input topic.

  3. Click Show info panel.

  4. In the Permissions tab, click Add principal.

  5. In the Add principals section, in the New principals field, paste the email of the service account that you created earlier.

  6. In the Assign roles section, assign the following resource-level role to the service account:

    • Pub/Sub Publisher
  7. Click Save.

Grant access to the dead-letter topic

  1. In the Google Cloud console, go to the Pub/Sub Topics page.

    Go to Pub/Sub Topics

  2. Select the checkbox next to your dead-letter topic.

  3. Click Show info panel.

  4. In the Permissions tab, click Add principal.

  5. In the Add principals section, in the New principals field, paste the email of the service account that you created earlier.

  6. In the Assign roles section, assign the following resource-level role to the service account:

    • Pub/Sub Publisher
  7. Click Save.

Grant access to the dead-letter subscription

  1. In the Google Cloud console, go to the Pub/Sub Subscriptions page.

    Go to Pub/Sub Subscriptions

  2. Select the checkbox next to your dead-letter subscription.

  3. Click Show info panel.

  4. In the Permissions tab, click Add principal.

  5. In the Add principals section, in the New principals field, paste the email of the service account that you created earlier.

  6. In the Assign roles section, assign the following resource-level roles to the service account:

    • Pub/Sub Subscriber
    • Pub/Sub Viewer
  7. Click Save.

Enable the Dataflow worker service account

To let the Dataflow worker service account access the Datadog API key secret in Secret Manager, grant the service account access to the secret:

  1. In the Google Cloud console, go to the Secret Manager page.

    Go to Secret Manager

  2. Select the checkbox next to your secret.

  3. Click Show info panel.

  4. In the Permissions tab, click Add principal.

  5. In the Add principals section, in the New principals field, paste the email of the service account that you created earlier.

  6. In the Assign roles section, assign the following resource-level role to the service account:

    • Secret Manager Secret Accessor
  7. Click Save.

Stage files to the Cloud Storage bucket

Give the Dataflow worker service account access to read and write the Dataflow job's staging files to the Cloud Storage bucket:

  1. In the Google Cloud console, go to the Buckets page.

    Go to Buckets

  2. Select the checkbox next to your bucket.

  3. Click Permissions.

  4. In the Add principals section, in the New principals field, paste the email of the service account that you created earlier.

  5. In the Assign roles section, assign the following role to the service account:

    • Storage Object Admin
  6. Click Save.
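
The bucket-level grant can also be applied from a script, as in the sketch below, using the google-cloud-storage client. The project ID, bucket name, and service account email are assumptions.

```python
# Sketch: grant the worker service account Storage Object Admin on the
# staging bucket. Project ID, bucket name, and email are assumptions.
from google.cloud import storage

client = storage.Client(project="my-project-id")        # assumed project ID
bucket = client.bucket("my-dataflow-staging-bucket")     # assumed bucket name
worker_sa = "dataflow-worker@my-project-id.iam.gserviceaccount.com"  # assumed

policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectAdmin", "members": {f"serviceAccount:{worker_sa}"}}
)
bucket.set_iam_policy(policy)
```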

Export logs with the Pub/Sub-to-Datadog pipeline

This section provides a baseline configuration for running the Pub/Sub to Datadog pipeline in a secure network with a custom Dataflow worker service account. If you expect to stream a high volume of logs, you can also configure the following parameters and features:

  • batchCount: The number of messages in each batched request to Datadog (from 10 to 1,000 messages, with a default value of 100). To ensure a timely and consistent flow of logs, a batch is sent at least every two seconds.
  • parallelism: The number of requests that are sent to Datadog in parallel, with a default value of 1 (no parallelism).
  • Horizontal Autoscaling: Enabled by default for streaming jobs that use Streaming Engine. For more information, see Streaming autoscaling.
  • User-defined functions: Optional JavaScript functions that you configure to act as extensions to the template (not enabled by default).

For the Dataflow job's URL parameter, ensure that you select the Datadog logs API URL that corresponds to your Datadog site:

Site      Logs API URL
US1       https://http-intake.logs.datadoghq.com
US3       https://http-intake.logs.us3.datadoghq.com
US5       https://http-intake.logs.us5.datadoghq.com
EU        https://http-intake.logs.datadoghq.eu
AP1       https://http-intake.logs.ap1.datadoghq.com
US1-FED   https://http-intake.logs.ddog-gov.com

Create your Dataflow job

  1. In the Google Cloud console, go to the Create job from template page.

    Go to Create job from template

  2. In the Job name field, enter a name for the job.

  3. From the Regional endpoint list, select a Dataflow endpoint.

  4. In the Dataflow template list, select Pub/Sub to Datadog. The Required Parameters section appears.

  5. Configure the Required Parameters section:

    1. In the Pub/Sub input subscription list, select the input subscription.
    2. In the Datadog Logs API URL field, enter the URL that corresponds to your Datadog site.
    3. In the Output deadletter Pub/Sub topic list, select the topic that you created to receive message failures.
  6. Configure the Streaming Engine section:

    1. In the Temporary location field, specify a path for temporary files in the storage bucket that you created for that purpose.
  7. Configure the Optional Parameters section:

    1. In the Google Cloud Secret Manager ID field, enter the resource name of the secret that you configured with your Datadog API key value.

Configure your credentials, service account, and networking parameters

  1. In the Source of the API key passed field, select SECRET_MANAGER.
  2. In the Worker region list, select the region where you created your custom VPC and subnet.
  3. In the Service account email list, select the custom Dataflow worker service account that you created for that purpose.
  4. In the Worker IP Address Configuration list, select Private.
  5. In the Subnetwork field, specify the private subnetwork that you created for the Dataflow worker VMs.

    For more information, see Guidelines for specifying a subnetwork parameter for Shared VPC.

  6. Optional: Customize other settings.

  7. Click Run job. The Dataflow service allocates resources to run the pipeline.
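
If you prefer to launch the job programmatically, the following is a minimal sketch that calls the Dataflow REST API through google-api-python-client. The template path, parameter names, and all resource names are assumptions based on the public Pub/Sub-to-Datadog template; verify them against the template documentation for your environment before use.

```python
# Sketch: launch the Pub/Sub to Datadog template via the Dataflow REST API.
# Template path, parameter names, and resource names are assumptions.
from googleapiclient.discovery import build

project = "my-project-id"   # assumed project ID
region = "us-central1"      # assumed region

dataflow = build("dataflow", "v1b3")
request = dataflow.projects().locations().templates().launch(
    projectId=project,
    location=region,
    gcsPath=f"gs://dataflow-templates-{region}/latest/Cloud_PubSub_to_Datadog",
    body={
        "jobName": "pubsub-to-datadog",
        "parameters": {
            "inputSubscription": f"projects/{project}/subscriptions/datadog-export-input-sub",
            "url": "https://http-intake.logs.datadoghq.com",  # US1 site (assumption)
            "outputDeadletterTopic": f"projects/{project}/topics/datadog-export-deadletter",
            "apiKeySource": "SECRET_MANAGER",
            "apiKeySecretId": f"projects/{project}/secrets/datadog-api-key/versions/1",
        },
        "environment": {
            "serviceAccountEmail": f"dataflow-worker@{project}.iam.gserviceaccount.com",
            "subnetwork": f"regions/{region}/subnetworks/datadog-export-subnet",
            "ipConfiguration": "WORKER_IP_PRIVATE",
        },
    },
)
response = request.execute()
print("Launched job:", response["job"]["id"])
```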

Validate that Datadog Log Explorer received logs

Open the Datadog Log Explorer, and ensure that the timeframe is expanded to encompass the timestamp of the logs. To validate that Datadog Log Explorer received logs, search for logs with the gcp.dataflow.step source attribute, or any other log attribute.

  • Validate that Datadog Log Explorer received logs from Google Cloud:

    Source:gcp.dataflow.step

    The output will display all of the Datadog log messages that you forwarded from the dead-letter topic to the primary log forwarding pipeline.

For more information, see Search logs in the Datadog documentation.
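
You can also run the same check from a script. The following is a minimal sketch against the Datadog Logs Search API (v2); it assumes the US1 site, and the API and application keys are placeholders.

```python
# Sketch: search recent logs with source gcp.dataflow.step via the Datadog
# Logs Search API v2. Assumes the US1 site; keys are placeholders.
import requests

resp = requests.post(
    "https://api.datadoghq.com/api/v2/logs/events/search",
    headers={
        "DD-API-KEY": "<YOUR_DATADOG_API_KEY>",
        "DD-APPLICATION-KEY": "<YOUR_DATADOG_APP_KEY>",
        "Content-Type": "application/json",
    },
    json={
        "filter": {
            "query": "source:gcp.dataflow.step",
            "from": "now-15m",
            "to": "now",
        },
        "page": {"limit": 5},
    },
)
resp.raise_for_status()
for event in resp.json().get("data", []):
    print(event["attributes"].get("message"))
```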

Manage delivery errors

Log file delivery from the Dataflow pipeline that streams Google Cloud logs to Datadog can fail occasionally. Delivery errors can be caused by:

  • 4xx errors from the Datadog logs endpoint (related to authentication or network issues).
  • 5xx errors caused by server issues at the destination.

Manage 401 and 403 errors

If you encounter a 401 error or a 403 error, you must replace the primary log-forwarding job with a replacement job that has a valid API key value. You must then clear the messages generated by those errors from the dead-letter topic. To clear the error messages, follow the steps in the Troubleshoot failed messages section.

For more information about replacing the primary log-forwarding job with a replacement job, see Launch a replacement job.

Manage other 4xx errors

To resolve all other 4xx errors, follow the steps in the Troubleshoot failed messages section.

Manage 5xx errors

For 5xx errors, delivery is automatically retried with exponential backoff, for a maximum of 15 minutes. This automatic process might not resolve all errors. To clear any remaining 5xx errors, follow the steps in the Troubleshoot failed messages section.

Troubleshoot failed messages

When you see failed messages in the dead-letter topic, examine them. To resolve the errors, and to forward the messages from the dead-letter topic to the primary log-forwarding pipeline, complete all of the following subsections in order.

Review your dead-letter subscription

  1. In the Google Cloud console, go to the Pub/Sub Subscriptions page.

    Go to Pub/Sub Subscriptions

  2. Click the subscription ID of the dead-letter subscription that you created.

  3. Click the Messages tab.

  4. To view the messages, leave the Enable ack messages checkbox cleared and click Pull.

  5. Inspect the failed messages and resolve any issues.
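
You can also inspect the dead-letter messages from a script. The following is a minimal sketch that pulls a few messages without acknowledging them, using the google-cloud-pubsub client; the project ID and subscription name are assumptions.

```python
# Sketch: pull (without acknowledging) messages from the dead-letter
# subscription to inspect delivery errors. Names are assumptions.
from google.cloud import pubsub_v1

project = "my-project-id"  # assumed project ID
subscriber = pubsub_v1.SubscriberClient()
sub_path = subscriber.subscription_path(project, "datadog-export-deadletter-sub")

response = subscriber.pull(request={"subscription": sub_path, "max_messages": 10})
for received in response.received_messages:
    # Not acknowledging keeps the messages available for later redelivery.
    print(received.message.data.decode("utf-8"))
```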

Reprocess dead-letter messages

To reprocess dead-letter messages, first create a Dataflow job and then configure its parameters.

Create your Dataflow job

  1. In the Google Cloud console, go to the Create job from template page.

    Go to Create job from template

  2. Give the job a name and specify the regional endpoint.

Configure your messaging and storage parameters

  1. In the Create job from template page, in the Dataflow template list, select the Pub/Sub to Pub/Sub template.
  2. In the Source section, in the Pub/Sub input subscription list, select your dead-letter subscription.
  3. In the Target section, in the Output Pub/Sub topic list, select the primary input topic.
  4. In the Streaming Engine section, in the Temporary location field, specify a path and filename prefix for temporary files in the storage bucket that you created for that purpose. For example, gs://my-bucket/temp.

Configure your networking and service account parameters

  1. In the Create job from template page, find the Worker region list and select the region where you created your custom VPC and subnet.
  2. In the Service Account email list, select the custom Dataflow worker service account email address that you created for that purpose.
  3. In the Worker IP Address Configuration list, select Private.
  4. In the Subnetwork field, specify the private subnetwork that you created for the Dataflow worker VMs.

    For more information, see Guidelines for specifying a subnetwork parameter for Shared VPC.

  5. Optional: Customize other settings.

  6. Click Run job.

Confirm the dead-letter subscription is empty

Confirming that the dead-letter subscription is empty helps ensure that you have forwarded all messages from that Pub/Sub subscription to the primary input topic.

  1. In the Google Cloud console, go to the Pub/Sub Subscriptions page.

    Go to Pub/Sub Subscriptions

  2. Click the subscription ID of the dead-letter subscription that you created.

  3. Click the Messages tab.

  4. Confirm that there are no more unacknowledged messages through the Pub/Sub subscription metrics.

For more information, see Monitor message backlog.
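
If you want to check the backlog from a script, the following is a minimal sketch that reads the num_undelivered_messages metric with the Cloud Monitoring Python client. The project ID and subscription ID are assumptions.

```python
# Sketch: read the unacknowledged-message count for the dead-letter
# subscription over the last 10 minutes. Names are assumptions.
import time
from google.cloud import monitoring_v3

project = "my-project-id"                        # assumed project ID
subscription_id = "datadog-export-deadletter-sub"  # assumed subscription ID

client = monitoring_v3.MetricServiceClient()
now = int(time.time())
interval = monitoring_v3.TimeInterval(
    {"end_time": {"seconds": now}, "start_time": {"seconds": now - 600}}
)

results = client.list_time_series(
    request={
        "name": f"projects/{project}",
        "filter": (
            'metric.type="pubsub.googleapis.com/subscription/num_undelivered_messages" '
            f'AND resource.labels.subscription_id="{subscription_id}"'
        ),
        "interval": interval,
        "view": monitoring_v3.ListTimeSeriesRequest.TimeSeriesView.FULL,
    }
)
for series in results:
    for point in series.points:
        print(point.interval.end_time, point.value.int64_value)
```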

Drain the backup Dataflow job

After you have resolved the errors, and the messages in the dead-letter topic have returned to the log-forwarding pipeline, follow these steps to stop running the Pub/Sub to Pub/Sub template.

Draining the backup Dataflow job ensures that the Dataflow service finishes processing the buffered data while also blocking the ingestion of new data.

  1. In the Google Cloud console, go to the Dataflow jobs page.

    Go to Dataflow jobs

  2. Select the job that you want to stop. The Stop Jobs window appears. To stop a job, the status of the job must be running.

  3. Select Drain.

  4. Click Stop job.

Clean up

If you don't plan to continue using the Google Cloud and Datadog resources deployed in this reference architecture, delete them to avoid incurring additional costs. There are no Datadog resources for you to delete.

Delete the project

    Caution: Deleting a project has the following effects:
    • Everything in the project is deleted. If you used an existing project for the tasks in this document, when you delete it, you also delete any other work you've done in the project.
    • Custom project IDs are lost. When you created this project, you might have created a custom project ID that you want to use in the future. To preserve the URLs that use the project ID, such as an appspot.com URL, delete selected resources inside the project instead of deleting the whole project.

    If you plan to explore multiple architectures, tutorials, or quickstarts, reusing projects can help you avoid exceeding project quota limits.

  1. In the Google Cloud console, go to the Manage resources page.

    Go to Manage resources

  2. In the project list, select the project that you want to delete, and then click Delete.
  3. In the dialog, type the project ID, and then click Shut down to delete the project.

What's next

Contributors

Authors:

  • Ashraf Hanafy | Senior Software Engineer for Google Cloud Integrations, Datadog
  • Daniel Trujillo | Engineering Manager, Google Cloud Integrations, Datadog
  • Bryce Eadie | Technical Writer, Datadog
  • Sriram Raman | Senior Product Manager, Google Cloud Integrations, Datadog

Other contributors:
