NerdGraph tutorial: Stream your data to an AWS Kinesis Firehose, Azure Event Hub, or GCP Pub/Sub

With the streaming export feature, available through Data Plus, you can send your data to AWS Kinesis Firehose, Azure Event Hub, or GCP Pub/Sub by creating custom rules using NRQL to specify which data should be exported. This guide explains how to create and update streaming rules using NerdGraph and how to view existing rules. You can use the NerdGraph explorer to make these calls. Additionally, you have the option to compress the data before exporting with the Export compression feature.

Here are some examples of how you can use the streaming export feature:

  • Populate a data lake
  • Enhance AI/ML training
  • Ensure long-term retention for compliance, legal, or security reasons

You can enable or disable streaming export rules at any time. However, be aware that streaming export only processes currently ingested data. If you disable and later re-enable the feature, any data ingested while it was off will not be exported. To export past data, use the Historical data export feature.

Requirements and limits

Limits on streamed data: The amount of data you can stream per month is limited by your total ingested data per month. If your streaming data amount exceeds your ingested data amount, we may suspend your access to and use of streaming export.

Permissions-related requirements:

You must have an AWS Kinesis Firehose, Azure Event Hub, or GCP Pub/Sub set up to receive New Relic data. If you haven't already done this, you can follow our steps below for AWS, Azure, or GCP Pub/Sub.

NRQL requirements:

  • Queries must be flat, with no aggregation. For example, SELECT * and SELECT column1, column2 forms are supported.
  • Any WHERE clause is supported, except subqueries.
  • Queries cannot include a FACET clause, COMPARE WITH, or LOOKUP.
  • Nested queries are not supported.
  • Streaming export supports data types stored in NRDB, but not metric timeslice data.
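For example, under the rules above, queries like these are supported and unsupported as follows (event and attribute names are illustrative):

```sql
-- Supported: flat queries, optionally filtered with WHERE
SELECT * FROM Log WHERE level = 'error'
SELECT appName, duration FROM Transaction WHERE duration > 1

-- Not supported: aggregation and FACET (also COMPARE WITH, LOOKUP, subqueries)
SELECT count(*) FROM Transaction FACET appName
```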

Prerequisites

Set up an AWS Kinesis Firehose

To set up streaming data export to AWS, you must first set up Amazon Kinesis Firehose. Follow these steps:

1

Create a Firehose for streaming export

Create a dedicated Firehose to stream your New Relic data to:

  1. Go to Amazon Kinesis Data Firehose.
  2. Create a delivery stream.
  3. Name the stream. You'll use this name later when registering the rule.
  4. Use Direct PUT or other sources and specify a destination compatible with New Relic's JSON event format (for example, S3, Redshift, or OpenSearch).
2

Create IAM Firehose write access policy

  1. Go to the IAM console and sign in with your user.
  2. In the left navigation, click Policies, and then click Create policy.
  3. Select the Firehose service, and then select the PutRecord and PutRecordBatch actions.
  4. For Resources, select the delivery stream, click Add ARN, and select the region of your stream.
  5. Enter your AWS account number, and then enter the name of your delivery stream in the name box.
  6. Create the policy.
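The resulting policy document should look roughly like this (the account ID, region, and stream name below are placeholders for your own values):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["firehose:PutRecord", "firehose:PutRecordBatch"],
      "Resource": "arn:aws:firehose:us-east-1:123456789012:deliverystream/my-newrelic-export-stream"
    }
  ]
}
```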
3

Create IAM role for granting New Relic write access

To set up the IAM role:

  1. Navigate to IAM and click Roles.
  2. Create a role, and select the For another AWS account option.
  3. Enter the New Relic export account ID: 8886xx727xx.
  4. Select Require external ID and enter the account ID of the New Relic account you want to export from.
  5. Click Permissions, and then select the policy you created above.
  6. Add a role name (you'll use it during export registration) and a description.
  7. Create the role.
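The role's trust policy should end up roughly like the following, where the principal is the New Relic export account ID from step 3 and the external ID is your own New Relic account ID (both shown as placeholders here):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<New Relic export account ID>:root" },
      "Action": "sts:AssumeRole",
      "Condition": { "StringEquals": { "sts:ExternalId": "<your New Relic account ID>" } }
    }
  ]
}
```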

When you're done with these steps, you can set up your export rules using NerdGraph.
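As a sketch of what a create-rule call for AWS can look like (field names and values here are illustrative; verify the exact schema in the NerdGraph explorer before using it):

```graphql
mutation {
  streamingExportCreateRule(
    accountId: 1234567
    ruleParameters: {
      name: "My AWS export rule"
      description: "Exports error logs"
      nrql: "SELECT * FROM Log WHERE level = 'error'"
    }
    awsParameters: {
      awsAccountId: "123456789012"
      deliveryStreamName: "my-newrelic-export-stream"
      region: "us-east-1"
      role: "my-firehose-role"
    }
  ) {
    id
    status
  }
}
```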

Set up an Azure Event Hub

To set up streaming data export to Azure, you must first set up an Event Hub. Follow these steps:

Alternatively, you can follow the Azure guide here.

1

Create an Event Hubs namespace

  1. From your Microsoft Azure account, navigate to Event Hubs.
  2. Follow the steps to create an Event Hubs namespace. We recommend enabling auto-inflate to ensure you receive all of your data.
  3. Ensure public access is enabled, as we will use a Shared Access Policy to securely authenticate with your Event Hub.
  4. Once your Event Hubs namespace is deployed, click Go to resource.
2

Create an Event Hub

  1. In the left column, click Event Hubs.

  2. To create an Event Hub, click + Event Hub.

  3. Enter the desired Event Hub name. Save this, as you need it later to create the streaming export rule.

  4. For Retention, select the Delete cleanup policy and the desired Retention time (hrs).

    Important

    Streaming export is currently not supported for Event Hubs with the Compact retention policy.

  5. Once the Event Hub is created, click the Event Hub.

3

Create and attach a shared access policy

  1. In the left column, go to Shared access policies.
  2. Click + Add near the top of the page.
  3. Choose a name for your shared access policy.
  4. Check Send, and click Create.
  5. Click the created policy, and copy the Connection string–primary key. Save this, as you need it later to authenticate and send data to your Event Hub.

When you're done with these steps, you can set up your export rules using NerdGraph.
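For Azure, a create-rule call can look roughly like the following sketch, with the Event Hub name and connection string you saved above (field names and values here are illustrative; verify the exact schema in the NerdGraph explorer):

```graphql
mutation {
  streamingExportCreateRule(
    accountId: 1234567
    ruleParameters: {
      name: "My Azure export rule"
      description: "Exports error logs"
      nrql: "SELECT * FROM Log WHERE level = 'error'"
    }
    azureParameters: {
      eventHubConnectionString: "<your Connection string–primary key>"
      eventHubName: "my-event-hub"
    }
  ) {
    id
    status
  }
}
```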

Set up a GCP Pub/Sub

To set up streaming data export to GCP, you must first set up a Pub/Sub. Follow these steps:

1

Create a Pub/Sub topic

  1. From your GCP Console, navigate to the Pub/Sub page.
  2. Click Create topic.
  3. Enter a topic ID and click Create.
2

Set up permissions on Pub/Sub

  1. In the right column of the created topic, click More actions.
  2. Select View permissions.
  3. Click Add Principal, and in the new principals box, enter the service account email provided by us:
    • US region: us-prod-uds-streaming-export@h0c17c65df9291b526b433650e6a0a.iam.gserviceaccount.com
    • EU region: eu-prod-uds-streaming-export@h0c17c65df9291b526b433650e6a0a.iam.gserviceaccount.com
  4. Under the Assign roles section, select the Pub/Sub Publisher role, and click Save.
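The same setup can be done from the command line with the gcloud CLI. A sketch, where the topic name is a placeholder and the service account email is the one for your region from the list above:

```shell
# Create the topic
gcloud pubsub topics create my-newrelic-export-topic

# Grant the New Relic service account the Pub/Sub Publisher role on the topic
gcloud pubsub topics add-iam-policy-binding my-newrelic-export-topic \
  --member="serviceAccount:<New Relic service account email>" \
  --role="roles/pubsub.publisher"
```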

When you're done with these steps, you can set up your export rules using NerdGraph.

Understand export compression

You can choose to compress data before exporting it. This feature is off by default. Compression can help you avoid exceeding your data limit and lower your outbound data transfer costs.

You can enable compression using the payloadCompression field under ruleParameters. This field can be any of the following values:

  • DISABLED: Payloads are not compressed before being exported. If unspecified, payloadCompression defaults to this value.
  • GZIP: Payloads are compressed with the GZIP format before exporting.

GZIP is the only compression format currently available, though we may choose to make more formats available in the future.
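For example, the ruleParameters portion of a create or update call would include the field like this (other fields elided; names are illustrative, so verify them in the NerdGraph explorer):

```graphql
ruleParameters: {
  name: "My export rule"
  nrql: "SELECT * FROM Log"
  payloadCompression: GZIP
}
```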

When compression is enabled on an existing AWS export rule, the next message from Kinesis Firehose may contain both compressed and uncompressed data. This is due to buffering within Kinesis Firehose. To avoid this, you can temporarily disable the export rule before enabling compression, or create a new Kinesis Firehose stream dedicated to the compressed data.

If you do encounter this issue and you're exporting to S3 or another file storage system, you can view the compressed part of the data by following these steps:

  1. Manually download the object.
  2. Split the object into two files by copying the compressed data into a new file.
  3. Decompress the new, compressed-only data file.

Once you have the compressed data, you can re-upload it to S3 (or whatever other service you're using) and delete the old file.
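Step 2 above can be scripted by searching for the GZIP magic bytes (0x1f 0x8b), which mark where the compressed portion begins. A minimal sketch, assuming the object is uncompressed JSON text followed by GZIP data (plain JSON text cannot contain these byte values, so the first match is the boundary):

```python
import gzip

GZIP_MAGIC = b"\x1f\x8b"

def split_mixed_object(blob: bytes) -> tuple[bytes, bytes]:
    """Split a downloaded object into its uncompressed prefix and GZIP suffix."""
    idx = blob.find(GZIP_MAGIC)
    if idx == -1:
        return blob, b""  # no compressed portion found
    return blob[:idx], blob[idx:]

# Example: an object containing plain JSON lines followed by compressed data
plain = b'{"event": "uncompressed"}\n'
compressed = gzip.compress(b'{"event": "compressed"}\n')
prefix, suffix = split_mixed_object(plain + compressed)

assert prefix == plain
assert gzip.decompress(suffix) == b'{"event": "compressed"}\n'
```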

Be aware that in S3 or other file storage systems, objects may consist of multiple GZIP-encoded payloads appended consecutively. Your decompression library must therefore be able to handle such concatenated GZIP payloads.
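For instance, Python's gzip module reads through every member of a concatenated stream transparently, so it handles this case without extra work. A quick check:

```python
import gzip

# Two payloads compressed independently, then appended consecutively,
# as can happen within a single exported object
part1 = gzip.compress(b'{"event": 1}\n')
part2 = gzip.compress(b'{"event": 2}\n')
concatenated = part1 + part2

# gzip.decompress reads every member, returning both payloads
data = gzip.decompress(concatenated)
print(data)  # b'{"event": 1}\n{"event": 2}\n'
```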

Automatic decompression in AWS

Once your data has arrived in AWS, you may want options to automatically decompress it. If you're streaming that data to an S3 bucket, there are ways to enable automatic decompression.

Automatic decompression in Azure

If you're exporting data to Azure, you can view decompressed versions of the objects stored in your Event Hub using a Stream Analytics Job. To do so, follow these steps:

  1. Follow this guide up to step 16.

    • In step 13, you may choose to use the same Event Hub as the output, but this is not recommended if you intend to proceed to step 17 and start the job, as this approach has not been tested.
  2. In the left pane of your Stream Analytics Job, click Inputs, then click the input you set up.

  3. Scroll down to the bottom of the pane that appears on the right, and configure the input with these settings:

    • Event serialization format: JSON
    • Encoding: UTF-8
    • Event compression type: GZip
  4. Click Save at the bottom of the pane.

  5. Click Query on the side of the screen. Using the Input preview tab, you should now be able to query the Event Hub from this screen.

Automatic decompression in GCP

In GCP Cloud Storage, objects automatically decompress when downloaded if the metadata is set to Content-Encoding: gzip. For more details, check the GCP documentation.
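For example, you can set this metadata with the gsutil CLI (the bucket and object names below are placeholders):

```shell
gsutil setmeta -h "Content-Encoding:gzip" gs://my-bucket/my-exported-object
```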
