Exporting data from Analytics

This page applies to Apigee and Apigee hybrid.


Apigee Analytics collects and analyzes a broad spectrum of data that flows across your APIs and provides visualization tools, including interactive dashboards, custom reports, and other tools that identify trends in API proxy performance.

Now, you can unlock this rich content by exporting analytics data from Apigee Analytics to your own data repository, such as Google Cloud Storage or Google BigQuery. You can then take advantage of the powerful query and machine learning capabilities offered by Google BigQuery and TensorFlow to perform your own data analysis. You can also combine the exported analytics data with other data, such as web logs, to gain new insights into your users, APIs, and applications.

Note: Free and trial accounts cannot export analytics data. For more information, see Apigee pricing plans.

If you are a Pay-as-you-go customer, you must enable the Apigee API Analytics add-on in your eligible environments to use this API. If the Apigee API Analytics add-on is not enabled, or was disabled more than 30 days before you use the API, you will receive an error indicating that you cannot access the API. For more information, see Manage the Apigee API Analytics add-on.

What export data formats are supported?

Export analytics data to one of the following formats:

  • Comma-separated values (CSV)

    The default delimiter is a comma (,) character. Supported delimiter characters include comma (,), pipe (|), and tab (\t). Configure the delimiter using the csvDelimiter property, as described in the Export request property reference.

  • JSON (newline delimited)

    Allows the newline character to be used as a delimiter.

The exported data includes all the analytics metrics and dimensions built into Apigee, and any custom analytics data that you add. For a description of the exported data, see the Analytics metrics, dimensions, and filters reference.
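For instance, a newline-delimited JSON export can be consumed one record per line. The sketch below parses such output; the field names in the sample records are hypothetical, not the actual exported schema:

```python
import json

def parse_ndjson(lines):
    """Parse newline-delimited JSON (one record per line), skipping blanks."""
    return [json.loads(line) for line in lines if line.strip()]

# Hypothetical sample of two exported records (field names illustrative only):
sample = [
    '{"apiproxy": "weather", "response_status_code": 200}',
    '{"apiproxy": "weather", "response_status_code": 404}',
]
records = parse_ndjson(sample)
print(len(records))            # 2
print(records[0]["apiproxy"])  # weather
```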

You can export analytics data to the following data repositories: Google Cloud Storage and Google BigQuery.

Steps to export your analytics data

The following steps summarize the process used to export your analytics data:

Note: You must have Apigee Analytics Editor and Apigee Org Administrator permissions to manage datastores and export data. See Understanding roles.

  1. Configure your data repository (Cloud Storage or BigQuery) for data export. You must ensure that your data repository has been configured correctly, and that the Apigee Service Agent service account used to write data to the data repository has the correct permissions.
  2. Create a datastore that defines the properties of the data repository (Cloud Storage or BigQuery) where you export your data.
  3. Export your analytics data. The data export runs asynchronously in the background.
  4. View the status of the export request to determine when the export completes.
  5. When the export is complete, access the exported data in your data repository.

The following sections describe these steps in more detail.
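As a rough sketch of the export step in this workflow, the following helper builds the URL and request body for the export call (the org, environment, and datastore names are placeholders):

```python
# Sketch: assemble the URL and body for an analytics export request.
# All names here (myorg, test, datastore display name) are placeholders.

BASE = "https://apigee.googleapis.com/v1"

def export_request(org, env, name, start, end, datastore, fmt="json"):
    """Return the (url, body) pair for a POST to the exports API."""
    url = f"{BASE}/organizations/{org}/environments/{env}/analytics/exports"
    body = {
        "name": name,
        "dateRange": {"start": start, "end": end},
        "outputFormat": fmt,
        "datastoreName": datastore,
    }
    return url, body

url, body = export_request("myorg", "test", "Nightly export",
                           "2020-06-08", "2020-06-09",
                           "My Cloud Storage datastore")
print(url)
```

Real code would then POST the body with an Authorization: Bearer header, as the curl examples below show.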

Configuring your data repository

Configure Cloud Storage or BigQuery to enable access by analytics data export.

Configuring Google Cloud Storage

Before you can export data to Google Cloud Storage, you need to do the following:

Configuring Google BigQuery

Note: Analytics data for the US or EU is stored in either the US or EU multi-region. If you export the data to BigQuery, the Data location for the BigQuery dataset must be the same multi-region.

You cannot export the data directly to an individual region in the US or EU. See Exporting data to BigQuery for an individual region in the US or EU for a workaround.

Before you can export data to Google BigQuery:

Exporting data to BigQuery for an individual region in the US or EU

Since analytics data for the US or EU is stored in either the US or EU multi-region, you cannot export the data directly to an individual US or EU region in BigQuery. As a workaround, you can first export the data to Google Cloud Storage, and then transfer it to BigQuery as follows:

  1. Create a Cloud Storage bucket, and set Location to the individual region in the US or EU that you want associated with your data in BigQuery.
  2. Create a Cloud Storage datastore, using the storage bucket created in the previous step.
  3. Export the data to Cloud Storage. See Example 1: Export data to Cloud Storage below for an example.
  4. Load the data to BigQuery, as described in the following sections:
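As a sketch of the final load step, the following builds a bq CLI command that loads newline-delimited JSON files from a Cloud Storage bucket into a BigQuery table. It assumes the Google Cloud SDK is installed and that the dataset already exists in the desired region; the dataset, table, and bucket names are placeholders:

```python
# Sketch: construct a "bq load" command for exported newline-delimited JSON.
# Assumes the Google Cloud SDK (bq CLI) is installed; names are placeholders.

def bq_load_command(dataset, table, gcs_uri):
    return [
        "bq", "load",
        "--source_format=NEWLINE_DELIMITED_JSON",
        "--autodetect",
        f"{dataset}.{table}",
        gcs_uri,
    ]

cmd = bq_load_command("my_dataset", "apigee_export",
                      "gs://my-bucket/my/analytics/path/*.json")
print(" ".join(cmd))
# To execute: subprocess.run(cmd, check=True)
```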

Managing datastores

The datastore defines the connection to your export data repository (Cloud Storage or BigQuery).

The following sections describe how to create and manage your datastores. Before you create a datastore, it is recommended that you test the data repository configuration.

Testing the data repository configuration

When you create a datastore, Apigee does not test or validate your data repository configuration. That means you can create the datastore (in the next step) and not detect any errors until you run your first data export.

Because a data export can take a long time to run, you can detect errors sooner by testing the data repository configuration and fixing any errors before you create the datastore.

To test the data repository configuration, issue a POST request to the /organizations/{org}/analytics/datastores:test API. Pass the following information in the request body:

Note: This is the same information that you will pass when you create a datastore.

For example, the following tests a Cloud Storage data repository configuration:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores:test" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d \
  '{
    "displayName": "My Cloud Storage datastore",
    "targetType": "gcs",
    "datastoreConfig": {
      "projectId": "my-project",
      "bucketName": "my-bucket",
      "path": "my/analytics/path"
    }
  }'

The following provides an example of the response if the test is successful:

{
  "state": "completed"
}

The following provides an example of the response if the test failed:

{
  "state": "failed",
  "error": "<error message>"
}

In this case, address the issues raised in the error message and re-test the data repository configuration. After a successful test, create the datastore, as described in the next section.

Creating a datastore

To create a datastore, issue a POST request to the /organizations/{org}/analytics/datastores API. Pass the following information in the request body:

Examples are provided below for each datastore type.

The following provides an example of the response for a Cloud Storage data repository:

{
  "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
  "displayName": "My Cloud Storage datastore",
  "org": "myorg",
  "targetType": "gcs",
  "createTime": "1535411583949",
  "lastUpdateTime": "1535411634291",
  "datastoreConfig": {
    "projectId": "my-project",
    "bucketName": "my-bucket",
    "path": "my/analytics/path"
  }
}

Use the URL returned in the self property to view the datastore details, as described in Viewing the details for a datastore.

For more information, see the Create data store API.

Example 1: Create a Cloud Storage datastore

The following request creates a Cloud Storage datastore:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d \
  '{
    "displayName": "My Cloud Storage datastore",
    "targetType": "gcs",
    "datastoreConfig": {
      "projectId": "my-project",
      "bucketName": "my-bucket",
      "path": "my/analytics/path"
    }
  }'

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

Example 2: Create a BigQuery datastore

The following request creates a BigQuery datastore:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d \
  '{
    "displayName": "My BigQuery datastore",
    "targetType": "bigquery",
    "datastoreConfig": {
      "projectId": "my-project",
      "datasetName": "mybigquery",
      "tablePrefix": "bqprefix"
    }
  }'

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

Viewing all datastores

To view all datastores for your organization, issue a GET request to the /organizations/{org}/analytics/datastores API.

For example:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

The following provides an example of the response:

{
  "datastores": [
    {
      "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
      "displayName": "My Cloud Storage datastore",
      "org": "myorg",
      "targetType": "gcs",
      "createTime": "1535411583949",
      "lastUpdateTime": "1535411634291",
      "datastoreConfig": {
        "projectId": "my-project",
        "bucketName": "my-bucket",
        "path": "my/analytics/path"
      }
    },
    {
      "self": "/organizations/myorg/analytics/datastores/g8c3f0mk-1f78-8837-9c67-k222b60ce30b",
      "displayName": "My BigQuery datastore",
      "org": "myorg",
      "targetType": "bigquery",
      "createTime": "1535411583949",
      "lastUpdateTime": "1535411634291",
      "datastoreConfig": {
        "projectId": "my-project",
        "datasetName": "mybigquery",
        "tablePrefix": "bqprefix"
      }
    }
  ]
}

For more information, see the List data stores API.

Viewing the details for a datastore

To view the details for a datastore, issue a GET request to the /organizations/{org}/analytics/datastores/{datastore} API.

For example:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

The following provides an example of the response for a Cloud Storage datastore:

{
  "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
  "displayName": "My Cloud Storage datastore",
  "org": "myorg",
  "targetType": "gcs",
  "createTime": "1535411583949",
  "lastUpdateTime": "1535411634291",
  "datastoreConfig": {
    "projectId": "my-project",
    "bucketName": "my-bucket",
    "path": "my/analytics/path"
  }
}

For more information, see the Get data store API.

Modifying a datastore

To modify a datastore, issue a PUT request to the /organizations/{org}/analytics/datastores/{datastore} API. Pass all or a subset of the following information in the request body:

For example, to update a Cloud Storage datastore:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b" \
  -X PUT \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d \
  '{
    "displayName": "My Cloud Storage datastore",
    "datastoreConfig": {
      "projectId": "my-project",
      "bucketName": "my-bucket",
      "path": "my/analytics/path"
    }
  }'

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

The following provides an example of the response for a Cloud Storage datastore:

{
  "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
  "displayName": "My Cloud Storage datastore",
  "org": "myorg",
  "targetType": "gcs",
  "createTime": "1535411583949",
  "lastUpdateTime": "1535411634291",
  "datastoreConfig": {
    "projectId": "my-project",
    "bucketName": "my-bucket",
    "path": "my/analytics/path"
  }
}

For more information, see the Update data store API.

Deleting a datastore

To delete a datastore, issue a DELETE request to the /organizations/{org}/analytics/datastores/{datastore} API.

For example:

curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b" \
  -X DELETE \
  -H "Authorization: Bearer $TOKEN"

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

The following provides an example of the response:

{}

For more information, see the Delete data store API.

Exporting analytics data

To export analytics data, issue a POST request to the /organizations/{org}/environments/{env}/analytics/exports API. Pass the following information in the request body:

  • Name and description of the export request
  • Date range of exported data (value can only span one day)
  • Format of exported data
  • Datastore name
Note: You must be an organization administrator to use the API.

Examples of export requests are provided below. For a complete description of the request body properties, see the Export request property reference.

The response from the POST is in the form:

{
  "self": "/organizations/myorg/environments/test/analytics/exports/a7c2f0dd-1b53-4917-9c42-a211b60ce35b",
  "created": "2017-09-28T12:39:35Z",
  "state": "enqueued"
}

Note that the state property in the response is set to enqueued. The POST request works asynchronously: it continues to run in the background after returning a response. Possible values for state are enqueued, running, completed, and failed.

Use the URL returned in the self property to view the status of the data export request, as described in Viewing the status of an analytics export request. When the request completes, the value of the state property in the response is set to completed. You can then access the analytics data in your datastore.
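A simple way to wait for completion is to poll until state reaches a terminal value. The sketch below does this with a generic get_status callable, which in real code would issue a GET on the URL from the self property; the simulated status sequence is for illustration only:

```python
import time

TERMINAL_STATES = {"completed", "failed"}

def wait_for_export(get_status, poll_seconds=30, max_polls=120):
    """Poll get_status() until the export reaches a terminal state.

    get_status is any callable returning the export status JSON, e.g.
    a GET on the URL from the response's "self" property.
    """
    for _ in range(max_polls):
        status = get_status()
        if status.get("state") in TERMINAL_STATES:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("export did not finish in time")

# Simulated status sequence for illustration:
states = iter([{"state": "enqueued"}, {"state": "running"},
               {"state": "completed"}])
result = wait_for_export(lambda: next(states), poll_seconds=0)
print(result["state"])  # completed
```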

For more information, see the Create data export API.

Example 1: Export data to Cloud Storage

The following example exports a complete set of raw data for the last 24 hours from the test environment in the myorg organization. The content is exported to Cloud Storage in JSON format:

curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d \
  '{
    "name": "Export raw results to Cloud Storage",
    "description": "Export raw results to Cloud Storage for last 24 hours",
    "dateRange": {
      "start": "2020-06-08",
      "end": "2020-06-09"
    },
    "outputFormat": "json",
    "datastoreName": "My Cloud Storage data repository"
  }'

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

Use the URI specified by the self property to monitor the job status, as described in Viewing the status of an analytics export request.

Example 2: Export data to BigQuery

Note: The Data location for the BigQuery dataset must be the same as the analytics_location. For analytics regions in the US or EU, see Configuring Google BigQuery.

Note: Exporting more than 100 GB of data to BigQuery may take 3 to 4 minutes to complete.

The following example exports a comma-delimited CSV file to BigQuery:

curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d \
  '{
    "name": "Export query results to BigQuery",
    "description": "One-time export to BigQuery",
    "dateRange": {
      "start": "2018-06-08",
      "end": "2018-06-09"
    },
    "outputFormat": "csv",
    "csvDelimiter": ",",
    "datastoreName": "My BigQuery data repository"
  }'

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

Note: Exporting a CSV file creates a BigQuery table whose name has the following form:

<PREFIX>_<EXPORT_DATE>_api_<UUID>_from_<FROM_DATE>_to_<TO_DATE>

Use the URI specified by the self property to monitor the job status, as described in Viewing the status of an analytics export request.
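If you need to find the tables a CSV export created, you can match the naming pattern above. The regular expression below is a sketch; the exact date and UUID formats Apigee uses are assumptions here:

```python
import re

# Sketch: match BigQuery table names of the form
# <PREFIX>_<EXPORT_DATE>_api_<UUID>_from_<FROM_DATE>_to_<TO_DATE>.
# The date (YYYYMMDD) and UUID formats are assumptions for illustration.

TABLE_RE = re.compile(
    r"^(?P<prefix>\w+?)"
    r"_(?P<export_date>\d{8})"
    r"_api_(?P<uuid>[0-9a-f-]+)"
    r"_from_(?P<from_date>\d{8})"
    r"_to_(?P<to_date>\d{8})$"
)

m = TABLE_RE.match(
    "bqprefix_20180609_api_1234abcd-00ff_from_20180608_to_20180609")
print(m.group("prefix"))  # bqprefix
```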

About export API quotas

To prevent overuse of expensive data export API calls, Apigee enforces a quota of 15 calls per day, per organization, on the organizations/{org}/environments/{env}/analytics/exports API.

If you exceed the call quota, the API returns an HTTP 429 response.
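Client code can treat a 429 as a signal to back off and retry later. A minimal sketch follows; the call function stands in for the real HTTP request, and the simulated responses and backoff values are illustrative:

```python
import time

# Sketch: retry an export call when the daily quota returns HTTP 429.
# call() is any callable returning (status_code, body); real code would
# wrap urllib or another HTTP client.

def post_with_retry(call, retries=3, backoff=60):
    """Retry call() on HTTP 429, with a linearly increasing backoff."""
    for attempt in range(retries):
        status, body = call()
        if status != 429:
            return status, body
        time.sleep(backoff * (attempt + 1))
    return status, body

# Simulated: the first call hits the quota, the second succeeds.
responses = iter([(429, {}), (200, {"state": "enqueued"})])
status, body = post_with_retry(lambda: next(responses), backoff=0)
print(status)  # 200
```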

Viewing the status of all analytics export requests

To view the status of all analytics export requests, issue a GET request to /organizations/{org}/environments/{env}/analytics/exports.

For example, the following request returns the status of all analytics export requests for thetest environment in themyorg organization:

curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"

Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.

The following provides an example of the response listing two export requests, one enqueued (created and in the queue) and one completed:

[
  {
    "self": "/v1/organizations/myorg/environments/test/analytics/exports/e8b8db22-fe03-4364-aaf2-6d4f110444ba",
    "name": "Export results To Cloud Storage",
    "description": "One-time export to Cloud Storage",
    "userId": "my@email.com",
    "datastoreName": "My datastore",
    "executionTime": "36 seconds",
    "created": "2018-09-28T12:39:35Z",
    "updated": "2018-09-28T12:39:42Z",
    "state": "enqueued"
  },
  {
    "self": "/v1/organizations/myorg/environments/test/analytics/exports/9870987089fe03-4364-aaf2-6d4f110444ba",
    "name": "Export raw results to BigQuery",
    "description": "One-time export to BigQuery",
    ...
  }
]

For more information, see the List data exports API.

Viewing the status of an analytics export request

To view the status of a specific analytics export request, issue a GET request to /organizations/{org}/environments/{env}/analytics/exports/{exportId}, where {exportId} is the ID associated with the analytics export request.

Note: The URL to check the status of a request is returned in the self property when you submit an export request. To get the ID of an analytics export request, view a list of all analytics export requests and their associated IDs, as described in Viewing the status of all analytics export requests.

For example, the following request returns the status of the analytics export request with the ID 4d6d94ad-a33b-4572-8dba-8677c9c4bd98:

curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports/4d6d94ad-a33b-4572-8dba-8677c9c4bd98" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"

The following provides an example of the response:

{
  "self": "/v1/organizations/myorg/environments/test/analytics/exports/4d6d94ad-a33b-4572-8dba-8677c9c4bd98",
  "name": "Export results to Cloud Storage",
  "description": "One-time export to Cloud Storage",
  "userId": "my@email.com",
  "datastoreName": "My datastore",
  "executionTime": "36 seconds",
  "created": "2018-09-28T12:39:35Z",
  "updated": "2018-09-28T12:39:42Z",
  "state": "enqueued"
}

For more information, see the Get data export API.

If the analytics export returns no analytics data, then executionTime is set to "0 seconds".

Datastore request property reference

The following properties can be passed in the request body, in JSON format, when creating a datastore, depending on the datastore type.

For Google Cloud Storage:

  • Project ID (required): The Google Cloud Platform project ID.

    To create a Google Cloud Platform project, see Creating and Managing Projects in the Google Cloud Platform documentation.

  • Bucket Name (required): The name of the bucket in Cloud Storage to which you want to export analytics data.

    Note: The bucket must exist before you perform a data export. To create a Cloud Storage bucket, see Create buckets in the Google Cloud Platform documentation.

  • Path (required): The directory in which to store the analytics data in the Cloud Storage bucket.

For BigQuery:

  • Project ID (required): The Google Cloud Platform project ID.

    To create a Google Cloud Platform project, see Creating and managing projects in the Google Cloud Platform documentation.

  • Dataset Name (required): The name of the BigQuery dataset to which you want to export analytics data. Ensure that the dataset is created before requesting a data export.

    To create a BigQuery dataset, see Creating and using datasets in the Google Cloud Platform documentation.

  • Table Prefix (required): The prefix for the names of the tables created for the analytics data in the BigQuery dataset.

Export request property reference

The following properties can be passed in the request body, in JSON format, when exporting analytics data.

  • description (optional): Description of the export request.

  • name (required): Name of the export request.

  • dateRange (required): The start and end date of the data to export, in the format yyyy-mm-dd. For example:

    "dateRange": {
      "start": "2018-07-29",
      "end": "2018-07-30"
    }

    The dateRange value can only span one day. The date range begins at 00:00:00 UTC on the start date and ends at 00:00:00 UTC on the end date.

    Note: To ensure all data is captured from the previous day, you may need to delay the start time of the export request (for example, to 00:05:00 UTC).

  • outputFormat (required): Specify as either json or csv.

  • csvDelimiter (optional): Delimiter used in the CSV output file, if outputFormat is set to csv. Defaults to the comma (,) character. Supported delimiter characters include comma (,), pipe (|), and tab (\t).

  • datastoreName (required): The name of the datastore that defines your data repository.

For example:

{
  "name": "Export raw results to Cloud Storage",
  "description": "Export raw results to Cloud Storage for last 24 hours",
  "dateRange": {
    "start": "2020-06-08",
    "end": "2020-06-09"
  },
  "outputFormat": "json",
  "datastoreName": "My Cloud Storage datastore"
}
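Since dateRange must span exactly one day, a small helper can compute the range for the previous day's data:

```python
from datetime import date, timedelta

# Sketch: build a dateRange covering the previous day. The range must span
# exactly one day, starting at 00:00:00 UTC on "start" and ending at
# 00:00:00 UTC on "end".

def one_day_range(end=None):
    """Return a one-day dateRange ending on the given date (default: today)."""
    end = end or date.today()
    start = end - timedelta(days=1)
    return {"start": start.isoformat(), "end": end.isoformat()}

print(one_day_range(date(2018, 7, 30)))
# {'start': '2018-07-29', 'end': '2018-07-30'}
```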

Except as otherwise noted, the content of this page is licensed under theCreative Commons Attribution 4.0 License, and code samples are licensed under theApache 2.0 License. For details, see theGoogle Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-17 UTC.