Exporting data from Analytics
This page applies to Apigee and Apigee hybrid.
View Apigee Edge documentation.
Apigee Analytics collects and analyzes a broad spectrum of data that flows across your APIs and provides visualization tools, including interactive dashboards, custom reports, and other tools that identify trends in API proxy performance.
Now, you can unlock this rich content by exporting analytics data from Apigee Analytics to your own data repository, such as Google Cloud Storage or Google BigQuery. You can then take advantage of the powerful query and machine learning capabilities offered by Google BigQuery and TensorFlow to perform your own data analysis. You can also combine the exported analytics data with other data, such as web logs, to gain new insights into your users, APIs, and applications.
Note: Free and trial accounts cannot export analytics data. For more information, see Apigee pricing plans. If you are a Pay-as-you-go customer, you must enable the Apigee API Analytics add-on in your eligible environments to use this API. If the Apigee API Analytics add-on is not enabled, or was disabled more than 30 days before using the API, you will see an error that you are unable to access the API. For more information, see Manage the Apigee API Analytics add-on.
What export data formats are supported?
Export analytics data to one of the following formats:
Comma-separated values (CSV)
The default delimiter is a comma (,) character. Supported delimiter characters include comma (,), pipe (|), and tab (\t). Configure the value using the csvDelimiter property, as described in Export request property reference.
JSON (newline delimited)
Allows the newline character to be used as a delimiter.
The exported data includes all the analytics metrics and dimensions built into Apigee, and any custom analytics data that you add. For a description of the exported data, see Analytics metrics, dimensions, and filters reference.
You can export analytics data to the following data repositories:
- Google Cloud Storage
- Google BigQuery
Steps to export your analytics data
The following steps summarize the process used to export your analytics data:
Note: You must have Apigee Analytics Editor and Apigee Org Administrator permissions to manage datastores and export data. See Understanding roles.
- Configure your data repository (Cloud Storage or BigQuery) for data export. You must ensure that your data repository has been configured correctly, and that the Apigee Service Agent service account used to write data to the data repository has the correct permissions.
- Create a datastore that defines the properties of the data repository (Cloud Storage or BigQuery) where you export your data.
- Export your analytics data. The data export runs asynchronously in the background.
- View the status of the export request to determine when the export completes.
- When the export is complete, access the exported data in your data repository.
The following sections describe these steps in more detail.
Configuring your data repository
Configure Cloud Storage or BigQuery to enable access by analytics data export.
Configuring Google Cloud Storage
Before you can export data to Google Cloud Storage, you need to do the following:
Create a Cloud Storage bucket.
Ensure that the BigQuery API is enabled in your Google Cloud Platform project. Apigee uses the BigQuery API to leverage BigQuery export features when exporting to Cloud Storage. See Enabling APIs for instructions.
Ensure that the Apigee Service Agent service account with email address service-project-number@gcp-sa-apigee.iam.gserviceaccount.com is assigned to the following roles:
- BigQuery Job User
- Storage Admin
The project-number is listed on the project home page in the Google Cloud console.
See Granting, changing, and revoking access to resources.
Alternatively, if you want to modify an existing role, or create a custom role, add the following permissions to the role:
- bigquery.jobs.create
- storage.objects.create
- storage.objects.delete
- storage.objects.list
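As a sketch, the role grants above can be applied with the gcloud CLI. The project ID and project number below are placeholders you must replace with your own values:

```shell
# Grant the Apigee Service Agent the roles required for Cloud Storage export.
# PROJECT_ID and PROJECT_NUMBER are placeholders for your own project values.
PROJECT_ID="my-project"
PROJECT_NUMBER="123456789012"
SA="service-${PROJECT_NUMBER}@gcp-sa-apigee.iam.gserviceaccount.com"

gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${SA}" --role="roles/bigquery.jobUser"
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member="serviceAccount:${SA}" --role="roles/storage.admin"
```

Alternatively, grant the same roles on the Google Cloud console IAM page.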
Configuring Google BigQuery
Note: Analytics data for the US or EU is stored in either the US or EU multi-region. If you export the data to BigQuery, the Data location for the BigQuery dataset must be the same multi-region: you cannot export the data directly to an individual region in the US or EU. See Exporting data to BigQuery for an individual region in the US or EU for a workaround.
Before you can export data to Google BigQuery:
- Ensure that you have enabled BigQuery in your Google Cloud Platform project.
- Ensure that the BigQuery API is enabled in your Google Cloud Platform project. See Enabling APIs for instructions.
Ensure that the Apigee Service Agent service account with email address service-project-number@gcp-sa-apigee.iam.gserviceaccount.com is assigned to the following roles:
- BigQuery Job User
- BigQuery Data Editor
The project-number is listed on the project home page in the Google Cloud console.
See Granting, changing, and revoking access to resources.
If you want to modify an existing role, or create a custom role, add the following permissions to the role:
- bigquery.datasets.create
- bigquery.datasets.get
- bigquery.jobs.create
- bigquery.tables.create
- bigquery.tables.get
- bigquery.tables.updateData
Create a BigQuery dataset.
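As a sketch of the last step, the dataset can be created with the bq CLI. The project and dataset names are placeholders, and the location must follow the multi-region constraint described in the note above:

```shell
# Create a BigQuery dataset in the US multi-region to receive exported
# analytics data. "my-project" and "mybigquery" are placeholder names.
bq mk --location=US --dataset my-project:mybigquery
```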
Exporting data to BigQuery for an individual region in the US or EU
Since analytics data for the US or EU is stored in either the US or EU multi-region, you cannot export the data directly to an individual US or EU region in BigQuery. As a workaround, you can first export the data to Google Cloud Storage, and then transfer it to BigQuery as follows:
- Create a Cloud Storage bucket, and set Location to the individual region in the US or EU that you want associated with your data in BigQuery.
- Create a Cloud Storage datastore, using the storage bucket created in the previous step.
- Export the data to Cloud Storage. See Example 1: Export data to Cloud Storage below for an example.
- Load the data into BigQuery, as described in Loading JSON data from Cloud Storage in the BigQuery documentation.
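The final load step can be sketched with the bq CLI, assuming the export was written in newline-delimited JSON; the bucket, path, dataset, and table names below are placeholders:

```shell
# Load exported newline-delimited JSON from Cloud Storage into a BigQuery
# table. The table is created in the dataset's region, so a dataset created
# in an individual US or EU region places the data in that region.
bq load \
  --source_format=NEWLINE_DELIMITED_JSON \
  --autodetect \
  mydataset.apigee_analytics \
  "gs://my-bucket/my/analytics/path/*.json"
```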
Managing datastores
The datastore defines the connection to your export data repository (Cloud Storage or BigQuery).
The following sections describe how to create and manage your datastores. Before you create a datastore, it is recommended that you test the data repository configuration.
Testing the data repository configuration
When you create a datastore, Apigee does not test or validate that the data repository configuration is valid. That means you can create the datastore (in the next step) and not detect any errors until you run your first data export.
Because a data export process can take a long time to execute, you can detect errors sooner by testing the data repository configuration to ensure it is valid, and fixing any errors before creating the datastore.
To test the data repository configuration, issue a POST request to the /organizations/{org}/analytics/datastores:test API. Pass the following information in the request body:
- Display name
- Datastore type
- Configuration details based on the datastore type, as described in Datastore request property reference.
For example, the following tests a Cloud Storage data repository configuration:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores:test" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "displayName": "My Cloud Storage datastore",
    "targetType": "gcs",
    "datastoreConfig": {
      "projectId": "my-project",
      "bucketName": "my-bucket",
      "path": "my/analytics/path"
    }
  }'
The following provides an example of the response if the test is successful:
{ "state": "completed" }
The following provides an example of the response if the test failed:
{ "state": "failed", "error": "<error message>" }
In this case, address the issues raised in the error message and re-test the data repository configuration. After a successful test, create the datastore, as described in the next section.
Creating a datastore
To create a datastore, issue a POST request to the /organizations/{org}/analytics/datastores API. Pass the following information in the request body:
- Display name
- Datastore type
- Configuration details based on the datastore type, as described in Datastore request property reference.
Examples are provided below for each datastore type.
The following provides an example of the response for a Cloud Storage data repository:
{
  "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
  "displayName": "My Cloud Storage datastore",
  "org": "myorg",
  "targetType": "gcs",
  "createTime": "1535411583949",
  "lastUpdateTime": "1535411634291",
  "datastoreConfig": {
    "projectId": "my-project",
    "bucketName": "my-bucket",
    "path": "my/analytics/path"
  }
}
Use the URL returned in the self property to view the datastore details, as described in Viewing the details of a datastore.
For more information, see the Create data store API.
Example 1: Create a Cloud Storage datastore
The following request creates a Cloud Storage datastore:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "displayName": "My Cloud Storage datastore",
    "targetType": "gcs",
    "datastoreConfig": {
      "projectId": "my-project",
      "bucketName": "my-bucket",
      "path": "my/analytics/path"
    }
  }'
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
Example 2: Create a BigQuery datastore
The following request creates a BigQuery datastore:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "displayName": "My BigQuery datastore",
    "targetType": "bigquery",
    "datastoreConfig": {
      "projectId": "my-project",
      "datasetName": "mybigquery",
      "tablePrefix": "bqprefix"
    }
  }'
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
Viewing all datastores
To view all datastores for your organization, issue a GET request to the /organizations/{org}/analytics/datastores API.
For example:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
The following provides an example of the response:
{
  "datastores": [
    {
      "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
      "displayName": "My Cloud Storage datastore",
      "org": "myorg",
      "targetType": "gcs",
      "createTime": "1535411583949",
      "lastUpdateTime": "1535411634291",
      "datastoreConfig": {
        "projectId": "my-project",
        "bucketName": "my-bucket",
        "path": "my/analytics/path"
      }
    },
    {
      "self": "/organizations/myorg/analytics/datastores/g8c3f0mk-1f78-8837-9c67-k222b60ce30b",
      "displayName": "My BigQuery datastore",
      "org": "myorg",
      "targetType": "bigquery",
      "createTime": "1535411583949",
      "lastUpdateTime": "1535411634291",
      "datastoreConfig": {
        "projectId": "my-project",
        "datasetName": "mybigquery",
        "tablePrefix": "bqprefix"
      }
    }
  ]
}
For more information, see the List data stores API.
Viewing the details for a datastore
To view the details for a datastore, issue a GET request to the /organizations/{org}/analytics/datastores/{datastore} API.
For example:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
The following provides an example of the response for a Cloud Storage datastore:
{
  "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
  "displayName": "My Cloud Storage datastore",
  "org": "myorg",
  "targetType": "gcs",
  "createTime": "1535411583949",
  "lastUpdateTime": "1535411634291",
  "datastoreConfig": {
    "projectId": "my-project",
    "bucketName": "my-bucket",
    "path": "my/analytics/path"
  }
}
For more information, see the Get data store API.
Modifying a datastore
To modify a datastore, issue a PUT request to the /organizations/{org}/analytics/datastores/{datastore} API. Pass all or a subset of the following information in the request body:
- Datastore display name
- Configuration details based on the datastore type, as described in Datastore request property reference.
For example, to update a Cloud Storage datastore:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b" \
  -X PUT \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "displayName": "My Cloud Storage datastore",
    "datastoreConfig": {
      "projectId": "my-project",
      "bucketName": "my-bucket",
      "path": "my/analytics/path"
    }
  }'
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
The following provides an example of the response for a Cloud Storage datastore:
{
  "self": "/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b",
  "displayName": "My Cloud Storage datastore",
  "org": "myorg",
  "targetType": "gcs",
  "createTime": "1535411583949",
  "lastUpdateTime": "1535411634291",
  "datastoreConfig": {
    "projectId": "my-project",
    "bucketName": "my-bucket",
    "path": "my/analytics/path"
  }
}
For more information, see the Update data store API.
Deleting a datastore
To delete a datastore, issue a DELETE request to the /organizations/{org}/analytics/datastores/{datastore} API.
For example:
curl "https://apigee.googleapis.com/v1/organizations/myorg/analytics/datastores/c7d3b5aq-1c64-3389-9c43-b211b60de35b" \
  -X DELETE \
  -H "Authorization: Bearer $TOKEN"
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
The following provides an example of the response:
{}
For more information, see the Delete data store API.
Exporting analytics data
To export analytics data, issue a POST request to the /organizations/{org}/environments/{env}/analytics/exports API. Pass the following information in the request body:
- Name and description of the export request
- Date range of exported data (value can only span one day)
- Format of exported data
- Datastore name
Examples of export requests are provided below. For a complete description of the request body properties, see Export request property reference.
The response from the POST is in the form:
{
  "self": "/organizations/myorg/environments/test/analytics/exports/a7c2f0dd-1b53-4917-9c42-a211b60ce35b",
  "created": "2017-09-28T12:39:35Z",
  "state": "enqueued"
}
Note that the state property in the response is set to enqueued. The POST request works asynchronously: it continues to run in the background after the request returns a response. Possible values for state include: enqueued, running, completed, and failed.
Use the URL returned in the self property to view the status of the data export request, as described in Viewing the status of an analytics export request. When the request completes, the value of the state property in the response is set to completed. You can then access the analytics data in your datastore.
For more information, see the Create data export API.
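The status check described above can be scripted. A minimal polling sketch, assuming $TOKEN holds a valid access token; the get_state and poll_export helper names are this example's own, not part of the Apigee API:

```shell
#!/usr/bin/env bash
# Poll an analytics export job until it reaches a terminal state.

get_state() {
  # Extract the "state" field from an export-status JSON response on stdin.
  python3 -c 'import json,sys; print(json.load(sys.stdin)["state"])'
}

poll_export() {
  # $1 is the full export-status URL built from the returned "self" property.
  local url="$1" state
  while true; do
    state=$(curl -s "$url" -H "Authorization: Bearer $TOKEN" | get_state)
    case "$state" in
      completed|failed) echo "$state"; return ;;
      *) sleep 30 ;;  # enqueued or running: wait and retry
    esac
  done
}

# Usage (not run here):
# poll_export "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports/a7c2f0dd-1b53-4917-9c42-a211b60ce35b"
```

Keep the export API quota in mind (see About export API quotas below) when choosing a polling interval for new export submissions; polling the status endpoint itself reads a single job.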
Example 1: Export data to Cloud Storage
The following example exports a complete set of raw data for the last 24 hours from the test environment in the myorg organization. The content is exported to Cloud Storage in JSON:
curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "name": "Export raw results to Cloud Storage",
    "description": "Export raw results to Cloud Storage for last 24 hours",
    "dateRange": { "start": "2020-06-08", "end": "2020-06-09" },
    "outputFormat": "json",
    "datastoreName": "My Cloud Storage data repository"
  }'
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
Use the URI specified by the self property to monitor the job status, as described in Viewing the status of an analytics export request.
Example 2: Export data to BigQuery
When creating the BigQuery dataset, set its Data location to match your analytics region:
- If the analytics region is part of the US, set Data location to us (multiple regions in United States).
- If the analytics region is part of the EU, set Data location to eu (multiple regions in European Union).
The following example exports a comma-delimited CSV file to BigQuery:
curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports" \
  -X POST \
  -H "Content-type:application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "name": "Export query results to BigQuery",
    "description": "One-time export to BigQuery",
    "dateRange": { "start": "2018-06-08", "end": "2018-06-09" },
    "outputFormat": "csv",
    "csvDelimiter": ",",
    "datastoreName": "My BigQuery data repository"
  }'
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
Note: The export creates BigQuery tables with names of the following form:
<PREFIX>_<EXPORT_DATE>_api_<UUID>_from_<FROM_DATE>_to_<TO_DATE>
Use the URI specified by the self property to monitor the job status, as described in Viewing the status of an analytics export request.
About export API quotas
To prevent overuse of expensive data export API calls, Apigee enforces a quota of 15 calls per day per organization on calls to the organizations/{org}/environments/{env}/analytics/exports API.
If you exceed the call quota, the API returns an HTTP 429 response.
Viewing the status of all analytics export requests
To view the status for all analytics export requests, issue a GET request to /organizations/{org}/environments/{env}/analytics/exports.
For example, the following request returns the status of all analytics export requests for the test environment in the myorg organization:
curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"
Where $TOKEN is set to your OAuth 2.0 access token, as described in Obtaining an OAuth 2.0 access token. For information about the curl options used in this example, see Using curl. For a description of environment variables you can use, see Setting environment variables for Apigee API requests.
The following provides an example of the response listing two export requests, one enqueued (created and in the queue) and one completed:
[
  {
    "self": "/v1/organizations/myorg/environments/test/analytics/exports/e8b8db22-fe03-4364-aaf2-6d4f110444ba",
    "name": "Export results To Cloud Storage",
    "description": "One-time export to Cloud Storage",
    "userId": "my@email.com",
    "datastoreName": "My datastore",
    "executionTime": "36 seconds",
    "created": "2018-09-28T12:39:35Z",
    "updated": "2018-09-28T12:39:42Z",
    "state": "enqueued"
  },
  {
    "self": "/v1/organizations/myorg/environments/test/analytics/exports/9870987089fe03-4364-aaf2-6d4f110444ba",
    "name": "Export raw results to BigQuery",
    "description": "One-time export to BigQuery",
    ...
  }
]
For more information, see the List data exports API.
Viewing the status of an analytics export request
To view the status of a specific analytics export request, issue a GET request to /organizations/{org}/environments/{env}/analytics/exports/{exportId}, where {exportId} is the ID associated with the analytics export request.
The export ID is returned in the self property when you submit an export request. To get the ID for an analytics export request, view a list of all analytics export requests and their associated IDs, as described in Viewing the status of all analytics export requests.
For example, the following request returns the status of the analytics export request with the ID 4d6d94ad-a33b-4572-8dba-8677c9c4bd98.
curl "https://apigee.googleapis.com/v1/organizations/myorg/environments/test/analytics/exports/4d6d94ad-a33b-4572-8dba-8677c9c4bd98" \
  -X GET \
  -H "Authorization: Bearer $TOKEN"
The following provides an example of the response:
{
  "self": "/v1/organizations/myorg/environments/test/analytics/exports/4d6d94ad-a33b-4572-8dba-8677c9c4bd98",
  "name": "Export results to Cloud Storage",
  "description": "One-time export to Cloud Storage",
  "userId": "my@email.com",
  "datastoreName": "My datastore",
  "executionTime": "36 seconds",
  "created": "2018-09-28T12:39:35Z",
  "updated": "2018-09-28T12:39:42Z",
  "state": "enqueued"
}
For more information, see the Get data export API.
If the analytics export returns no analytics data, then executionTime is set to "0 seconds".
Datastore request property reference
The following table describes the properties that you can pass in the request body in JSON format when creating a datastore based on the datastore type.
For Google Cloud Storage:
| Property | Description | Required? |
|---|---|---|
| Project ID | Google Cloud Platform project ID. To create a Google Cloud Platform project, see Creating and Managing Projects in the Google Cloud Platform documentation. | Yes |
| Bucket Name | Name of the bucket in Cloud Storage to which you want to export analytics data. Note: The bucket must exist before you perform a data export. To create a Cloud Storage bucket, see Create buckets in the Google Cloud Platform documentation. | Yes |
| Path | Directory in which to store the analytics data in the Cloud Storage bucket. | Yes |
For BigQuery:
| Property | Description | Required? |
|---|---|---|
| Project ID | Google Cloud Platform project ID. To create a Google Cloud Platform project, see Creating and managing projects in the Google Cloud Platform documentation. | Yes |
| Dataset Name | Name of the BigQuery dataset to which you want to export analytics data. Ensure that the dataset is created before requesting data export. To create a BigQuery dataset, see Creating and using datasets in the Google Cloud Platform documentation. | Yes |
| Table Prefix | The prefix for the names of the tables created for the analytics data in the BigQuery dataset. | Yes |
Export request property reference
The following table describes the properties that you can pass in the request body in JSON format when exporting analytics data.
| Property | Description | Required? |
|---|---|---|
| description | Description of the export request. | No |
| name | Name of the export request. | Yes |
| dateRange | Start and end date of the data to export, in yyyy-mm-dd format. For example: "dateRange": { "start": "2018-07-29", "end": "2018-07-30" }. The dateRange value can span only one day. Note: To ensure all data is captured from the previous day, you may need to delay the start time of the export request (for example, 00:05:00 AM UTC). | Yes |
| outputFormat | Specify as either json or csv. | Yes |
| csvDelimiter | Delimiter used in the CSV output file, if outputFormat is set to csv. Defaults to the comma (,) character. Supported delimiter characters include comma (,), pipe (|), and tab (\t). | No |
| datastoreName | The name of the datastore that contains the definition of your data repository. | Yes |
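Since dateRange can span only one day, a common pattern is exporting the previous day's data. A small sketch that computes yesterday's one-day range and builds a request body; it assumes GNU date (as on most Linux systems), and the body values are placeholders:

```shell
# Compute a one-day dateRange covering yesterday (UTC).
# Assumes GNU date (Linux); on macOS use `date -u -v-1d +%F` instead.
START=$(date -u -d "yesterday" +%F)
END=$(date -u +%F)

# Example request body using the computed range; the name, description,
# and datastoreName values are placeholders.
BODY=$(cat <<EOF
{
  "name": "Daily export",
  "description": "Export raw results for ${START}",
  "dateRange": { "start": "${START}", "end": "${END}" },
  "outputFormat": "json",
  "datastoreName": "My Cloud Storage datastore"
}
EOF
)
echo "$BODY"
```

The body can then be passed to the exports API with curl -d "$BODY", as in the earlier examples.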
For example:
{
  "name": "Export raw results to Cloud Storage",
  "description": "Export raw results to Cloud Storage for last 24 hours",
  "datastoreName": "My Cloud Storage datastore"
}
Last updated 2025-12-17 UTC.