Connect to an SAP Ariba batch source
This page describes how to connect your data pipeline to an SAP Ariba source and a BigQuery sink. You can configure and execute bulk data transfers from Ariba without any coding by using the SAP Ariba Batch Source plugin from the Cloud Data Fusion Hub.
The plugin extracts data from the reporting facts provided in the SAP Ariba source. Each fact corresponds with an SAP Ariba document type. Facts are exposed in view templates, which are accessed through the Analytical Reporting API.
For more information, see the SAP Ariba Batch Source reference.
Before you begin
Create an instance in Cloud Data Fusion version 6.5.1 or later. If your instance uses an earlier version, upgrade your Cloud Data Fusion environment.
An SAP Ariba user must do the following:
- Create an application and generate the OAuth credentials.
- Grant access to the Analytical Reporting API in the Ariba developerportal.
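The plugin authenticates to the Analytical Reporting API with the OAuth credentials generated above. As a rough illustration of the client-credentials flow, the token request can be built as follows. The token URL, client ID, and secret below are placeholder assumptions; use the values shown for your application in the Ariba developer portal:

```python
import base64

# Placeholder values -- substitute the endpoint and OAuth credentials that the
# Ariba developer portal shows for your application.
TOKEN_URL = "https://api.example.ariba.com/v2/oauth/token"  # assumed; region-specific
CLIENT_ID = "my-client-id"
CLIENT_SECRET = "my-client-secret"

def build_token_request(client_id: str, client_secret: str) -> dict:
    """Build an OAuth client-credentials token request."""
    # Client ID and secret are sent as an HTTP Basic Authorization header.
    credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    return {
        "url": TOKEN_URL,
        "headers": {
            "Authorization": f"Basic {credentials}",
            "Content-Type": "application/x-www-form-urlencoded",
        },
        "data": {"grant_type": "client_credentials"},
    }

request = build_token_request(CLIENT_ID, CLIENT_SECRET)
# Send with any HTTP client, for example:
#   import requests
#   token = requests.post(request["url"], headers=request["headers"],
#                         data=request["data"]).json()["access_token"]
```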
Retrieve the name of the reporting view template from the SAP Ariba Analytical Reporting - View Management API by sending a GET request. See Identifying Analytical Reporting API view templates.
Optional: To prevent pipeline failures due to rate limits, identify the expected record count. The plugin extracts data from facts and dimensions through the SAP Ariba Analytical Reporting API, where rate limits apply. For more information, see Manage rate limits.
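The view-template GET request above can be sketched roughly as follows. The base URL, realm name, and header names are illustrative assumptions; confirm the exact host, path, and parameters for your site in the SAP Ariba developer portal documentation:

```python
from urllib.parse import urlencode

# All names below are illustrative assumptions.
BASE_URL = "https://openapi.ariba.com/api/analytics-reporting-view/v1/prod"
REALM = "mycompany-realm"        # your Ariba realm name
API_KEY = "my-application-key"   # application key from the developer portal
ACCESS_TOKEN = "my-oauth-token"  # token obtained through the OAuth flow

def build_view_templates_request(realm: str) -> dict:
    """Build a GET request that lists the available reporting view templates."""
    query = urlencode({"realm": realm})
    return {
        "url": f"{BASE_URL}/viewTemplates?{query}",
        "headers": {
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "apiKey": API_KEY,
        },
    }

request = build_view_templates_request(REALM)
# Send with any HTTP client; each entry in the JSON response carries a template
# name that you can paste into the plugin's view template property.
```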
Deploy and configure the plugin
Deploy the SAP Ariba Batch Source plugin from the SAP tab of the Hub. For more information, see Deploy a plugin from the Hub.
Open the pipeline on the Cloud Data Fusion Studio page and select Data Pipeline - Batch. The plugin doesn't support real-time pipelines.
In the source menu, click SAP Ariba. The SAP Ariba Batch Source node appears in the pipeline.
Go to the node and click Properties. An Ariba Properties window opens.
Click Validate and resolve any errors.
Click Close.
Optional: Connect the plugin to a BigQuery Sink
On the Cloud Data Fusion Studio page, go to the Sink menu and click BigQuery.
The BigQuery Sink node appears in the pipeline.
Configure the required properties of the sink.
Click Validate and resolve any errors.
Click Close.
Optional: Manage rate limits
To check the record count for a specific date range in SAP Ariba, see Date-related filters for the Analytical Reporting API.
For more information, see Limits for the plugin.
The following table describes ways to troubleshoot issues with rate limits.
| Example pipeline | Count of records and required API calls | Remaining daily limit | Troubleshooting |
|---|---|---|---|
| I want to extract data from one view template for a specific date range. | | | |
| 1 | 2,020,000 records, 41 calls | -1 of 40 | The required API calls for this date range and record count exceed the daily limit (40). To reduce the number of calls, select a smaller date range to decrease the count of records. |
| I want to extract data from multiple view templates for a specific date range. | | | |
| 1 | 50,001 records, 2 calls | 38 of 40 | |
| 2 | 100,000 records, 2 calls | 36 of 40 | |
| 3 | 100 records, 1 call | 35 of 40 | |
| 4 | 1,000,000 records, 20 calls | 15 of 40 | |
| 5 | 500,000 records, 10 calls | 5 of 40 | |
| 6 | 500,000 records, 10 calls | -5 of 40 | Pipeline 6 exceeds the limit for API calls. To prevent errors, run the extraction a day later or change the date range. |
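The call counts in the table are consistent with the API returning at most 50,000 records per call against a daily limit of 40 calls. Under that assumption (the page size is inferred from the table, not stated by SAP), a quick sketch for estimating whether planned extractions fit within the daily budget:

```python
import math

PAGE_SIZE = 50_000   # records returned per API call (inferred from the table)
DAILY_LIMIT = 40     # daily call limit used in the examples above

def calls_needed(record_count: int) -> int:
    """Number of API calls required to page through record_count records."""
    return max(1, math.ceil(record_count / PAGE_SIZE))

def plan(pipelines: list[int], limit: int = DAILY_LIMIT) -> list[tuple[int, int]]:
    """Return (calls, remaining) per pipeline; remaining < 0 means over limit."""
    remaining = limit
    result = []
    for records in pipelines:
        calls = calls_needed(records)
        remaining -= calls
        result.append((calls, remaining))
    return result

# The multiple-template scenario from the table:
print(plan([50_001, 100_000, 100, 1_000_000, 500_000, 500_000]))
# -> [(2, 38), (2, 36), (1, 35), (20, 15), (10, 5), (10, -5)]
```

The final pipeline lands at -5 of 40, matching the table: running it would exceed the daily limit, so move it to the next day or shrink its date range.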
What's next
- Learn more about Cloud Data Fusion integrations for SAP.
- Refer to the SAP Ariba Batch Source reference.
Last updated 2025-12-15 UTC.