Connect to an SAP Ariba batch source

This page describes how to connect your data pipeline to an SAP Ariba Source and a BigQuery Sink. You can configure and execute bulk data transfers from Ariba without any coding by using the SAP Ariba Batch Source plugin from the Cloud Data Fusion Hub.

The plugin extracts data from the reporting facts provided in the SAP Ariba Source. Each fact corresponds to an SAP Ariba document type. Facts are exposed in view templates, which are accessed through the Analytical Reporting API.

For more information, see the SAP Ariba Batch Source reference.

Before you begin

  • Create an instance in Cloud Data Fusion version 6.5.1 or later. If your instance uses an earlier version, upgrade your Cloud Data Fusion environment.

  • An SAP Ariba user must do the following:

    • Create an application and generate the OAuth credentials.
    • Grant access to the Analytical Reporting API in the Ariba developer portal.
  • Retrieve the name of the reporting view template from the SAP Ariba Analytical Reporting - View Management API by sending a GET request. See Identifying Analytical reporting API view templates.

  • Optional: To prevent pipeline failures due to rate limits, identify the expected record count. The plugin extracts data from facts and dimensions through the SAP Ariba Analytical Reporting API, where rate limits apply. For more information, see Manage rate limits.
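The GET request for retrieving view template names can be sketched as follows. This is a minimal sketch, not the plugin's own implementation: the host, path, query parameter, and header names are assumptions based on common SAP Ariba OpenAPI conventions, so confirm the exact endpoint, realm name, and credentials in your Ariba developer portal.

```python
# Hedged sketch: build the GET request that retrieves analytical reporting
# view template names from the SAP Ariba View Management API. The host,
# path, and header names below are assumptions -- verify them in the Ariba
# developer portal for your site before use.
from urllib.parse import urlencode
from urllib.request import Request

# Assumed endpoint; the path and the "prod" segment may differ for your site.
VIEW_TEMPLATES_URL = "https://openapi.ariba.com/api/analytics-reporting-view/v1/prod/viewTemplates"

def build_view_template_request(realm: str, access_token: str, api_key: str) -> Request:
    """Build (but do not send) the GET request for view template names."""
    query = urlencode({"realm": realm})  # your Ariba site realm, e.g. "mycompany-T"
    return Request(
        f"{VIEW_TEMPLATES_URL}?{query}",
        headers={
            "Authorization": f"Bearer {access_token}",  # OAuth token from your Ariba application
            "apiKey": api_key,                          # application key from the developer portal
        },
        method="GET",
    )

# Sending it requires network access and valid credentials:
# from urllib.request import urlopen
# with urlopen(build_view_template_request("my-realm", token, key)) as resp:
#     print(resp.read())
```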

Deploy and configure the plugin

  1. Deploy the SAP Ariba Batch Source plugin from the SAP tab of the Hub. For more information, see Deploy a plugin from the Hub.

  2. Open the pipeline on the Cloud Data Fusion Studio page and select Data Pipeline - Batch. The plugin doesn't support real-time pipelines.

  3. In the source menu, click SAP Ariba. The SAP Ariba Batch Source node appears in the pipeline.

  4. Go to the node and click Properties. An Ariba Properties window opens.

  5. Configure the properties.

  6. Click Validate and resolve any errors.

  7. Click Close.

Optional: Connect the plugin to a BigQuery Sink

  1. On the Cloud Data Fusion Studio page, go to the Sink menu and click BigQuery.

    The BigQuery Sink node appears in the pipeline.

  2. Configure the required properties of the sink.

  3. Click Validate and resolve any errors.

  4. Click Close.

Optional: Manage rate limits

To check the record count for a specific date range in SAP Ariba, see Date-related filters for the Analytical Reporting API.

For more information, see Limits for the plugin.

The following table describes ways to troubleshoot issues with rate limits.

To extract data from one view template for a specific date range:

| Pipeline | Count of records and required API calls | Remaining daily limit | Troubleshooting |
|---|---|---|---|
| 1 | 2,020,000 records, 41 calls | -1 of 40 | The required API calls for this date range and record count exceed the daily limit (40). To reduce the number of calls, select a smaller date range to decrease the count of records. |

To extract data from multiple view templates for a specific date range:

| Pipeline | Count of records and required API calls | Remaining daily limit | Troubleshooting |
|---|---|---|---|
| 1 | 50,001 records, 2 calls | 38 of 40 | |
| 2 | 100,000 records, 2 calls | 36 of 40 | |
| 3 | 100 records, 1 call | 35 of 40 | |
| 4 | 1,000,000 records, 20 calls | 15 of 40 | |
| 5 | 500,000 records, 10 calls | 5 of 40 | |
| 6 | 500,000 records, 10 calls | -5 of 40 | Pipeline 6 exceeds the limit for API calls. To prevent errors, run the extraction a day later or change the date range. |
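You can estimate API-call usage before running pipelines. The following is a minimal sketch, assuming a page size of roughly 50,000 records per call and a daily limit of 40 calls, both inferred from the example figures above; verify the actual limits for your SAP Ariba site before relying on these numbers.

```python
# Hedged sketch: estimate Analytical Reporting API call usage ahead of time.
# RECORDS_PER_CALL and DAILY_CALL_LIMIT are assumptions inferred from the
# example figures above; check the actual limits for your SAP Ariba site.
import math

RECORDS_PER_CALL = 50_000  # records returned per API call (assumed)
DAILY_CALL_LIMIT = 40      # daily API call limit (assumed)

def required_calls(record_count: int) -> int:
    """Calls needed to page through record_count records."""
    return math.ceil(record_count / RECORDS_PER_CALL)

def remaining_limit(record_counts: list[int]) -> int:
    """Daily limit left after extracting each pipeline's record count."""
    return DAILY_CALL_LIMIT - sum(required_calls(n) for n in record_counts)

# One pipeline with 2,020,000 records needs 41 calls, one more than the limit:
print(required_calls(2_020_000))  # 41
# Six pipelines from the second example overshoot the daily limit by 5 calls:
print(remaining_limit([50_001, 100_000, 100, 1_000_000, 500_000, 500_000]))  # -5
```

If the estimate goes negative, split the extraction across days or narrow the date range, as the troubleshooting column suggests.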

What's next


Last updated 2025-12-15 UTC.