SAP SuccessFactors batch source
This page describes how to extract data from any entity within the SAP SuccessFactors Employee Central module into Google Cloud with Cloud Data Fusion.
Note: This documentation applies to Cloud Data Fusion 6.7 and later, and SAP SuccessFactors plugin 1.2.3 and later. For more information, see the overview of SAP on Google Cloud.
Before you begin
Set up the following systems and services that are used by the SAP SuccessFactors plugin:
- Configure the SAP SuccessFactors system. You must set up permissions in your SAP system.
- Deploy the SAP SuccessFactors plugin in Cloud Data Fusion. You must deploy a plugin version that's compatible with the Cloud Data Fusion version.
- If you upgrade the version of your Cloud Data Fusion instance or plugin, evaluate the impact of the changes to the pipeline's functional scope and performance.
- Establish connectivity between Cloud Data Fusion and SAP SuccessFactors.
  - Ensure that communication is enabled between the Cloud Data Fusion instance and the SAP SuccessFactors instance.
  - For private instances, set up VPC network peering.
Configure the plugin
- Go to the Cloud Data Fusion web interface and click Studio.
- Check that Data Pipeline - Batch is selected (not Realtime).
- In the Source menu, click SuccessFactors. The SAP SuccessFactors node appears in your pipeline.
- To configure the source, go to the SAP SuccessFactors node and click Properties.
- Enter the following properties. For a complete list, see Properties.
  - Enter a Label for the SAP SuccessFactors node, for example, SAP SuccessFactors tables.
  - Enter the connection details. You can set up a new, one-time connection, or an existing, reusable connection.
One-time connection
To add a one-time connection to SAP, follow these steps:
- Keep Use connection turned off.
- In the Connection section, enter the following information from the SAP account in these fields:
  - Provide the SAP credentials.
  - In the SAP SuccessFactors Base URL field, enter your SAP SuccessFactors account base URL.
  - In the Reference name field, enter a name for the connection that identifies this source for lineage.
  - In the Entity Name field, enter the name of the entity you're extracting, for example, people.
  - To generate a schema based on the metadata from SAP that maps SAP data types to corresponding Cloud Data Fusion data types, click Get schema. For more information, see Data type mappings.
  - In the Proxy URL field, enter the Proxy URL, including the protocol, address, and port.
- Optional: To optimize the ingestion load from SAP, enter the following information:
  - To extract records based on selection conditions, click Filter options and Select fields.
  - In the Expand fields field, enter a list of navigation fields to be expanded in the extracted output data, for example, customManager.
  - In Additional query parameters, enter parameters to add to the URL, for example, fromDate=2023-01-01&toDate=2023-01-31.
  - In the Associated entity name field, enter the name of the entity to be extracted, for example, EmpCompensationCalculated.
  - In the Pagination type field, enter a type, for example, Server-side pagination.
Reusable connection
To reuse an existing connection, follow these steps:
- Turn on Use connection.
- Click Browse connections.
- Click the connection name.
Note: For more information about adding, importing, and editing the connections that appear when you browse connections, see Manage connections.
If a connection doesn't exist, create a reusable connection by following these steps:
- Click Add connection > SAP SuccessFactors.
- On the Create a SAP SuccessFactors connection page that opens, enter a connection name and description.
- Provide the SAP credentials. You can ask the SAP administrator for the SAP logon username and password values.
- In the Proxy URL field, enter the Proxy URL, including the protocol, address, and port.
- Click Create.
- Enter a Label for the SAP SuccessFactors node, for example, SAP SuccessFactors tables.
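The fields in the preceding steps correspond to plugin properties in the exported pipeline JSON. The following is a minimal, illustrative sketch of the source stage properties for a one-time connection, using only property keys that appear in the example pipeline at the end of this page; the values shown are placeholders, and optional fields such as filters use additional keys that can vary by plugin version.

```json
{
  "useConnection": "false",
  "username": "sfadmin@cymbalgroup",
  "password": "${password}",
  "baseURL": "${baseUrl}",
  "referenceName": "successfactors_people",
  "entityName": "people",
  "paginationType": "serverSide"
}
```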
Properties
| Property | Macro enabled | Required property | Description |
|---|---|---|---|
| Label | No | Yes | The name of the node in your data pipeline. |
| Use connection | No | No | Use a reusable connection. If a connection is used, you don't need to provide the credentials. For more information, see Manage connections. |
| Name | No | Yes | The name of the reusable connection. |
| Reference Name | No | Yes | Uniquely identifies the source for lineage and annotates the metadata. |
| SAP SuccessFactors Base URL | Yes | Yes | The base URL of the SAP SuccessFactors API. |
| Entity Name | Yes | Yes | The name of the entity to be extracted. Doesn't support entities that have properties with the Binary data type or large volumes of data. For example, UserBadges and BadgeTemplates aren't supported. |
| SAP SuccessFactors Username | Yes | Yes | The user ID for authentication, in the form USER_ID@COMPANY_ID. For example, sfadmin@cymbalgroup. |
| SAP SuccessFactors Password | Yes | Yes | The SAP SuccessFactors password for user authentication. |
| Filter Options | Yes | No | The filter condition that restricts the output data volume, for example, Price gt 200. See the supported filter options. |
| Select Fields | Yes | No | Fields to preserve in the extracted data, for example, Category, Price, Name, Address. If the field is left blank, all non-navigation fields are preserved in the extracted data. All fields must be comma (,) separated. |
| Expand Fields | Yes | No | List of navigation fields to be expanded in the extracted output data, for example, customManager. If an entity has hierarchical records, the source outputs a record for each row in the entity it reads, with each record containing an extra field that holds the value from the navigational property specified in Expand Fields. |
| Associated Entity Name | Yes | No | Name of the associated entity that is being extracted, for example, EmpCompensationCalculated. |
| Pagination Type | Yes | Yes | The type of pagination to be used. Server-side pagination uses snapshot-based pagination. If snapshot-based pagination is attempted on an entity that doesn't support the feature, the server automatically forces client-offset pagination on the query. Examples of entities that only support server-side pagination are BadgeTemplates, UserBadges, and EPCustomBackgroundPortlet. No records are transferred if client-side pagination is chosen for these entities, because it relies on the Count API, which returns -1 as the response. The default is Server-side pagination. |
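Properties marked as macro enabled can be set to macros, such as ${username}, and resolved when the pipeline runs. If you use the macro names from the example pipeline at the end of this page, the runtime arguments you supply might look like the following sketch; the argument names match whatever macro names you chose, and the values shown are illustrative placeholders for your own SAP SuccessFactors account and proxy.

```json
{
  "username": "sfadmin@cymbalgroup",
  "password": "YOUR_PASSWORD",
  "baseUrl": "https://YOUR_SUCCESSFACTORS_API_HOST",
  "EmpCompensation": "EmpCompensation",
  "ProxyUrl": "https://proxy.example.com:8080",
  "Proxyusername": "PROXY_USERNAME",
  "Proxypassword": "PROXY_PASSWORD"
}
```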
Supported filter options
The following operators are supported:
| Operator | Description | Example |
|---|---|---|
| Logical Operators | | |
| Eq | Equal | /EmpGlobalAssignment?$filter=assignmentClass eq 'GA' |
| Ne | Not equal | /RecurringDeductionItem?$filter=amount ne 18 |
| Gt | Greater than | /RecurringDeductionItem?$filter=amount gt 4 |
| Ge | Greater than or equal | /RecurringDeductionItem?$filter=amount ge 18 |
| Lt | Less than | /RecurringDeductionItem?$filter=amount lt 18 |
| Le | Less than or equal | /RecurringDeductionItem?$filter=amount le 20 |
| And | Logical and | /RecurringDeductionItem?$filter=amount le 20 and amount gt 4 |
| Or | Logical or | /RecurringDeductionItem?$filter=amount le 20 or amount gt 4 |
| Not | Logical negation | /RecurringDeductionItem?$filter=not endswith(payComponentType, 'SUPSPEE_US') |
| Arithmetic Operators | | |
| Add | Addition | /RecurringDeductionItem?$filter=amount add 5 gt 18 |
| Sub | Subtraction | /RecurringDeductionItem?$filter=amount sub 5 gt 18 |
| Mul | Multiplication | /RecurringDeductionItem?$filter=amount mul 2 gt 18 |
| Div | Division | /RecurringDeductionItem?$filter=amount div 2 gt 18 |
| Mod | Modulo | /RecurringDeductionItem?$filter=amount mod 2 eq 0 |
| Grouping Operators | | |
| ( ) | Precedence grouping | /RecurringDeductionItem?$filter=(amount sub 5) gt 8 |
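You can combine these operators in a single filter expression. For example, a value entered in the Filter Options field might look like the following sketch, which reuses the amount and payComponentType fields from the preceding examples for illustration only.

```
amount ge 18 and (amount sub 5) gt 8 and not endswith(payComponentType, 'SUPSPEE_US')
```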
Data type mappings
The following table lists SAP SuccessFactors data types with their corresponding Cloud Data Fusion data types.
| SuccessFactors Data Type | Cloud Data Fusion Schema Data Type |
|---|---|
| Binary | Bytes |
| Boolean | Boolean |
| Byte | Bytes |
| DateTime | DateTime |
| DateTimeOffset | Timestamp_Micros |
| Decimal | Decimal |
| Double | Double |
| Float | Float |
| Int16 | Integer |
| Int32 | Integer |
| Int64 | Long |
| SByte | Integer |
| String | String |
| Time | Time_Micros |
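When you click Get schema, the plugin applies these mappings to the entity's metadata to build the output schema. The following is a minimal, illustrative sketch of what such a schema can look like for a few hypothetical fields, written in the Avro-style JSON that Cloud Data Fusion schemas are based on; the actual fields, types, and nullability depend entirely on the entity you extract.

```json
{
  "type": "record",
  "name": "etlSchemaBody",
  "fields": [
    { "name": "externalCode", "type": [ "string", "null" ] },
    { "name": "amount", "type": [ "double", "null" ] },
    { "name": "createdDate", "type": [ { "type": "long", "logicalType": "timestamp-micros" }, "null" ] }
  ]
}
```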
Use cases
The following example use case shows the data for a single employee in EmployeePayrollRunResults:
| Example property | Example value |
|---|---|
| externalCode | SAP_EC_PAYROLL_1000_0101201501312015_456_416 |
| Person ID | 456 |
| User | user-1 |
| Employment ID | 416 |
| Payroll Provider ID | SAP_EC_PAYROLL |
| Start of Effective Payment Period | 01/01/2015 |
| End of Effective Payment Period | 01/31/2015 |
| Company ID | BestRun Germany (1000) |
| Payout | 01/28/2015 |
| Currency | EUR (EUR) |
| Payroll Run Type | Regular (REGULAR) |
| System ID | X0B |
The example shows the results for an employee in EmployeePayrollRunResults:
| EmployeePayrollRunResults_externalCode | EmployeePayrollRunResults_mdfSystemEffectiveStartDate | amount | createdBy | createdDate |
|---|---|---|---|---|
| SAP_EC_PAYROLL_2800_0101201901312019_305_265 | 1/31/2019 0:00:00 | 70923.9 | sfadmin | 12/10/2019 15:32:20 |
| SAP_EC_PAYROLL_2800_0101201901312019_310_270 | 1/31/2019 0:00:00 | 64500 | sfadmin | 12/10/2019 15:32:20 |
| SAP_EC_PAYROLL_2800_0201201902282019_305_265 | 2/28/2019 0:00:00 | 70923.9 | sfadmin | 12/10/2019 15:32:20 |
| SAP_EC_PAYROLL_2800_0201201902282019_310_270 | 2/28/2019 0:00:00 | 64500 | sfadmin | 12/10/2019 15:32:20 |
| SAP_EC_PAYROLL_2800_0301201903312019_305_265 | 3/31/2019 0:00:00 | 70923.9 | sfadmin | 12/10/2019 15:32:20 |
Example pipeline
See sample configurations in the following JSON file:
{"artifact":{"name":"data-pipeline-1","version":"DATA_FUSION_VERSION","scope":"SYSTEM"},"description":"","name":"Demo_SuccessFactors_BatchSource","config":{"resources":{"memoryMB":2048,"virtualCores":1},"driverResources":{"memoryMB":2048,"virtualCores":1},"connections":[{"from":"SAP SuccessFactors","to":"BigQuery"}],"comments":[],"postActions":[],"properties":{},"processTimingEnabled":true,"stageLoggingEnabled":false,"stages":[{"name":"SAP SuccessFactors","plugin":{"name":"SuccessFactors","type":"batchsource","label":"SAP SuccessFactors","artifact":{"name":"successfactors-plugins","version":"PLUGIN_VERSION","scope":"USER"},"properties":{"useConnection":"false","username":"${username}","password":"${password}","baseURL":"${baseUrl}","referenceName":"test","entityName":"${EmpCompensation}","proxyUrl":"${ProxyUrl}","paginationType":"serverSide","initialRetryDuration":"2","maxRetryDuration":"300","maxRetryCount":"3","retryMultiplier":"2","proxyUsername":"${Proxyusername}","proxyPassword":"${Proxypassword}"}},"outputSchema":[{"name":"etlSchemaBody","schema":""}],"id":"SAP-SuccessFactors"},{"name":"BigQuery","plugin":{"name":"BigQueryTable","type":"batchsink","label":"BigQuery","artifact":{"name":"google-cloud","version":"BIGQUERY_PLUGIN_VERSION","scope":"SYSTEM"},"properties":{"useConnection":"false","project":"auto-detect","serviceAccountType":"filePath","serviceFilePath":"auto-detect","referenceName":"Reff","dataset":"SF_Aug","table":"testdata_proxy","operation":"insert","truncateTable":"true","allowSchemaRelaxation":"true","location":"US","createPartitionedTable":"false","partitioningType":"TIME","partitionFilterRequired":"false"}},"outputSchema":[{"name":"etlSchemaBody","schema":""}],"inputSchema":[{"name":"SAP SuccessFactors","schema":""}],"id":"BigQuery"}],"schedule":"0 1 */1 * *","engine":"spark","numOfRecordsPreview":100,"rangeRecordsPreview":{"min":1,"max":"5000"},"description":"Data Pipeline Application","maxConcurrentRuns":1,"pushdownEnabled":false,"transformationPushdown":{}}}
What's next
- Learn more about SAP on Google Cloud.