SAP SuccessFactors batch source

This page describes how to extract data from any entity within the SAP SuccessFactors Employee Central module into Google Cloud with Cloud Data Fusion.

Note: This documentation applies to Cloud Data Fusion 6.7 and later, and SAP SuccessFactors plugin 1.2.3 and later.

For more information, see the overview of SAP on Google Cloud.

Before you begin

Set up the following systems and services that are used by the SAP SuccessFactors plugin:

  1. Configure the SAP SuccessFactors system. You must set up permissions in your SAP system.
  2. Deploy the SAP SuccessFactors plugin in Cloud Data Fusion. You must deploy a plugin version that's compatible with the Cloud Data Fusion version.
    • If you upgrade the version of your Cloud Data Fusion instance or plugin, evaluate the impact of the changes on the pipeline's functional scope and performance.
  3. Establish connectivity between Cloud Data Fusion and SAP SuccessFactors.
    • Ensure that communication is enabled between the Cloud Data Fusion instance and the SAP SuccessFactors instance.
    • For private instances, set up VPC network peering.

Configure the plugin

  1. Go to the Cloud Data Fusion web interface and click Studio.
  2. Check that Data Pipeline - Batch is selected (not Realtime).
  3. In the Source menu, click SuccessFactors. The SAP SuccessFactors node appears in your pipeline.
  4. To configure the source, go to the SAP SuccessFactors node and click Properties.
  5. Enter the following properties. For a complete list, see Properties.

    1. Enter a Label for the SAP SuccessFactors node, for example, SAP SuccessFactors tables.
    2. Enter the connection details. You can set up a new, one-time connection, or use an existing, reusable connection.

      One-time connection

      To add a one-time connection to SAP, follow these steps:

      1. Keep Use connection turned off.
      2. In the Connection section, enter the following information from the SAP account in these fields:

        1. Provide the SAP credentials.
        2. In the SAP SuccessFactors Base URL field, enter your SAP SuccessFactors account base URL.
        3. In the Reference name field, enter a name for the connection that identifies this source for lineage.
        4. In the Entity Name field, enter the name of the entity you're extracting, for example, people.
        5. To generate a schema based on the metadata from SAP that maps SAP data types to corresponding Cloud Data Fusion data types, click Get schema. For more information, see Data type mappings.
        6. In the Proxy URL field, enter the proxy URL, including the protocol, address, and port.
        7. Optional: To optimize the ingestion load from SAP, enter the following information:

          1. To extract records based on selection conditions, click Filter options and Select fields.
          2. In the Expand fields field, enter a list of navigation fields to be expanded in the extracted output data. For example, customManager.
          3. In Additional query parameters, enter parameters to add to the URL, for example, fromDate=2023-01-01&toDate=2023-01-31.
          4. In the Associated entity name field, enter the name of the entity to be extracted, for example, EmpCompensationCalculated.
          5. In the Pagination type field, enter a type, for example, Server-side pagination.
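The properties above are ultimately combined into a single OData query against the SuccessFactors API. The following sketch shows how such a URL can be assembled; the base URL, entity name, and field names are hypothetical, and the plugin's actual request construction may differ:

```python
from urllib.parse import urlencode

def build_odata_url(base_url, entity, filter_expr=None, select=None,
                    expand=None, extra_params=None):
    """Assemble an OData query URL from plugin-style properties:
    Filter options, Select fields, Expand fields, and
    Additional query parameters."""
    params = {}
    if filter_expr:
        params["$filter"] = filter_expr
    if select:
        params["$select"] = ",".join(select)
    if expand:
        params["$expand"] = ",".join(expand)
    if extra_params:
        params.update(extra_params)
    query = urlencode(params)  # percent-encodes "$", spaces, and commas
    return f"{base_url.rstrip('/')}/{entity}" + (f"?{query}" if query else "")

# Hypothetical values, for illustration only.
url = build_odata_url(
    "https://api.example.successfactors.com/odata/v2",
    "EmpCompensation",
    filter_expr="amount gt 4",
    select=["userId", "amount"],
    expand=["customManager"],
    extra_params={"fromDate": "2023-01-01", "toDate": "2023-01-31"},
)
```

Note that `urlencode` percent-encodes the `$` prefix of the OData system query options (`$filter` becomes `%24filter`), which OData servers accept.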

      Reusable connection

      To reuse an existing connection, follow these steps:

      1. Turn on Use connection.
      2. Click Browse connections.
      3. Click the connection name.

        Note: For more information about adding, importing, and editing the connections that appear when you browse connections, see Manage connections.

      If a connection doesn't exist, create a reusable connection by following these steps:

      1. Click Add connection > SAP SuccessFactors.
      2. On the Create a SAP SuccessFactors connection page that opens, enter a connection name and description.
      3. Provide the SAP credentials. You can ask the SAP administrator for the SAP logon username and password values.
      4. In the Proxy URL field, enter the proxy URL, including the protocol, address, and port.
      5. Click Create.
      5. ClickCreate.

Properties

  • Label (macro enabled: No; required: Yes): The name of the node in your data pipeline.
  • Use connection (macro enabled: No; required: No): Whether to use a reusable connection. If a connection is used, you don't need to provide the credentials. For more information, see Manage connections.
  • Name (macro enabled: No; required: Yes): The name of the reusable connection.
  • Reference Name (macro enabled: No; required: Yes): Uniquely identifies the source for lineage and annotates the metadata.
  • SAP SuccessFactors Base URL (macro enabled: Yes; required: Yes): The base URL of the SuccessFactors API.
  • Entity Name (macro enabled: Yes; required: Yes): The name of the entity to be extracted. Doesn't support entities that have properties with the Binary data type or large volumes of data. For example, UserBadges and BadgeTemplates aren't supported.
  • SAP SuccessFactors Username (macro enabled: Yes; required: Yes): The user ID for authentication, in the form USER_ID@COMPANY_ID. For example, sfadmin@cymbalgroup.
  • SAP SuccessFactors Password (macro enabled: Yes; required: Yes): The SAP SuccessFactors password for user authentication.
  • Filter Options (macro enabled: Yes; required: No): The filter condition that restricts the output data volume, for example, Price gt 200. See the supported filter options.
  • Select Fields (macro enabled: Yes; required: No): Fields to be preserved in the extracted data, for example, Category,Price,Name,Address. If the field is left blank, all non-navigation fields are preserved in the extracted data. All fields must be comma (,) separated.
  • Expand Fields (macro enabled: Yes; required: No): List of navigation fields to be expanded in the extracted output data, for example, customManager. If an entity has hierarchical records, the source outputs a record for each row in the entity it reads, with each record containing an extra field that holds the value from the navigational property specified in Expand Fields.
  • Associated Entity Name (macro enabled: Yes; required: No): Name of the associated entity that is being extracted, for example, EmpCompensationCalculated.
  • Pagination Type (macro enabled: Yes; required: Yes): The type of pagination to be used. Server-side pagination uses snapshot-based pagination. If snapshot-based pagination is attempted on an entity that doesn't support the feature, the server automatically forces client-offset pagination on the query. Examples of entities that only support server-side pagination are BadgeTemplates, UserBadges, and EPCustomBackgroundPortlet. No records are transferred if client-side pagination is chosen for these entities, because it relies on the Count API, which returns -1 as the response. Default is Server-side Pagination.
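Client-offset pagination, the fallback described above, walks an entity in fixed-size pages using offset/limit pairs (in OData terms, $skip and $top). The following sketch shows only the offset arithmetic; the total record count and page size are made-up values, and the plugin's internal batching may differ:

```python
def client_offset_pages(total_records, page_size):
    """Yield ($skip, $top) pairs that together cover all records
    of an entity under client-offset pagination."""
    skip = 0
    while skip < total_records:
        yield skip, min(page_size, total_records - skip)
        skip += page_size

# Hypothetical entity with 2,500 records, read 1,000 at a time.
pages = list(client_offset_pages(total_records=2500, page_size=1000))
# Each pair would be sent as query parameters, e.g. ...?$skip=2000&$top=500
```

This also illustrates why no records are transferred when client-side pagination is chosen for a server-side-only entity: the Count API returns -1 for those entities, so the loop above never produces a page.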

Supported filter options

The following operators are supported:

Logical operators

  • Eq (equal): /EmpGlobalAssignment?$filter=assignmentClass eq 'GA'
  • Ne (not equal): /RecurringDeductionItem?$filter=amount ne 18
  • Gt (greater than): /RecurringDeductionItem?$filter=amount gt 4
  • Ge (greater than or equal): /RecurringDeductionItem?$filter=amount ge 18
  • Lt (less than): /RecurringDeductionItem?$filter=amount lt 18
  • Le (less than or equal): /RecurringDeductionItem?$filter=amount le 20
  • And (logical and): /RecurringDeductionItem?$filter=amount le 20 and amount gt 4
  • Or (logical or): /RecurringDeductionItem?$filter=amount le 20 or amount gt 4
  • Not (logical negation): /RecurringDeductionItem?$filter=not endswith(payComponentType, 'SUPSPEE_US')

Arithmetic operators

  • Add (addition): /RecurringDeductionItem?$filter=amount add 5 gt 18
  • Sub (subtraction): /RecurringDeductionItem?$filter=amount sub 5 gt 18
  • Mul (multiplication): /RecurringDeductionItem?$filter=amount mul 2 gt 18
  • Div (division): /RecurringDeductionItem?$filter=amount div 2 gt 18
  • Mod (modulo): /RecurringDeductionItem?$filter=amount mod 2 eq 0

Grouping operators

  • ( ) (precedence grouping): /RecurringDeductionItem?$filter=(amount sub 5) gt 8
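These operators can be combined and grouped in a single Filter Options expression. As a rough illustration (reusing the entity and field names from the examples above), percent-encoding such an expression for use in a request URL might look like this:

```python
from urllib.parse import quote

# One $filter combining grouping, arithmetic, and logical operators.
filter_expr = "(amount sub 5) gt 8 and not endswith(payComponentType, 'SUPSPEE_US')"

# quote() percent-encodes spaces, parentheses, quotes, and commas,
# leaving the expression safe to place in a URL query string.
encoded = quote(filter_expr)
path = f"/RecurringDeductionItem?$filter={encoded}"
```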

Data type mappings

The following list shows each SAP SuccessFactors data type with its corresponding Cloud Data Fusion schema data type.

  • Binary → Bytes
  • Boolean → Boolean
  • Byte → Bytes
  • DateTime → DateTime
  • DateTimeOffset → Timestamp_Micros
  • Decimal → Decimal
  • Double → Double
  • Float → Float
  • Int16 → Integer
  • Int32 → Integer
  • Int64 → Long
  • SByte → Integer
  • String → String
  • Time → Time_Micros
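The mapping can be expressed as a simple lookup table, for example when checking an exported schema offline. The dictionary below is a transcription of the mapping above, not plugin code:

```python
# SuccessFactors OData type -> Cloud Data Fusion schema type,
# transcribed from the data type mapping list above.
SF_TO_CDF = {
    "Binary": "Bytes",
    "Boolean": "Boolean",
    "Byte": "Bytes",
    "DateTime": "DateTime",
    "DateTimeOffset": "Timestamp_Micros",
    "Decimal": "Decimal",
    "Double": "Double",
    "Float": "Float",
    "Int16": "Integer",
    "Int32": "Integer",
    "Int64": "Long",
    "SByte": "Integer",
    "String": "String",
    "Time": "Time_Micros",
}

def cdf_type(sf_type: str) -> str:
    """Return the Cloud Data Fusion schema type for a SuccessFactors type.

    Raises KeyError for unmapped types (for example, unsupported ones).
    """
    return SF_TO_CDF[sf_type]
```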

Use cases

The following example shows the data for a single employee in the EmployeePayrollRunResults entity:

  • externalCode: SAP_EC_PAYROLL_1000_0101201501312015_456_416
  • Person ID: 456
  • User: user-1
  • Employment ID: 416
  • Payroll Provider ID: SAP_EC_PAYROLL
  • Start of Effective Payment Period: 01/01/2015
  • End of Effective Payment Period: 01/31/2015
  • Company ID: BestRun Germany (1000)
  • Payout: 01/28/2015
  • Currency: EUR (EUR)
  • Payroll Run Type: Regular (REGULAR)
  • System ID: X0B

The following example shows the extracted records for an employee in EmployeePayrollRunResults:

EmployeePayrollRunResults_externalCode | EmployeePayrollRunResults_mdfSystemEffectiveStartDate | amount | createdBy | createdDate
SAP_EC_PAYROLL_2800_0101201901312019_305_265 | 1/31/2019 0:00:00 | 70923.9 | sfadmin | 12/10/2019 15:32:20
SAP_EC_PAYROLL_2800_0101201901312019_310_270 | 1/31/2019 0:00:00 | 64500 | sfadmin | 12/10/2019 15:32:20
SAP_EC_PAYROLL_2800_0201201902282019_305_265 | 2/28/2019 0:00:00 | 70923.9 | sfadmin | 12/10/2019 15:32:20
SAP_EC_PAYROLL_2800_0201201902282019_310_270 | 2/28/2019 0:00:00 | 64500 | sfadmin | 12/10/2019 15:32:20
SAP_EC_PAYROLL_2800_0301201903312019_305_265 | 3/31/2019 0:00:00 | 70923.9 | sfadmin | 12/10/2019 15:32:20

Example pipeline

See sample configurations in the following JSON file:

{
  "artifact": {
    "name": "data-pipeline-1",
    "version": "DATA_FUSION_VERSION",
    "scope": "SYSTEM"
  },
  "description": "",
  "name": "Demo_SuccessFactors_BatchSource",
  "config": {
    "resources": { "memoryMB": 2048, "virtualCores": 1 },
    "driverResources": { "memoryMB": 2048, "virtualCores": 1 },
    "connections": [
      { "from": "SAP SuccessFactors", "to": "BigQuery" }
    ],
    "comments": [],
    "postActions": [],
    "properties": {},
    "processTimingEnabled": true,
    "stageLoggingEnabled": false,
    "stages": [
      {
        "name": "SAP SuccessFactors",
        "plugin": {
          "name": "SuccessFactors",
          "type": "batchsource",
          "label": "SAP SuccessFactors",
          "artifact": {
            "name": "successfactors-plugins",
            "version": "PLUGIN_VERSION",
            "scope": "USER"
          },
          "properties": {
            "useConnection": "false",
            "username": "${username}",
            "password": "${password}",
            "baseURL": "${baseUrl}",
            "referenceName": "test",
            "entityName": "${EmpCompensation}",
            "proxyUrl": "${ProxyUrl}",
            "paginationType": "serverSide",
            "initialRetryDuration": "2",
            "maxRetryDuration": "300",
            "maxRetryCount": "3",
            "retryMultiplier": "2",
            "proxyUsername": "${Proxyusername}",
            "proxyPassword": "${Proxypassword}"
          }
        },
        "outputSchema": [{ "name": "etlSchemaBody", "schema": "" }],
        "id": "SAP-SuccessFactors"
      },
      {
        "name": "BigQuery",
        "plugin": {
          "name": "BigQueryTable",
          "type": "batchsink",
          "label": "BigQuery",
          "artifact": {
            "name": "google-cloud",
            "version": "BIGQUERY_PLUGIN_VERSION",
            "scope": "SYSTEM"
          },
          "properties": {
            "useConnection": "false",
            "project": "auto-detect",
            "serviceAccountType": "filePath",
            "serviceFilePath": "auto-detect",
            "referenceName": "Reff",
            "dataset": "SF_Aug",
            "table": "testdata_proxy",
            "operation": "insert",
            "truncateTable": "true",
            "allowSchemaRelaxation": "true",
            "location": "US",
            "createPartitionedTable": "false",
            "partitioningType": "TIME",
            "partitionFilterRequired": "false"
          }
        },
        "outputSchema": [{ "name": "etlSchemaBody", "schema": "" }],
        "inputSchema": [{ "name": "SAP SuccessFactors", "schema": "" }],
        "id": "BigQuery"
      }
    ],
    "schedule": "0 1 */1 * *",
    "engine": "spark",
    "numOfRecordsPreview": 100,
    "rangeRecordsPreview": { "min": 1, "max": "5000" },
    "description": "Data Pipeline Application",
    "maxConcurrentRuns": 1,
    "pushdownEnabled": false,
    "transformationPushdown": {}
  }
}


Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.

Last updated 2025-12-15 UTC.