Top 50 HCM Data Loader Interview Questions

In this article, we will look at the most frequently asked HCM Data Loader interview questions.

If you haven’t gone through the HCM Data Loader articles, please do so first; they contain answers to most of the questions below.

Thank you to Phanindra Sajja, a reader of this website, for coming up with the answers to these questions. I have reviewed them and made changes wherever necessary.

The list of questions and answers follows:

1) Where do you configure the HCM Data Loader properties?
By using the “Configure HCM Data Loader” task.

2) What Role is required to access HCM Data Loader functionality?
Human Resource Integration Specialist

3) Where can we find the list of all business objects supported by HDL?
Under View Business Objects in Data Exchange.

4) How to generate the blank template for business objects?
Using Initiate Data Loader, we can generate the blank template for the required business object. We need to click the “Refresh Object” button after selecting the required business object.

5) Why do we generate the blank template for business objects?
This is useful when the client needs to import data from an external system on a periodic basis, for example organization data or a user-defined table maintained in the external system.
Example: If you define a DFF on the Job page and use the same information in a third-party system, you will have to generate the template after the DFFs are configured and deployed in order to get the headers for the DFF attributes.

6) Can we reuse the HDL template used for one client to another client? What are the advantages and disadvantages?
Yes, we can (conditions apply).
Advantage: No need to build or generate the HDL files from scratch.
Disadvantage: Additional attributes may vary from client to client, and the non-mandatory attributes might also vary depending on what the client is using.
Example: The required location attributes vary from country to country. The DFF and EFF attributes also vary from client to client.

7) What are the types of keys supported by HDL?
Oracle Fusion Globally Unique Identifier (GUID)
Source key (Source system owner, Source System ID)
Oracle Fusion surrogate ID
User key

8) While creating new rows, can we use GUID and Surrogate ID on the HDL file?
No, we cannot use GUIDs and surrogate keys while creating new records, because those keys are generated only after an object is created in Fusion. We can only use user keys or source keys when creating new records.
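For illustration, a new record is identified by its source key (and user-key attributes), never by a GUID or surrogate ID. The business object and attribute names below are placeholders, not a verified template; generate the real template from the Data Exchange work area:

```
COMMENT New records are keyed by SourceSystemOwner/SourceSystemId and user keys.
METADATA|Location|SourceSystemOwner|SourceSystemId|LocationCode|SetCode|EffectiveStartDate|LocationName
MERGE|Location|EXT_HR|LOC_001|HQ_LOC|COMMON|2020/01/01|Head Office
```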

9) What is the key resolution sequence if we specify more than one type of key on the HDL file?
The key resolution order is:
1) GUID – Oracle Fusion Globally Unique Identifier
2) Oracle Fusion Surrogate ID
3) Source Keys
4) User Keys

10) How do you prepare the HDL file if the client provides data in an Excel sheet?
We can use Excel to HDL DAT File Converter.
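If no converter utility is available, the conversion can also be scripted. A minimal sketch, assuming the data has first been exported from Excel to CSV; the business object and attribute names are placeholders, and the real column headers should come from a template generated in the Data Exchange work area:

```python
import csv

def csv_to_dat(csv_path, dat_path, object_name, attributes):
    """Convert a CSV export into a pipe-delimited HDL DAT file.

    Writes one METADATA header line, then one MERGE line per CSV row.
    Attribute names are illustrative; use the generated HDL template
    for the real attribute list.
    """
    with open(csv_path, newline="") as src, open(dat_path, "w") as dst:
        dst.write("METADATA|" + object_name + "|" + "|".join(attributes) + "\n")
        for row in csv.DictReader(src):
            values = [row.get(attr, "") for attr in attributes]
            dst.write("MERGE|" + object_name + "|" + "|".join(values) + "\n")
```

The script deliberately keeps the CSV column names identical to the HDL attribute names, so the mapping is a simple lookup per row.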

11) What is the importance of Max Threads for Import and Max Threads for Load? Where are they set for HDL?

Using more threads allows more objects to be processed in parallel, so the data loads with better performance. When loading historical information for employees, tune the thread counts and batch size accordingly. These parameters are set using the “Configure HCM Data Loader” task.

12) How do you debug/troubleshoot the errors in the HDL process?
When it comes to troubleshooting HDL errors, there are two options:
1) Check the data load errors/warnings from the Import and Load Data screen.
2) Run the diagnostics test for HCM Data Loader (HDL Error Diagnostic Report).

13) Can we rollback the HDL file once it is loaded?
Yes. Oracle has provided this feature for the Element Entry business object only, for now. We can initiate the ROLLBACK process once the file is loaded. ROLLBACK will only delete the new element entries that were created as part of the load; it will not roll back the updates and deletes that were processed in that load. So be careful about the meaning of ROLLBACK.

14) Will the HDL staging tables get auto-purged? If yes, at what frequency? If no, how do we delete the data?
Staging-table data is kept for the retention period configured in the “Configure HCM Data Loader” task. To purge it manually, schedule the deletion process from the following navigation:
My Client Groups >> Data Exchange >> Delete Stage Table Data >> Actions >> Schedule Deletion

15) Is there any seeded HDL error report that can be used to generate HDL Errors?
HCM Data Loader Error Analysis Report

16) Can we extract an HDL file in DAT format containing only the errored HDL rows?
Yes, we can, by using the ‘Enable File Generation failed file lines’ option in Configure HCM Data Loader. This feature was introduced in the 20B release.

17) Where are the source keys, surrogate keys and GUID stored in the backend tables?
The source keys, surrogate IDs, and GUIDs are stored in the integration key map table, HRC_INTEGRATION_KEY_MAP.

18) Can we update the source keys once they are set?
Yes. We can do it by using SourceKey.dat

19) Can we load the Fast Formulas using HDL? If yes where do you enter the Fast Formula code?
Create the formula with an effective start date on or before the start dates of other objects that refer to your formula. Supply the fast formula text in a separate text file and then pass the text file name in the Comments attribute. Place the formula text file in the ClobFiles folder within the same compressed file as the FastFormula.dat file that references it. Don't supply the text directly in the FastFormula.dat file.

This example loads a fast formula:
MERGE|FastFormula|VISION|MGR_SCHED_HRS|2000/01/01|MGR_RANGE_SCHD_HRS|Range of Scheduled Hours|Manager Range of Scheduled Hours|ManagerRangeScheduledHrs.txt|

20) Can we replace the effective end date of the row using HDL after it is created?
Yes, we can change it. We need to add the ReplaceLastEffectiveEndDate attribute. However, not all objects support this feature; check the business object documentation and Oracle Support.

21) Can we add new rows in the history without impacting the current row for a business object?
Yes, we can, but we need to make sure the effective dates do not conflict with the current existing record.

22) What is the use of SET commands in HDL?
SET commands are used to override the default behavior of HCM Data Loader and can be used effectively to achieve specific requirements.
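For illustration, SET instructions go at the top of the file, before the first METADATA line. The sketch below uses PURGE_FUTURE_CHANGES (which controls whether future-dated rows are removed on update); verify the exact instruction names against the HDL documentation for your release, and treat the object and attribute names as placeholders:

```
SET PURGE_FUTURE_CHANGES N
METADATA|Job|JobCode|SetCode|EffectiveStartDate|JobName
MERGE|Job|DEV|COMMON|2020/01/01|Developer
```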

23) Can we use the delimiter as comma (,) instead of the pipe (|)?
Yes. To change the delimiter value, change the ‘File Delimiter’ parameter using the “Configure HCM Data Loader” task. You can also override this parameter with a SET instruction followed by the override value; place SET instructions before the first METADATA line in your business object file.
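For illustration, a file-level override might look like the sketch below. It assumes the SET DELIMITER instruction name, and the object and attribute names are placeholders; check the HDL documentation for your release:

```
SET DELIMITER ,
METADATA,Job,JobCode,SetCode,EffectiveStartDate,JobName
MERGE,Job,DEV,COMMON,2020/01/01,Developer
```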

24) How to invoke the post processes after the HDL load is complete?
By default, the following processes run automatically after we load the Worker.dat file:
Refresh Manager Hierarchy
Update Person Search Keywords


25) Can we create users using the Worker HDL?
Yes, we can mention the username details in the User object attributes in Worker.dat.

26) What are the minimum objects that need to be loaded in order for an employee to get loaded using the Worker business object?
Worker, Person Name, Work Relationship, Work Terms, Assignment

27) How to load NULL values using HDL?
Pass the #NULL value to the attribute. If you do not provide this value, the system may default the value based on seeded logic.
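For illustration, #NULL is supplied as the attribute value itself. The object and attribute names below are placeholders only:

```
METADATA|PersonPhone|PersonNumber|PhoneType|PhoneNumber|Extension
MERGE|PersonPhone|1001|W1|650-555-0100|#NULL
```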

28) Where do you define the Source System Owner to be used in HDL?
Define the source system owner in the HRC_SOURCE_SYSTEM_OWNER lookup:
Login to Fusion Applications > Navigator > Setup and Maintenance
Search for the task “Manage Common Lookups”, then search for the lookup type HRC_SOURCE_SYSTEM_OWNER.

29) What are the seeded Source System Owners available by default?
By default, we get two Source System Owners (FUSION and HRC_SQLLOADER). Whatever transactions we perform in the application will have the Source System Owner FUSION by default.
The default Source System ID is the surrogate ID.

30) If we add data in the application what would be the Source System Owner that will get stored in the application?
FUSION is the default Source System Owner, and the surrogate ID is the default Source System ID.

31) Can we upload multiple business objects in a single zip file using HDL?
Yes, we can load multiple business objects in a single zip file using HDL. HDL has built-in functionality to determine the sequence in which the business objects need to be processed, and it processes them accordingly.

32) If we are loading multiple dat files in single zip file can we mention the sequence of execution of business objects?
No, we cannot. HDL has built-in functionality to determine the sequence in which the business objects need to be processed, and it processes them accordingly.

33) Can we load data into the DFF, EFF and KFF fields? What would be the approach?
Using HCM Data Loader, you can load data for both descriptive flexfields and extensible flexfields.
When loading flexfield data, you must supply the flexfield code on the METADATA line.

Flexfield attribute names are those that you specify when configuring the flexfield. Both descriptive and extensible flexfields have one or more contexts. When you include a flexfield attribute name on the METADATA line for an object, you must also identify the context.
Some business object components support multiple descriptive flexfields. You can include all descriptive flexfield attributes for every flexfield and configured context on a single METADATA line.
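For illustration, a METADATA line that loads a DFF typically carries a FLEX:&lt;flexfield code&gt; column (whose row value is the context) and context-qualified segment attributes. In the sketch below, the flexfield code (PER_JOBS_DFF), context, and segment attribute name (costCenter) are all placeholders; take the real names from a generated template:

```
METADATA|Job|JobCode|SetCode|EffectiveStartDate|FLEX:PER_JOBS_DFF|costCenter(PER_JOBS_DFF=Global Data Elements)
MERGE|Job|DEV|COMMON|2020/01/01|Global Data Elements|CC100
```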

34) Can we invoke the HDL process using SOAP services?
Yes. The HCM Data Loader SOAP web service is available at http://{Host}/hcmCommonDataLoader/HCMDataLoader?wsdl

35) Do we have any payroll flow to run the HDL process for automation purposes?
Load Data from File – for loading HDL files using an HDL transformation formula.
Initiate Data Loader – a standalone HDL load that runs based on the content ID.

36) How do we check how much time the HDL process took to load the data?
There is no easy way to do it. We need to create a SQL query that returns this information for the input content ID.

37) Can we only import data to HDL staging tables to check if there are any import errors without loading it?
When you automate data loading using the HCM Data Loader web service, you can set File Action to Import only or Import and load. If you select Import only, then you load the imported objects manually on the Import and Load Data page.

38) How to migrate the data from one POD to another using HDL?
We cannot migrate data as such using HDL alone. However, BI reports can be built for each business object to export data in DAT file format using eText templates, and the exported DAT files can then be imported into the target POD.

39) What is the importance of Transfer Group Size while submitting HDL process?
Transfer Group Size (chunk size) defines the chunks into which the employees/business objects are split when multi-threading is used to load the HDL file in parallel. This setting does not work on its own; it works along with Max Threads for Import and Max Threads for Load.

40) What is the difference between HDL and HSDL? If we load data using HSDL can we check those details on Import and Load data screen?
HDL is for bulk data loading, and we can use user keys and source system keys in HDL. HSDL (HCM Spreadsheet Data Loader) is for end users, offering an Excel-friendly format in which the user can enter details in the spreadsheet and upload the file. However, in the backend even HSDL uses the HDL engine, and the details can be found on the Import and Load Data screen.

41) Can we load encrypted files using HDL?
Yes. We can provide the encryption details while submitting the HDL file.

42) Can we stop processing of an HDL file if, say, 20% of its lines error out? If yes, how do we configure it?
Yes, we can, by setting the ‘Maximum Percentage of Load Errors’ value in “Configure HCM Data Loader”. We can also override this value when submitting the file for loading.

43) Can we send a notification to a user whenever a HDL process fails?
No. This functionality is not currently available by default. You need to develop a report and schedule it with bursting to notify the interested audience.

44) Can we load the translation information for supported business objects? In what sequence do we need to load the main business object and translation business object?
First we need to load the base business object with English-language descriptions, and then we can load the descriptions for other languages using the translation DAT files for supported business objects.

45) Can we automate the process of data load using HDL in an HCM Extract?
Yes. We can use an HCM Extract with the delivery option set to Inbound Interface and prepare the DAT file content using an eText template. Set up the business object DAT file name and zip file name in the delivery options, and finally add the Initiate Data Loader task to the HCM Extract payroll flow. Once the extract runs, it generates the data, converts it to HDL format, and invokes the HDL process.

46) What are protective data sets in HDL?
Protective data sets were recently released for HDL. Element Entry is the only object that supports ROLLBACK and hence is also a protective data set. Protective data sets have a different setting for purging staging tables. From 20B, we can override this setting and purge protective data sets even before they expire.

47) Can we update the User keys of the business objects after they are loaded?
Yes, we can update some user keys for business objects. We need to use source keys when updating user keys.

48) Is there any extract to get the Source Key Information for all business objects?
Yes. “Integration Object User Key Map Extract” is the extract that can be used to find out the information of source keys.

49) Can we upload HTML-formatted text using HDL for the business objects that support CLOB fields?
Yes, we can load HTML-formatted text if you want to see that formatting online. Copy the formatted text to a separate text file, place it in the ClobFiles folder, and provide the file name in the business object DAT file. Then zip the ClobFiles folder and the business object DAT file into a single zip file and upload it.
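The packaging step described above can be scripted. A minimal sketch using Python's standard zipfile module; the file and folder names are illustrative:

```python
import os
import zipfile

def build_hdl_zip(dat_file, clob_dir, zip_path):
    """Package a business object DAT file and its ClobFiles folder
    into a single zip suitable for HDL upload."""
    with zipfile.ZipFile(zip_path, "w") as zf:
        # The DAT file goes at the root of the archive
        zf.write(dat_file, arcname=os.path.basename(dat_file))
        # Each CLOB text file goes under the ClobFiles folder
        for name in os.listdir(clob_dir):
            zf.write(os.path.join(clob_dir, name),
                     arcname="ClobFiles/" + name)
```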

50) If we get input in a flat file can we convert it into HDL file using any automated method?
Yes. The HCM Data Loader transformation formula, introduced in 19C, can read a flat file, convert it to an HDL DAT file, and invoke the HDL process.