How HDL Transformation Formula works with ElementEntry Sample


In this article we will look at how the HDL Transformation Formula works, along with a sample use case to load the Element Entry object.

If you haven’t read the previous article on the basics of the Transformation Formula, please check it out.

This is a guest post by my friend Sainath Reddy, who writes Fusion-related content on his blog https://oraclecloudhcmtechfunc.blogspot.com/. He has written some great content on both the technical and functional sides. Have a look.

If the input data comes in flat files (text, CSV, DAT, etc.) and has to be loaded into the Fusion system, we use the HDL Transformation Fast Formula (HDL FF), which converts the flat file into HDL format and then loads it into Fusion.

Now let’s look at the building blocks of HDL Transformation Formula before jumping on to how we do it.

Building Blocks for HDL Transformation Formula:

The building blocks are:
  • Input Variables
  • Return Statements

Input Variables available in HDL FF:

OPERATION, FileName, FileDiscriminator, LINENO and LINEREPEATNO are the input variables available for this formula type.
Depending on the input file, we also need to add the extra input variables POSITION1, POSITION2, …, POSITIONn, one for each input column.

Example:

E509|GRADE_CHANGE|2019/12/08|G1
E512|GRADE_CHANGE|2020/11/10|G2

Input file layout
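To make the POSITION variables concrete, here is a minimal Python sketch (illustrative only, not Oracle Fast Formula syntax) of how one POSITIONn variable per delimited column is derived from a line of the sample file above:

```python
# Illustrative simulation: HDL exposes one POSITIONn input variable
# per delimited column of the current input line.
def to_positions(line, delimiter="|"):
    """Return a dict mapping POSITION1..POSITIONn to the line's columns."""
    cols = line.rstrip("\n").split(delimiter)
    return {f"POSITION{i}": col for i, col in enumerate(cols, start=1)}

positions = to_positions("E509|GRADE_CHANGE|2019/12/08|G1")
# POSITION1 -> 'E509', POSITION2 -> 'GRADE_CHANGE',
# POSITION3 -> '2019/12/08', POSITION4 -> 'G1'
```

Each POSITIONn then becomes an input variable that the transformation formula can map to a business object attribute.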

Now, we will look at how these variables are defined in FF and what’s the significance of them.

  1. OPERATION
  2. FileName
  3. FileDiscriminator
  4. LINENO
  5. LINEREPEATNO

1. Operation:

This is a seeded variable provided by Oracle; the PBL FF also used to have it. The valid values for OPERATION are:

FILETYPE, DELIMITER, READ, MAP, NUMBEROFBUSINESSOBJECTS, METADATALINEINFORMATION

  • FILETYPE –> OUTPUTVALUE –> DELIMITED

For the FILETYPE operation, the output value is DELIMITED.

  • DELIMITER –> OUTPUTVALUE –> |

For the DELIMITER operation, the output value is |.

If we are using a comma-separated flat file, use ‘,’.

Note: By default it is ‘|’.

  • READ –> OUTPUTVALUE –> NONE

For the READ operation, the output value is NONE.

  • MAP –> Business Object Attributes –> returns the business object attributes

For the MAP operation, the output values are the business object attributes.

  • NUMBEROFBUSINESSOBJECTS –> OUTPUTVALUE –> Ex: 1

For the NUMBEROFBUSINESSOBJECTS operation, the output value is the number of business objects, for example 1.

If we need to create two business objects from the flat file, set this to 2.

  • METADATALINEINFORMATION –> METADATAn –> Ex: METADATA1

For the METADATALINEINFORMATION operation, the output value is METADATAn.

If we need two metadata lines, we create METADATA1 and METADATA2.
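The single-value operations above can be summarised in a small Python sketch (illustrative only, not Fast Formula syntax); MAP and METADATALINEINFORMATION are omitted because they return attribute lists rather than a single OUTPUTVALUE:

```python
# Simulated dispatch on the OPERATION input variable. The engine invokes
# the formula once per operation; each fixed operation yields one value.
def outputvalue(operation):
    fixed = {
        "FILETYPE": "DELIMITED",          # flat file is delimited
        "DELIMITER": "|",                 # use ',' for comma-separated files
        "READ": "NONE",
        "NUMBEROFBUSINESSOBJECTS": "2",   # e.g. ElementEntry + ElementEntryValue
    }
    return fixed.get(operation, "NONE")   # anything else falls back to NONE
```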

2. FileName: The business object file name.

3. FileDiscriminator: The business object file discriminator.

4. LINENO: The number of the line currently being processed in the input file.

5. LINEREPEAT and LINEREPEATNO: Used when we want to process the same line multiple times.

Example:
FileName = ‘ElementEntry’
FileDiscriminator = ‘ElementEntryValue’
BusinessOperation = ‘MERGE’

LINEREPEAT allows a single line of input to be processed multiple times. If it is ‘Y’, the line is processed one more time; if it is ‘N’, processing of that line stops.

LINEREPEATNO indicates the repetition number. When a line in the file is being processed for the first time it is ‘1’; on the second pass over the same line it is ‘2’.

BusinessOperation: Indicates the HDL operation (MERGE, DELETE or COMMENT) to be performed.

Example:

IF LINEREPEATNO=1 THEN
(
BUSINESSOPERATION='MERGE'
BUSINESSOBJECT='ELEMENT ENTRY'
FILENAME = 'ELEMENTENTRY'
FILEDISCRIMINATOR ='ELEMENTENTRY'
LINEREPEAT = 'Y'
RETURN BUSINESSOPERATION,FILEDISCRIMINATOR,FILENAME,LINEREPEAT
)
ELSE
(
BUSINESSOPERATION='MERGE'
BUSINESSOBJECT='ELEMENT ENTRY VALUE'
FILENAME = 'ELEMENTENTRY'
FILEDISCRIMINATOR = 'ELEMENTENTRYVALUE'
LINEREPEAT = 'N'
RETURN BUSINESSOPERATION,FILEDISCRIMINATOR,FILENAME,LINEREPEAT
)

If the element entry details and the element entry value details are both stored in the same line, we need to process that line twice, so LINEREPEAT is ‘Y’ for Element Entry and ‘N’ for Element Entry Value. With ‘Y’ the same line is processed again; with ‘N’ processing moves on to the next line.
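The repeat mechanics can be simulated in Python (illustrative only, not Fast Formula): the loader keeps re-invoking the formula on the same input line, incrementing LINEREPEATNO, for as long as the formula returns LINEREPEAT = ‘Y’:

```python
# Simulated LINEREPEAT/LINEREPEATNO loop for one input line.
def process_line(line):
    records = []
    linerepeatno = 1
    while True:
        if linerepeatno == 1:
            # First pass: emit the ElementEntry record and request a repeat.
            records.append(("ElementEntry", line))
            linerepeat = "Y"
        else:
            # Second pass: emit the ElementEntryValue record and stop.
            records.append(("ElementEntryValue", line))
            linerepeat = "N"
        if linerepeat != "Y":
            break
        linerepeatno += 1   # loader re-feeds the same line with the next number
    return records
```

So each input line yields an ElementEntry record on the first pass and an ElementEntryValue record on the second.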

Return values for this Formula type:

RETURN BusinessOperation, FileName, FileDiscriminator, Attribute1, Attribute2, …, Attributen

Mapping for Source Keys:

Syntax for PersonId(SourceSystemId) :  “PersonId(SourceSystemId)”= POSITIONn

Ex: “PersonId(SourceSystemId)”= POSITION21

Return values for Source Keys example:

RETURN BusinessOperation,FileName,FileDiscriminator,Attribute1,…,”PersonId(SourceSystemId)”,…,Attribute25
“PersonId(SourceSystemId)” should be enclosed in double quotes.
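A small Python sketch (illustrative only; the attribute values and the POSITION21 mapping here are hypothetical) of how a quoted source-key attribute name ends up in the generated file:

```python
# The source-key attribute is returned under a quoted name; in the
# generated METADATA line the quotes are dropped but the full attribute
# expression PersonId(SourceSystemId) is kept.
attributes = {
    "AssignmentNumber": "E509",
    '"PersonId(SourceSystemId)"': "EMP_509",  # value would come from POSITION21
}
metadata_line = "METADATA|ElementEntry|" + "|".join(a.strip('"') for a in attributes)
merge_line = "MERGE|ElementEntry|" + "|".join(attributes.values())
```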

Roles and Privileges required to use HDL FF:

The privileges required are:
  • Submit Payroll Flow – PAY_SUBMIT_PAYROLL_FLOW_PRIV
  • Load Data from File – HRC_LOAD_HCM_DATA_PRIV

The Human Capital Management Integration Specialist role is required for checklists, Data Exchange and the Load Data from File flow pattern.

Enabling the File Encryption:

If the input file is encrypted, follow the process below to set up the encryption setting. Even if the input file is not encrypted, check this setting once to ensure it has the value NONE.

Before loading an encrypted file, ensure the parameter is set in the payroll process configuration.

My Client Groups > Quick actions > Payroll > Payroll Process Configuration

Manage Payroll Process Configuration task

Under Default Group, select the parameter Payroll Batch Loader Encryption Type. Valid values are PGPSIGNED, PGPUNSIGNED and PGPX509UNSIGNED. The default is no encryption.

Encryption setting under Default Group

Complete Process for the HDL Transformation Formula:

  1. Create the flat file and upload it to the content server, manually or by using web services.
  2. Create the HDL Transformation Formula of type HCM Data Loader.
  3. Run the Load Data from File payroll flow, manually or using web services.
Flow of HDL FF

Create Input File and upload to UCM:

Sample Input File:
E509|GRADE_CHANGE|2019/12/08|G1
E510|GRADE_CHANGE|2019/12/10|G2

Upload the file to the content server from Navigator > Tools > File Import and Export.

Tools > File Import and Export

Click Add, upload the input file, select the account hcm/dataloader/import, then Save and Close.

Upload input file

This generates a Content ID, which we will use while submitting the payroll flow. Keep a note of it.

Content ID generated for input file

Create a Fast Formula of type HCM Data Loader:

Navigate to Setup and Maintenance > Fast Formula > Create New Formula > select type HCM Data Loader.

Creating HDL Transformation FF

After implementing all of the functionality, the final FF looks like this:

/* HDL Transformation Formula to load Element Entry and Entry Value information from Input File */
INPUTS ARE OPERATION (text),
LINENO (number),
POSITION1 (text),
POSITION2 (text),
POSITION3 (text),
POSITION4 (text),
LINEREPEATNO (number)
DEFAULT FOR LINEREPEATNO IS 1
DEFAULT FOR LINENO IS 1
DEFAULT FOR POSITION1 IS 'NO DATA'
DEFAULT FOR POSITION2 IS 'NO DATA'
DEFAULT FOR POSITION3 IS '1900/01/01'
DEFAULT FOR POSITION4 IS 'NO DATA'
IF OPERATION='FILETYPE' THEN
OUTPUTVALUE='DELIMITED'
ELSE IF OPERATION='DELIMITER' THEN
OUTPUTVALUE='|'
ELSE IF OPERATION='READ' THEN
OUTPUTVALUE='NONE'
ELSE IF OPERATION = 'NUMBEROFBUSINESSOBJECTS' THEN
(
OUTPUTVALUE = '2'
RETURN OUTPUTVALUE
)
ELSE IF OPERATION = 'METADATALINEINFORMATION' THEN
(
METADATA1[1] = 'ElementEntry' /*FileName*/
METADATA1[2] = 'ElementEntry' /*FileDiscriminator*/
METADATA1[3] = 'EffectiveStartDate'
METADATA1[4] = 'MultipleEntryCount'
METADATA1[5] = 'AssignmentNumber'
METADATA1[6] = 'CreatorType'
METADATA1[7] = 'ElementName'
METADATA1[8] = 'LegislativeDataGroupName'
METADATA1[9] = 'EntryType'

METADATA2[1] = 'ElementEntry' /*FileName*/ 
METADATA2[2] = 'ElementEntryValue' /*FileDiscriminator*/
METADATA2[3] = 'LegislativeDataGroupName' 
METADATA2[4] = 'EffectiveStartDate' 
METADATA2[5] = 'EffectiveEndDate' 
METADATA2[6] = 'ElementName' 
METADATA2[7] = 'MultipleEntryCount' 
METADATA2[8] = 'AssignmentNumber' 
METADATA2[9] = 'InputValueName' 
METADATA2[10] = 'ScreenEntryValue' 
METADATA2[11] = 'EntryType' 
RETURN METADATA1,METADATA2 )

ELSE IF OPERATION = 'MAP' THEN
(
/*Batch Related Outputs*/
FileName = 'ElementEntry'
LegislativeDataGroupName = 'IN Legislative Data Group'
EffectiveStartDate = POSITION3 
EffectiveEndDate = '4712/12/31' 
AssignmentNumber = POSITION1 
InputValueName = 'test' 
MultipleEntryCount = 1 
EntryType = 'E' 
change = POSITION2 
BusinessOperation = 'MERGE'

/*METADATA|ElementEntry|EffectiveStartDate|MultipleEntryCount|AssignmentNumber|CreatorType|ElementName|LegislativeDataGroupName|EntryType
MERGE|ElementEntry|2019/12/08|1|E509|H|ABC Element Name|IN Legislative Data Group|E */

IF( LINEREPEATNO=1 ) THEN
(
IF(change='GRADE_CHANGE') THEN
(
FileDiscriminator = 'ElementEntry'
CreatorType = 'H'
ElementName = 'ABC Element Name'
LINEREPEAT = 'Y'
BusinessOperation = 'MERGE'
)
RETURN BusinessOperation,FileName,FileDiscriminator,EffectiveStartDate,MultipleEntryCount,AssignmentNumber,CreatorType,ElementName,LegislativeDataGroupName,EntryType,LINEREPEAT,LINEREPEATNO
)

/*METADATA|ElementEntryValue|LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|ElementName|MultipleEntryCount|AssignmentNumber|InputValueName|ScreenEntryValue|EntryType MERGE|ElementEntryValue|IN Legislative Data Group|2019/12/08|4712/12/31|ABC Element Name|1|E509|test|Grade Change|E*/

IF( LINEREPEATNO=2 ) THEN
(
IF(change='GRADE_CHANGE') THEN
(
FileDiscriminator = 'ElementEntryValue'
ElementName = 'ABC Element Name'
ScreenEntryValue = 'Grade Change'
LINEREPEAT = 'N'
BusinessOperation = 'MERGE'
)
RETURN BusinessOperation,FileDiscriminator,FileName,LegislativeDataGroupName,EffectiveStartDate,EffectiveEndDate,ElementName,MultipleEntryCount,AssignmentNumber,InputValueName,ScreenEntryValue,EntryType,LINEREPEAT,LINEREPEATNO
)
)

ELSE OUTPUTVALUE = 'NONE'
RETURN OUTPUTVALUE
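Putting it all together, this Python simulation (illustrative only, not Fast Formula) reproduces what the formula above generates for the sample input file: each line is processed twice, producing one ElementEntry and one ElementEntryValue record.

```python
# End-to-end simulation of the transformation: pipe-delimited input lines
# in, HDL-format METADATA and MERGE lines out.
LDG = "IN Legislative Data Group"
ELEMENT = "ABC Element Name"

def transform_file(lines):
    entries, values = [], []
    for line in lines:
        p1, p2, p3, p4 = line.strip().split("|")  # POSITION1..POSITION4
        if p2 == "GRADE_CHANGE":
            # Pass 1 (LINEREPEATNO = 1): the ElementEntry record.
            entries.append(
                f"MERGE|ElementEntry|{p3}|1|{p1}|H|{ELEMENT}|{LDG}|E")
            # Pass 2 (LINEREPEATNO = 2): the ElementEntryValue record.
            values.append(
                f"MERGE|ElementEntryValue|{LDG}|{p3}|4712/12/31|{ELEMENT}"
                f"|1|{p1}|test|Grade Change|E")
    header1 = ("METADATA|ElementEntry|EffectiveStartDate|MultipleEntryCount"
               "|AssignmentNumber|CreatorType|ElementName"
               "|LegislativeDataGroupName|EntryType")
    header2 = ("METADATA|ElementEntryValue|LegislativeDataGroupName"
               "|EffectiveStartDate|EffectiveEndDate|ElementName"
               "|MultipleEntryCount|AssignmentNumber|InputValueName"
               "|ScreenEntryValue|EntryType")
    return [header1] + entries + [header2] + values

hdl = transform_file(["E509|GRADE_CHANGE|2019/12/08|G1",
                      "E510|GRADE_CHANGE|2019/12/10|G2"])
```

Printing hdl line by line gives the same records as the generated HDL file shown below.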

Run the Load Data from File Flow:

If you are on the Classic UI, navigate to Payroll > Payroll Checklist > Submit a Payroll Flow > select Load Data from File.

If you have enabled RUI, the navigation would be

My Client Groups > Payroll > Submit Flows > select Load data from file

Submitting flow

Provide the Content ID and the Transformation Fast Formula from the steps above.

Inputs to Flow

Once the process completes successfully, you can check the generated HDL file from File Import and Export.

The generated file will look like:

Generated HDL file
METADATA|ElementEntry|EffectiveStartDate|MultipleEntryCount|AssignmentNumber|CreatorType|ElementName|LegislativeDataGroupName|EntryType
MERGE|ElementEntry|2019/12/08|1|E509|H|ABC Element Name|IN Legislative Data Group|E
MERGE|ElementEntry|2019/12/10|1|E510|H|ABC Element Name|IN Legislative Data Group|E

METADATA|ElementEntryValue|LegislativeDataGroupName|EffectiveStartDate|EffectiveEndDate|ElementName|MultipleEntryCount|AssignmentNumber|InputValueName|ScreenEntryValue|EntryType
MERGE|ElementEntryValue|IN Legislative Data Group|2019/12/08|4712/12/31|ABC Element Name|1|E509|test|Grade Change|E
MERGE|ElementEntryValue|IN Legislative Data Group|2019/12/10|4712/12/31|ABC Element Name|1|E510|test|Grade Change|E

Now, let’s check the status of the HDL process on the Data Exchange > Import and Load Data screen.

Now you can verify the element entries for the employees and confirm that the entries are loaded successfully.

Hope you had a good read. If you have tried this Transformation Formula and have faced issues/have something to share, please do so in comments section.

Join our fastest growing technology social groups to interact with experts in the Fusion space and get your queries resolved – Telegram group, Facebook Page and LinkedIn Group

Sricharan is a Fusion HCM Solution Architect with over 13 years of overall experience and more than 5 years of Fusion experience. He is passionate about the technical aspects of Fusion HCM and writes interesting articles on HCM Extracts, HCM Data Loader, Fast Formula, BI Publisher, Integrations, Automation and more.
