
Perform an Inbound File Transfer (Example: Load Transaction)


In an inbound transfer, data is copied from the source system or flat files and loaded into the Commissions tables.

Note: Data can be loaded from various data sources. As an example, this documentation illustrates the procedure to transfer data from flat files. Any SDI-compatible template can be used to create the flat files.

Steps

The following image illustrates the steps involved in an Inbound File Transfer.


To perform an inbound transfer:

  1. Copy the file for transfer.
    1. Log in to the DP Server Agent.
    2. Copy the file to be transferred. (Example: Transaction)
  2. Create Remote Source (if it does not exist).
    1. Log in to WEB IDE and create a Remote Source (Inbound File) for File Adapter.
  3. Create Virtual Table.
    1. Create a virtual table for the file copied in step 1 (Transaction), using the remote source Inbound File.
    2. Open DB Explorer and log in to the EXT schema.
  4. Create a Custom EXT Table.
    1. Create a custom table in EXT for the file copied in step 1. (Example: Transaction)
  5. Create FlowGraph #1 (Database folder):
    1. Create a Synonym for the respective table from EXT. (Example: Transaction)
    2. Create a FlowGraph, defining the virtual table as the source and the Synonym created above as the target.
    3. Between the source and target, introduce a Procedure as a Transformation (develop this procedure in the EXT schema).
    4. Build the FlowGraph and execute it.
  6. Create FlowGraph #2 (Database folder):
    1. Create a Synonym for the respective table from TCMP. (Example: CS_STAGESALESTRANSACTION)
    2. Create a FlowGraph, defining the Synonym for the EXT table as the source and the Synonym for the TCMP table as the target. (Example: EXT.Transaction as source and CS_STAGESALESTRANSACTION as target)
    3. Include Transformations.
    4. Build the FlowGraph and execute it.
  7. Execute the Commissions Pipeline by calling the Commissions API service with the respective payload, using Postman or any other REST API client tool.
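The database objects behind steps 3 through 5 can be sketched as SQL DDL. This is a minimal, hypothetical sketch: the object names (V_TRANSACTION, EXT.TRANSACTION, SYN_TRANSACTION) and the INBOUND_FILE remote source name are assumptions, and the exact CREATE VIRTUAL TABLE path syntax depends on your adapter configuration. In practice you would run these statements from DB Explorer or a WEB IDE SQL console.

```python
# Hypothetical DDL for one inbound file (e.g. Transaction). All names are
# placeholders; substitute the remote source, schema, and table names used
# in your landscape.

def inbound_ddl(remote_source: str, file_object: str, ext_schema: str = "EXT"):
    """Return the assumed DDL statements for one inbound file."""
    virtual = f"V_{file_object.upper()}"
    custom = file_object.upper()
    return [
        # Step 3: virtual table over the file exposed by the remote source.
        f'CREATE VIRTUAL TABLE "{ext_schema}"."{virtual}" '
        f'AT "{remote_source}"."<NULL>"."<NULL>"."{file_object}"',
        # Step 4: custom EXT table mirroring the file layout (columns assumed).
        f'CREATE COLUMN TABLE "{ext_schema}"."{custom}" '
        f'LIKE "{ext_schema}"."{virtual}" WITH NO DATA',
        # Step 5a: synonym used as the FlowGraph target.
        f'CREATE SYNONYM "SYN_{custom}" FOR "{ext_schema}"."{custom}"',
    ]

for stmt in inbound_ddl("INBOUND_FILE", "Transaction"):
    print(stmt)
```

Generating the statements per file keeps the naming convention consistent when the same pattern is repeated for other inbound files.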

OData API Details

URL: https://<commissionshostname>:447/TrueComp-SaaS/CommissionsService.svc/PipelineRuns
Method Type: POST
Basic Authorization: Commissions Admin username and password
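Step 7 can be sketched with Python's standard library. The hostname, credentials, and payload values below are placeholders, and CalendarName, Period, and BatchName must match your tenant; treat this as a sketch of the request shape, not a definitive client.

```python
# Build (but do not send) the POST request for a Commissions pipeline run.
# Host, user, password, and payload values are placeholders.

import base64
import json
import urllib.request

def build_pipeline_run_request(host, user, password, payload):
    """Return an unsent urllib Request for the PipelineRuns OData service."""
    url = f"https://{host}:447/TrueComp-SaaS/CommissionsService.svc/PipelineRuns"
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )

payload = {
    "Command": "Import",
    "StageType": "ValidateAndTransfer",
    "CalendarName": "Main Monthly Calendar",
    "Period": {"Calendar": {"Name": "Main Monthly Calendar"}, "Name": "April 2019"},
    "BatchName": "BATCH1",
    "RunMode": "all",
    "Module": "OrganizationData",
}
req = build_pipeline_run_request("myhost.example.com", "admin", "secret", payload)
# To actually submit the run: urllib.request.urlopen(req)
print(req.get_method(), req.full_url)
```

Separating request construction from submission makes the call easy to inspect before sending it, which mirrors how you would preview the request in Postman.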


Sample Payload: Validate and Transfer
{
  "StartDateScheduled": "2019-06-06T14:57:00Z",
  "TraceLevel": null,
  "SkipAnalyzeSchema": null,
  "SqlLogging": null,
  "DebugContext": null,
  "Command": "Import",
  "StageType": "ValidateAndTransfer",
  "CalendarName": "Main Monthly Calendar",
  "Period": {
    "Calendar": {
      "Name": "Main Monthly Calendar"
    },
    "Name": "April 2019"
  },
  "BatchName": "BATCH1",
  "RunMode": "all",
  "Module": "OrganizationData"
}

CalendarName (String): Calendar name.
StageType (String): Stage type name.
Revalidate (String): Revalidate the relevant stage records. Valid values: all, onlyError.
TraceLevel (String, optional): Options to trace (log) the pipeline process. Available options: 'profile' (performance statistics), 'internal' (verbose logging). Multiple options can be passed as a comma-separated string. Default value: status.
SqlLogging (Boolean, optional): Allow SQL logging for the pipeline process. Default value: false.
StartDateScheduled (String, optional): Date when the pipeline job starts. Pass only to submit a job for a future date.
UserId (String, optional): User who submitted the pipeline job.
BatchName (String): Reset only the specific data imported with this batch name.
Module (String): Reset only data from the specified module. Valid values: TransactionalData, OrganizationData, ClassificationData, PlanRelatedData.
RunMode (String): Run mode of the import job, for example, all.



To monitor load status and logs, see Monitor Tasks.

Note: Calling the OData adapter to execute the Commissions pipeline can be part of a FlowGraph and scheduled in a task chain. Refer to the Commissions documentation for more information.


