Anonymous
Not applicable

tFileInputMSPositional join data between records

Hello,

 

I'm using tFileInputMSPositional to read a file that contains different positional record types. Some of the records need to be joined with others, because one is the main record and the others are its details. Does anyone know how I can do this?

 

Thank you!

1 Solution

Accepted Solutions
TRF
Champion II

Redirect each record type to a dedicated tHashOutput.

Start a new subjob with the tHashInput associated with the detail (child) records, then add a tMap with the tHashInput for the main (parent) records as the lookup.

Add an inner join in the tMap on the appropriate fields and map the desired fields to the output flow.

This is how the job should look:

[image: 0683p000009M6SX.png]

Suppose you have the following content:

110Name 10   
210100
210101
210102
120Name 20   
220200
220201
220202

The record type is identified by the first character of each record (1 = parent, 2 = child).

The second field is parentId (length 2) for all records.

The third field is parentName for parent records and childId for child records.

This is defined like this in tFileInputMSPositional component:

[image: 0683p000009M8rk.png]
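Outside Talend, the layout described above can be sketched as plain fixed-width slicing (a minimal illustration of the record structure, not part of the component configuration):

```python
# Sketch of the record layout:
#   position 0:    record type (1 = parent, 2 = child)
#   positions 1-2: parentId
#   positions 3+:  parentName (parents) or childId (children)

def parse_record(line):
    record_type = line[0]
    parent_id = line[1:3]
    if record_type == "1":
        return {"type": "parent", "parentId": parent_id,
                "parentName": line[3:].rstrip()}
    return {"type": "child", "parentId": parent_id,
            "childId": line[3:].rstrip()}

print(parse_record("110Name 10   "))
# {'type': 'parent', 'parentId': '10', 'parentName': 'Name 10'}
print(parse_record("210100"))
# {'type': 'child', 'parentId': '10', 'childId': '100'}
```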

The tMap component is in charge of joining the parent and child records to produce an output with parentId, childId, and parentName fields:

[image: 0683p000009M97h.png]

And the result is as expected:

.--------+-------+----------.
|         tLogRow_1         |
|=-------+-------+---------=|
|parentId|childId|parentName|
|=-------+-------+---------=|
|10      |100    |Name 10   |
|10      |101    |Name 10   |
|10      |102    |Name 10   |
|20      |200    |Name 20   |
|20      |201    |Name 20   |
|20      |202    |Name 20   |
'--------+-------+----------'
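The same split-and-join logic can be sketched in a few lines of Python (an illustration of the approach, not the code tMap generates):

```python
# Sample file content from the example above.
lines = [
    "110Name 10   ",
    "210100",
    "210101",
    "210102",
    "120Name 20   ",
    "220200",
    "220201",
    "220202",
]

# Split by record type, like the two tHashOutput components.
parents = {}   # parentId -> parentName
children = []  # (parentId, childId)
for line in lines:
    if line[0] == "1":
        parents[line[1:3]] = line[3:].rstrip()
    else:
        children.append((line[1:3], line[3:].rstrip()))

# Inner join on parentId, like the tMap lookup.
result = [(pid, cid, parents[pid]) for pid, cid in children if pid in parents]
for row in result:
    print(row)
# ('10', '100', 'Name 10') ... ('20', '202', 'Name 20')
```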


6 Replies
Anonymous
Not applicable
Author

Thank you @TRF,

 

I think this is going to work for me. I will try your solution and give feedback if it works.

TRF
Champion II

Did this work?

If so, please mark your case as solved (Kudos also appreciated).

Anonymous
Not applicable
Author

Hello @TRF,

 

Yeah, it works.

 

One more thing: is this solution suitable for a large amount of data? The files I will receive contain more than 10 million records with several columns (one record type has 100 columns, for example). It will be copybook data, and I will need to load the file daily.

 

Thank you!

TRF
Champion II

It should work, but I've never tried it with very large files. If you encounter memory issues, try increasing the Xms/Xmx JVM parameters for the job, or replace the tHashOutput/tHashInput components with delimited files.
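In Talend Studio, JVM arguments for a job can be set in the Run view under Advanced settings. A larger heap might look like this (the values are illustrative and depend on available RAM and actual data volume):

```
-Xms1024M
-Xmx4096M
```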
Anonymous
Not applicable
Author

Ok.

 

Thank you for the quick reply @TRF.