Parikhharshal
Creator III

Job Design

Hi Talend experts

 

I am trying to create a job where I have 20 columns in one table, and I have written specific logic for updating the deleted_flag column, which is designed in a separate flow.

Now I am trying to get these records back into the original table: source the additional columns from that table and store them all in the same table. How do I design my job?

 

[screenshot: 0683p000009Lznv.png]

 

Just to give a little background: MergeintoHistory is where all 20 of my columns are stored, and for the deleted_flag column specifically I have written separate logic in the lower flow of components in the screenshot. My end result is that I should be able to source the 19 other columns from MergeintoHistory and the one column (i.e. deleted_flag) from that lower flow, and get them all together in the same table.
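
In plain Java terms, this is roughly the merge I am after. The sketch below is an illustration only, not Talend-generated code; the record_id key, the sample column names and the default flag value are assumptions:

```java
// Sketch of the merge: 19 columns from the history flow joined with the
// deleted_flag computed by the second flow, keyed by a hypothetical record_id.
import java.util.HashMap;
import java.util.Map;

public class MergeDeletedFlagSketch {
    public static void main(String[] args) {
        // Flow 1: the 19 history columns per record (only two shown here).
        Map<String, Map<String, Object>> historyRows = new HashMap<>();
        historyRows.put("1001", Map.of("customer_name", "Acme", "region", "APAC"));

        // Flow 2: the deleted_flag computed by the separate flow, keyed the same way.
        Map<String, String> deletedFlags = new HashMap<>();
        deletedFlags.put("1001", "Y");

        // The merge step (a tMap join in Talend): look up deleted_flag by key
        // and emit one row carrying all 20 columns to the target table.
        for (Map.Entry<String, Map<String, Object>> entry : historyRows.entrySet()) {
            Map<String, Object> outRow = new HashMap<>(entry.getValue());
            outRow.put("deleted_flag", deletedFlags.getOrDefault(entry.getKey(), "N"));
            System.out.println(entry.getKey() + " -> " + outRow); // stands in for the table output
        }
    }
}
```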

 

Hopefully this makes sense. 

 

Help would be really appreciated!

 

Thanks

Harshal. 

1 Solution

Accepted Solutions
vapukov
Master II


@Parikhharshal wrote:
I want to use a table output as an input. How do I do that?

Thanks
Harshal.

Hi Harshal

You already use tHashOutput/tHashInput.

That is the answer for small (in very relative terms) data sets.

As an alternative, you can use a local (to Talend) CSV file or a database table.

For example, if you operate on just 10,000 rows, the in-memory hash is fine. But if you compare 2 million rows, you could run into memory issues, especially in a multi-job environment; in that case use a local CSV file. It is fast and does not consume memory.
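
In plain Java terms, the difference between the two staging approaches looks roughly like this. This is a sketch of the idea only, not Talend's actual tHashOutput/tHashInput code; the file name and row layout are made up for illustration:

```java
// Sketch of the staging trade-off: in-memory rows vs. a local CSV spill file.
// Not Talend-generated code; values and file names are illustrative only.
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class StagingTradeoffSketch {
    public static void main(String[] args) throws IOException {
        int rowCount = 10_000; // small enough that in-memory staging (the tHash approach) is fine

        // In-memory staging: fast, but every row stays on the heap until the
        // subjob that reads it back has finished.
        List<String> inMemory = new ArrayList<>();
        for (int i = 0; i < rowCount; i++) {
            inMemory.add(i + ";some_value;N");
        }
        System.out.println("Held " + inMemory.size() + " rows in memory");

        // File-based staging: for millions of rows, stream them to a local CSV
        // instead and read the file back in the next subjob; heap use stays flat.
        Path staging = Files.createTempFile("staging_", ".csv");
        try (BufferedWriter out = Files.newBufferedWriter(staging)) {
            for (int i = 0; i < rowCount; i++) {
                out.write(i + ";some_value;N");
                out.newLine();
            }
        }
        System.out.println("Spilled " + rowCount + " rows to " + staging);
    }
}
```

In the job itself the file-based variant would typically be a tFileOutputDelimited in one subjob and a tFileInputDelimited in the next, so the rows live on disk rather than on the heap between subjobs.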

 

regards, Vlad


3 Replies
vapukov
Master II

Do you have a technical issue, or are you looking for architecture advice?

 

If it is architecture design:

it is not clear how many rows could come from Redshift and Salesforce.

That affects both speed (execution time) and memory consumption (the tHash components).

 

Parikhharshal
Creator III
Author

Hi Vapukov

I do not have an issue with the architecture or the technical design; it is more about the design implementation, and I just need help with how to design it. I want to use a table output as an input. How do I do that?

Thanks
Harshal.