Hello,
I have the following use case that I am trying to implement using Talend DI.
I need to read data from a source table, join it with a lookup table, update the source table with the join-success rows (inner join), and finally delete the inner-join rejects from the source table and load them into another table.
Current structure:

LKP_TBL
   |
   |
SRC_TBL ------ tMap ------ SRC_TBL (update operation on join-success rows)
                |
                | {inner-join rejects}
           tReplicate ---- SRC_TBL (delete operation)
                |
                |
            TGT_TBL2 (insert operation)
The database is Snowflake.
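To make the intended dataflow concrete, here is a minimal sketch of the set logic in plain Python, using in-memory dicts in place of the real tables (table contents and key names are invented for illustration): rows whose key matches the lookup are updated in place, and the inner-join rejects are inserted into the second target and then deleted from the source.

```python
# Hypothetical stand-ins for SRC_TBL, LKP_TBL, and TGT_TBL2.
src_tbl = {1: "a", 2: "b", 3: "c"}   # SRC_TBL: key -> payload
lkp_tbl = {1: "A", 3: "C"}           # LKP_TBL: key -> lookup value
tgt_tbl2 = {}                        # TGT_TBL2: receives join rejects

# Split the source into inner-join matches and rejects.
matches = {k for k in src_tbl if k in lkp_tbl}
rejects = {k for k in src_tbl if k not in lkp_tbl}

# Update join-success rows in the source table.
for k in matches:
    src_tbl[k] = lkp_tbl[k]

# Move rejects: insert into TGT_TBL2, then delete from SRC_TBL.
for k in rejects:
    tgt_tbl2[k] = src_tbl.pop(k)
```

The key point the sketch highlights is ordering: the rejects must be copied to the second target before (or atomically with) their deletion from the source, which is exactly where a row-by-row Talend flow against the same table can produce partial results.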
When I execute this flow, it yields inconsistent results: some records get updated and some do not.
Can you please suggest a better architecture for implementing this?
Please note: I tried using tHashOutput, but as the record set is huge, I am getting a Java heap space issue.
Thanks in advance !!!
Due to a formatting issue I couldn't post the structure properly; please see the image below for reference:
This sounds odd, and it may be that the logic inside your job needs to be looked at. But I can help with regard to not being able to use the tHash components due to memory issues. Take a look here (https://help.talend.com/r/en-US/8.0/open-studio-user-guide/specifying-limits-of-vm-memory-for-job-or-route) about increasing the memory available to the job.
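For reference, the JVM arguments set in the job's Run tab (Advanced settings, "Use specific JVM arguments") look something like the following; the heap sizes here are only example values, so size them to your machine:

```
-Xms1024M
-Xmx4096M
```

Raising -Xmx is what gives the tHash components more room to hold the record set in memory.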
On that note, I'd recommend you try the tHSQLDB components. These create an in-memory database for you to use. I have put together an example in this post (https://community.talend.com/s/feed/0D75b000005yRDCCA2).
I'd start by changing the memory settings and attempting to load the data into the tHSQLDB components. Then check the data you are getting.