xx_emanis
Contributor

Need help with an incremental load based on three date columns

Hi All,

I have a requirement where the target table needs a full load the very first time, and after the first run all subsequent loads should be incremental.

 

The incremental logic is based on three different date columns: whenever the job runs, it needs to pull data from the last three days. For example, if the job runs at 6 AM, it should pull the incremental data from the previous three days. The three date columns are:

CREATION_DATE
DATE_COMPLETED
SCHEDULED_COMPLETION_DATE 
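
To make the intent concrete, here is a rough sketch of the filter I have in mind, written as standalone Java purely for illustration. It assumes an Oracle-style source (TRUNC/SYSDATE), that a row qualifies when any of the three dates falls in the window, and a placeholder table name SOURCE_TABLE; none of this is taken from my actual job.

// Illustrative only: builds the WHERE clause for the three-day window.
// In the real job this string would go into the source component's query.
public class IncrementalFilterSketch {
    public static void main(String[] args) {
        int windowDays = 3; // pull rows touched in the last three days

        String where =
            "WHERE CREATION_DATE >= TRUNC(SYSDATE) - " + windowDays
          + "   OR DATE_COMPLETED >= TRUNC(SYSDATE) - " + windowDays
          + "   OR SCHEDULED_COMPLETION_DATE >= TRUNC(SYSDATE) - " + windowDays;

        String query = "SELECT * FROM SOURCE_TABLE " + where;
        System.out.println(query);
    }
}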

 

Below is the screenshot of my job.

[Screenshot of the job: 0683p000009M3Bo.png]

 

Thank you.

 

1 Solution

Accepted Solutions
manodwhb
Champion II

@xx_emanis, I suggest you create two separate jobs: one for the full load and another one for the incremental load.

 

Check the link below on how to process an incremental load.

http://dataeng.ninja/etl/talend/2017/03/08/incremental-loading/

 

View solution in original post

3 Replies
xx_emanis
Contributor
Author

Someone please help me with this. It is kind of urgent.

manodwhb
Champion II

@xx_emanis, I suggest you create two separate jobs: one for the full load and another one for the incremental load.

 

Check the link below on how to process an incremental load.

http://dataeng.ninja/etl/talend/2017/03/08/incremental-loading/

 

Anonymous
Not applicable

Hi,

 

@manodwhb is absolutely correct: creating two separate jobs for full and delta load will reduce the number of permutations and combinations. You can also do everything in the same job, but in that case you will have to pass a full-load indicator flag as an additional parameter. If it is "Y" (checked with a Run If trigger), truncate and then load the target table. If it is "N", do the delta load (again, this could be an insert or an update, so mark the key columns based on which the insert/update will be performed).
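
As a rough sketch only (the flag name fullLoadFlag, the branch messages, and the table name are assumptions for illustration, not part of your job), the single-job branching could look like this:

// Standalone sketch of the single-job approach: a full-load flag decides
// between truncate-and-load and delta. In a Talend job this flag would be a
// context variable tested in two "Run if" trigger conditions.
public class LoadModeSketch {
    public static void main(String[] args) {
        String fullLoadFlag = args.length > 0 ? args[0] : "N"; // "Y" = full load, "N" = delta

        if ("Y".equalsIgnoreCase(fullLoadFlag)) {
            // Full-load branch: truncate the target table, then insert everything.
            System.out.println("Full load: truncate TARGET_TABLE, then insert all rows");
        } else {
            // Delta branch: insert new rows or update existing ones, matched on the
            // key columns marked in the target component.
            System.out.println("Delta load: insert/update rows from the last three days");
        }
    }
}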

 

Another aspect you need to think about is whether you can use Hash components for the data processing. If the input data volume is too high, Hash may not be able to hold the entire data set in memory, so consider the possibility of using interim files instead; the choice depends on the data volume coming from your source.

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂