ka51
Partner - Contributor

Incremental load

Hi Team,

I need some help here; this might be a simple question.

I am doing an incremental load of 270 million records, covering a duration of 3 months. While concatenating the new data with the history QVD, the load takes longer and longer because the history QVD keeps growing.

Is there a way to avoid reloading the history QVD entirely and instead only concatenate the new data to it?

I would be grateful for any ideas.

 

6 Replies
Eduardo_Monteiro
Partner - Creator II

Hi @ka51 

What about separating the .qvd files by period?

Regards,

Eduardo Monteiro - Senior Support Engineer @ IPC Global
Follow me on my LinkedIn | Know IPC Global at ipc-global.com

vighnesh_gawad
Partner - Creator

While concatenating these two QVDs, are you applying any transformations? If yes, that’s likely why it’s taking more time.

You can reduce the load time by loading the history QVD in optimised mode and avoiding any transformations during concatenation.
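A minimal sketch of that pattern, assuming the new extract is already prepared as its own QVD (the file paths and names below are placeholders, not from the thread):

    // Load the new 3-month extract first (illustrative path)
    Transactions:
    load * from [path/New_Transactions.qvd] (qvd);

    // Optimized load of the history: no transformations, no where clause
    concatenate (Transactions)
    load * from [path/History.qvd] (qvd);

    // Write the combined table back as the new history
    store Transactions into [path/History.qvd] (qvd);

Both loads here read all fields unchanged, so the script engine can keep them optimized; adding any expression or filter to the history load would drop it back to a standard (much slower) load.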

Regards, Vighnesh Gawad
Connect with me on LinkedIn | GitHub
ka51
Partner - Contributor
Author

We maintain a rolling 3 months of historical data, so we apply a filter for that period while concatenating.

marcus_sommer

Take a closer look at the suggestions above. Loading QVD data optimized (no transformations except a single where exists(OnlyWithOneParameter);) is really fast; the kind of filtering matters.

In addition, or as an alternative, the historical data could be sliced and the slice information included in the file name. This file information could then be read before the data itself is touched, for example:

for each file in filelist('path/*.qvd')
   // file names like Trans_202401.qvd: take the part between the last '_' and the extension
   if subfield(subfield('$(file)', '.', 1), '_', -1) >= '$(MyPeriodInformation)' then
      t: load ... from [$(file)] (qvd);
   end if
next

rwunderlich
Partner Ambassador/MVP
Partner Ambassador/MVP

As others have suggested, you may not have an optimized load in your concatenation. If you post the script, we may have some suggestions. Here are some general ideas for maintaining the optimized load.

Load the history QVD first with a Where Exists(trandate) pattern to roll off old data. Then concatenate the updated rows to this resident table. https://qlikviewcookbook.com/2026/02/optimized-load-script-patterns/
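A sketch of that roll-off pattern, with hypothetical field and file names (trandate as the date field, a 92-day window standing in for 3 months):

    // Generate the dates to keep, so the single exists() keeps the load optimized
    KeepDates:
    load date(today() - recno() + 1) as trandate
    autogenerate 92;

    // Optimized load: one where exists() with a single parameter is allowed
    History:
    load * from [path/History.qvd] (qvd)
    where exists(trandate);

    drop table KeepDates;

    // Append the new rows to the resident table
    concatenate (History)
    load * from [path/New_Transactions.qvd] (qvd);

    store History into [path/History.qvd] (qvd);

Rows older than the generated window fail the exists() test and are dropped on load, so the old data rolls off without any non-optimized filtering.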

If the history QVD is just too large, here are some script patterns for segmenting your QVD. https://qlikviewcookbook.com/2022/03/how-to-segment-qvd-files/

-Rob

ka51
Partner - Contributor
Partner - Contributor
Author

Sure, I will have a look. I will also try to paste the code here.