ka51
Partner - Contributor

Incremental load

Hi Team,

I need some help here; this might be a simple question.

I am doing an incremental load of 270 million records, covering a duration of 3 months. While concatenating the new data with the history QVD, the load takes more time because the history QVD holds so much data.

Is there a solution where I don't have to reload the history QVD and can simply concatenate the new data to it?

I would be grateful for any ideas posted here.

 

9 Replies
Eduardo_Monteiro
Partner - Creator II

Hi @ka51 

What about splitting the QVD into one file per period?
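
For illustration, a per-period split might be sketched like this (the library path, file names, and the 3-month window are assumptions, not from the thread):

// Store this month's new rows into a month-stamped QVD, so older
// months are never rewritten.
Let vMonth = Date(MonthStart(Today()), 'YYYYMM');

CurrentMonth:
LOAD * FROM [lib://Data/increment.qvd] (qvd);
STORE CurrentMonth INTO [lib://Data/facts_$(vMonth).qvd] (qvd);
DROP Table CurrentMonth;

// The app then loads only the rolling three monthly slices;
// identical field lists auto-concatenate into one table.
For i = 0 to 2
  Let vSlice = Date(AddMonths(MonthStart(Today()), -$(i)), 'YYYYMM');
  Facts:
  LOAD * FROM [lib://Data/facts_$(vSlice).qvd] (qvd);
Next i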

Regards,

Eduardo Monteiro - Senior Support Engineer @ IPC Global
Follow me on my LinkedIn | Know IPC Global at ipc-global.com

vighnesh_gawad
Partner - Creator

While concatenating these two QVDs, are you applying any transformations? If yes, that’s likely why it’s taking more time.

You can reduce the load time by loading the history QVD in optimised mode and avoiding any transformations during concatenation.
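
For illustration, an optimized concatenation would look like this (the paths are assumptions): an optimized QVD load allows no transformations at all, only a plain field list (or *) and at most a single Where Exists() clause with one parameter.

History:
LOAD *                                  // no renames, calculations or functions
FROM [lib://Data/history.qvd] (qvd);    // loads in optimized mode

Concatenate (History)
LOAD *
FROM [lib://Data/new_data.qvd] (qvd);   // stays optimized if the field lists match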

Regards, Vighnesh Gawad
Connect with me on LinkedIn | GitHub
ka51
Partner - Contributor
Author

As we maintain a rolling 3 months of historical data, we apply a date filter for that window while concatenating.

marcus_sommer

Take a closer look at the suggestions above. Loading QVD data optimized (no transformations except a single where exists(FieldWithOneParameter);) is really fast; the kind of filtering matters.

In addition, and/or as an alternative, the historical data could be sliced and the slice information included in the file name. This file information could then be read before the data are touched, for example:

for each file in filelist('path/*.qvd')
   // compare the period suffix in the file name, e.g. facts_202403.qvd
   if subfield(subfield('$(file)', '.', 1), '_', -1) >= '$(MyPeriodInformation)' then
     t: load * from [$(file)] (qvd);
   end if
next

rwunderlich
Partner Ambassador/MVP

As others have suggested, you may not have an optimized load in your concatenate. If you post the script, we may have some suggestions. Here are some general ideas to maintain the optimized load.

Load the history QVD first with a Where Exists(trandate) pattern to roll off old data. Then concatenate the updated rows to this resident table. https://qlikviewcookbook.com/2026/02/optimized-load-script-patterns/
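
A minimal sketch of that pattern, assuming a date field named trandate and illustrative file names:

// 1) Seed the dates to keep: a rolling 3-month window.
DateWindow:
LOAD Date(AddMonths(Today(), -3) + IterNo()) as trandate
AutoGenerate 1
While AddMonths(Today(), -3) + IterNo() <= Today();

// 2) Optimized load of the history QVD; the single-parameter
//    Where Exists() rolls off rows outside the window.
Facts:
LOAD * FROM [lib://Data/history.qvd] (qvd)
Where Exists(trandate);

// 3) Concatenate the new rows (no transformations, stays fast).
Concatenate (Facts)
LOAD * FROM [lib://Data/increment.qvd] (qvd);

// 4) Drop the helper table and store the result back.
DROP Table DateWindow;
STORE Facts INTO [lib://Data/history.qvd] (qvd);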

If the history QVD is just too large here are some script patterns for segmenting your QVD. https://qlikviewcookbook.com/2022/03/how-to-segment-qvd-files/

-Rob

ka51
Partner - Contributor
Author

Sure, I will have a look. I'll also try to paste the code here.

RafaelBarrios
Partner - Specialist

Hi @ka51 

I would suggest using a partial reload instead of an incremental load.

In a standard reload, the app first flushes all data and starts as if it had no data, while a partial reload keeps the data and only executes specific parts of the script. This way you don't need to load the old QVD; just load the new data and "add" or "merge" it.

https://help.qlik.com/en-US/sense/November2025/Subsystems/Hub/Content/Sense_Hub/Scripting/ScriptPref...

Here is an example that only adds data to an existing table:

Tab1:
LOAD Name, Number FROM Persons.csv;
Add Only LOAD Name, Number FROM newPersons.csv;

On a full load you will get all records in Persons.csv, and the partial reload will concatenate NewPersons.csv to the existing table.

 

Here is an example using Merge:

Set DateFormat='D/M/YYYY';
Persons:
load * inline [
Name, Number
Jake, 3
Jill, 2
Steven, 3
];

Merge only (ChangeDate, LastChangeDate) on Name Concatenate(Persons)
LOAD * inline [
Operation, ChangeDate,   Name,     Number
Insert,    1/1/2021,     Mary,     4
Delete,    1/1/2021,     Steven, 
Update,    2/1/2021,     Jake,     5
];

 

After standard reload you will get

Name, Number
Jake, 3
Jill, 2
Steven, 3


After partial reload you will get

Name, Number
Jake, 5
Jill, 2
Mary, 4

 

The good thing about Merge is that it takes care of NEW, UPDATE and DELETE operations.

Also, if your standard reload takes 20 minutes, you could run several partial reloads during the day at less than a minute each.


hope this helps.
Best,

 


 

ka51
Partner - Contributor
Author

Oh, this is great! I was actually looking into partial reload. But are there any drawbacks to a partial reload?

Also, is it advisable to go with an Add load instead of Merge? I can see the output differs between a standard reload and a partial reload.

RafaelBarrios
Partner - Specialist

Hi @ka51 

Drawbacks? I don't see any; I'm using it for a "near real time" application and it works perfectly, but it depends on what you are trying to achieve.

If you are only adding new records and not updating or deleting, you can go with the Add statement.

If you are going to receive updates or deletions, I would suggest using Merge.

Note that:

"add load * from...." will run on both standard and partial reloads
"add only load * from...." will run only when a partial reload happens

 

Now, the difference in the output is because of what I just described.

Let's say you trigger a full standard reload in the morning. Then only this part will be executed, while the task ignores the Merge Only statement because of the Only prefix:

Set DateFormat='D/M/YYYY';
Persons:
load * inline [
Name, Number
Jake, 3
Jill, 2
Steven, 3
];

 

Name, Number
Jake, 3
Jill, 2
Steven, 3

 

Now, for the rest of the day you can trigger partial reloads to grab the data that changes during the day; a partial reload ignores all LOAD statements that don't have an Add, Replace or Merge prefix.

In this example, it will create a new record "Mary", update the "Jake" record from 3 to 5, and delete the "Steven" record.

Merge only (ChangeDate, LastChangeDate) on Name Concatenate(Persons)
LOAD * inline [
Operation, ChangeDate,   Name,     Number
Insert,    1/1/2021,     Mary,     4
Delete,    1/1/2021,     Steven, 
Update,    2/1/2021,     Jake,     5
];

 

Name, Number
Jake, 5
Jill, 2
Mary, 4

 

I hope I've explained myself well, but if you have any questions, feel free to write.

 

Best regards,