morenoju
Partner - Specialist

Alternatives to Incremental Load when having to add a few records

Hi folks,

For small apps, incremental load works great: it quickly loads the old data from QVD files and concatenates any new data. However, when the old data is huge and a reload of just a few records has to happen very frequently, this approach often ends up with the engine running out of time between reloads.
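For context, a minimal sketch of the insert-only incremental load pattern described above (all table, field, connection and path names are placeholders, not my actual script):

// vLastExecTime is assumed to hold the timestamp of the previous reload.
Orders:
SQL SELECT OrderId, Amount, ModifiedDate
FROM Orders
WHERE ModifiedDate >= '$(vLastExecTime)';

// Append the history; with no WHERE clause and no transformations
// this QVD load can run as an optimized load.
Concatenate (Orders)
LOAD * FROM [lib://Data/Orders.qvd] (qvd);

// Writing the full QVD back out is typically the expensive step.
STORE Orders INTO [lib://Data/Orders.qvd] (qvd);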

What alternatives do we have to incremental load when we only need to add a few records to the current dataset?

Thanks.

1 Reply
marcus_sommer

Usually an optimized QVD load is very fast, even with really large datasets. So: is the QVD load actually optimized, and is it executed first, i.e. before the new data are added? It's also worth taking a look at the storage/network holding the data, because the biggest benefit of QVDs is lost if the data transfer is rather slow. Further, you could skip storing an updated QVD again and again and instead add a slightly bigger slice of new data with each iteration (writing the data may take significantly more time than reading it). Maybe there is some space for optimization ...
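A rough sketch of those two points, assuming placeholder names (Facts, CreatedAt, lib://Data, vLastQvdStore, vConsolidateQvd), load the history first and optimized, concatenate only the new rows, and only consolidate the QVD occasionally:

Facts:
LOAD * FROM [lib://Data/Facts.qvd] (qvd);   // no WHERE, no transformations -> optimized

Concatenate (Facts)
SQL SELECT * FROM Facts
WHERE CreatedAt >= '$(vLastQvdStore)';      // everything since the last QVD consolidation,
                                            // a slightly bigger slice each run

IF '$(vConsolidateQvd)' = 1 THEN            // e.g. set to 1 only once per night
    STORE Facts INTO [lib://Data/Facts.qvd] (qvd);
END IF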

If the time frame is really too short for loading (and updating) the QVD, you could switch to a slightly different approach in which you do a binary load each time of a QVW that is based on the historical QVD data, and then add the new records after the binary load.
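Sketched, with History.qvw, Facts, CreatedAt and vLastReload as placeholders:

// BINARY must be the very first statement in the script; it pulls in the
// whole data model of the historical app without re-reading the QVD.
Binary [lib://Apps/History.qvw];

// Then append only the few new records to the table loaded via the binary load.
Concatenate (Facts)
SQL SELECT * FROM Facts
WHERE CreatedAt >= '$(vLastReload)';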

- Marcus