Anonymous
Not applicable

Loading large QVD files (>50 GB) into .qvw files

We have a main QVD file larger than 50 GB, which we load into a QVW file.

Later, we want to load only the new data points into the QVW, without reading the big main QVD file again.

As an example: in the script we want Table1 (Example.xlsx) to stay in memory permanently, and only Table2 (New data for Example file.xlsx) to be loaded into test.qvw.

We know that this kind of incremental load only works with a partial reload, but the server does not trigger partial reloads automatically.

We tried the incremental load logic, but in every variant we still need to read and write the full >50 GB main QVD.
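The logic we tried looks roughly like the standard incremental-load sketch below (`ModDate`, `PrimaryKey`, the variable and the file names are placeholders, not our real names); the final STORE is the step that forces the full >50 GB rewrite:

```
// Illustrative sketch of the incremental-load logic we tried.
// ModDate, PrimaryKey, vLastReloadTime and file names are placeholders.

// 1. Load only the new/changed rows from the source.
NewData:
LOAD * FROM [New data for Example file.xlsx]
(ooxml, embedded labels)
WHERE ModDate >= '$(vLastReloadTime)';

// 2. Append the history from the existing QVD
//    (stays an optimized load when only Exists() on one field is used).
Concatenate (NewData)
LOAD * FROM [Main.qvd] (qvd)
WHERE Not Exists(PrimaryKey);

// 3. Write everything back - this STORE rewrites the full >50 GB file,
//    which is exactly the step we want to avoid.
STORE NewData INTO [Main.qvd] (qvd);
```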

Is there a way around this? If so, what is it?

If there is a way, can you please provide the scripting in the attached qvw?

2 Replies
Not applicable
Author

One option is to split the QVD into smaller slices, e.g. monthly or weekly QVDs.
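A sketch of that approach, with illustrative source, field and file names: only the current month's slice is rebuilt on each reload, so the older monthly QVDs are never rewritten:

```
// Monthly-split sketch; Source.xlsx, OrderDate and the file-name
// pattern Data_*.qvd are placeholders.

// Rebuild only the current month's slice.
LET vMonth = Date(MonthStart(Today()), 'YYYY-MM');

CurrentMonth:
LOAD * FROM [Source.xlsx] (ooxml, embedded labels)
WHERE MonthStart(OrderDate) = MonthStart(Today());

STORE CurrentMonth INTO [Data_$(vMonth).qvd] (qvd);
DROP Table CurrentMonth;

// Read all monthly slices back in; tables with identical structure
// auto-concatenate, and a plain LOAD * from QVD is an optimized load.
FOR Each vFile in FileList('Data_*.qvd')
  Data:
  LOAD * FROM [$(vFile)] (qvd);
NEXT vFile
```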

marcus_sommer

Have you really tried to implement a partial reload? See here: Partial Reload Easy and Simple, and the manual. The suggestion from dathu.qv is also useful for making incremental and/or partial loads easier.
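A minimal partial-reload sketch along the lines of that posting, using the file names from your question (everything else is illustrative): during a partial reload, plain LOAD statements are skipped and their tables are kept in memory, while statements prefixed with ADD are executed and append to the existing table:

```
// During a FULL reload both statements run.
// During a PARTIAL reload the plain LOAD is skipped, so Table1
// stays in memory, and only the ADD LOAD is executed.

Table1:
LOAD * FROM [Example.xlsx] (ooxml, embedded labels);

Table2:
ADD LOAD * FROM [New data for Example file.xlsx] (ooxml, embedded labels);

// ADD ONLY LOAD ... would additionally skip the statement
// during full reloads.
```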

But first I would check whether the amount of data could be reduced. Are all fields really needed? Are there fields with high cardinality, meaning many unique values, such as record IDs (only needed during development, not in production) or timestamps or similar? See here for what is meant: The Importance Of Being Distinct.
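For the timestamp case, the usual fix from that posting is to split the field into a date part and a time part, so each part has far fewer distinct values and compresses much better (field names here are illustrative):

```
// A timestamp with second precision can have up to 86,400 distinct
// values per day. Splitting it leaves one date value per day plus
// at most 86,400 time values in total. EventTimestamp is a placeholder.
Facts:
LOAD
  Date(Floor(EventTimestamp))             AS EventDate,
  Time(Frac(EventTimestamp), 'hh:mm:ss')  AS EventTime
  // , further fields ...
FROM [Main.qvd] (qvd);
```

Note that applying transformations like these breaks the optimized QVD load, so it is best done once when creating the QVD, not on every reload.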

- Marcus