Anonymous
Not applicable

Memory fills up when processing the QVD file

I have a 120 GB file that grows daily. When I concatenate the new data with what is already stored, it fills all the memory. The code is the following:


MaxId:
LOAD max(Id) as MaxId
FROM [Files1\file1.qvd] (qvd);

LET vMaxId = PEEK('MaxId');

DROP TABLE MaxId;

D1:
LOAD *;
SELECT * FROM TABLA with (nolock) where Id > $(vMaxId);

Concatenate(D1)
LOAD *
FROM [Files1\file1.qvd] (qvd)
where not exists (Id);

STORE D1 into 'Files1\file1.qvd' (qvd);

Any help is appreciated.

1 Solution

Accepted Solutions
marcus_sommer

You are just appending new data to the historical data, so it should be possible to slice the big QVD into multiple ones - for example by YearMonth, or, if there is no period within the data, by clustering the IDs into buckets of 100k or similar.
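The one-time split might be sketched like this in load script (the Date field and file names are assumptions; adjust to the actual model). The full file must be read once for the initial split; afterwards only the current month's slice needs to be rewritten daily:

```qlik
// Hypothetical one-time split of the big QVD into monthly slices.
// Assumes the data contains a Date field; names are illustrative.
Data:
LOAD *, Date(MonthStart(Date), 'YYYYMM') as YearMonth
FROM [Files1\file1.qvd] (qvd);

FOR Each vYM in FieldValueList('YearMonth')
    Slice:
    NoConcatenate LOAD * Resident Data
    Where YearMonth = '$(vYM)';

    STORE Slice into [Files1\file1_$(vYM).qvd] (qvd);
    DROP TABLE Slice;
NEXT vYM

DROP TABLE Data;
```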

Further, I suggest generating the MaxId right after the database load and storing this value in a variable - and by reading the field values instead of the table records it can be optimized even more, see: “Fastest” Method to Read max(field) From a QVD | Qlikview Cookbook.
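That technique loads only the Id field (which stays an optimized QVD load) and then scans the field's symbol table instead of aggregating over all rows - a sketch, assuming Id is numeric:

```qlik
// Optimized single-field load - no row-level processing.
tmpId:
LOAD Id
FROM [Files1\file1.qvd] (qvd);

// Iterate the distinct values (the symbol table), not the records.
MaxId:
LOAD Max(FieldValue('Id', RecNo())) as MaxId
AutoGenerate FieldValueCount('Id');

LET vMaxId = Peek('MaxId', 0, 'MaxId');

DROP TABLES tmpId, MaxId;
```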

Besides this, you should check whether the QVD data itself could be optimized. This means removing unused fields - you are loading the data with a wildcard - are you sure that you really need all the fields?
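In practice that means replacing LOAD * with an explicit field list (the field names below are only placeholders); selecting a subset of fields from a QVD still remains an optimized load:

```qlik
// Load only the fields the app actually uses; dropping unused
// wide text fields shrinks both the QVD and the RAM footprint.
D1:
LOAD Id,
     OrderDate,
     CustomerId,
     Amount          // placeholders - keep only what is needed
FROM [Files1\file1.qvd] (qvd);
```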

Further, reducing the maximum number of distinct field values can decrease the needed space significantly, especially for timestamps, see: The Importance Of Being Distinct. Another big point is removing multiple formats from a field: a pure number needs 8 bytes, and if all field values share the same format it stays at 8 bytes, because the format is then stored once in the metadata. But if there is more than a single format, the format is stored per field value, and that can be a lot more (with timestamps you easily hit 40 to 50 bytes per value).
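For timestamps both points can be combined: splitting into a date and a time-of-day rounded to the second drastically cuts the distinct values, and wrapping each with a single Date()/Time() format keeps one format in the metadata instead of one per value (field names here are assumptions):

```qlik
// Split a timestamp into date + time rounded to the second:
// far fewer distinct values, one consistent format per field.
D1:
LOAD Date(Floor(Timestamp)) as Date,
     Time(Frac(Round(Timestamp, 1/86400)), 'hh:mm:ss') as Time,
     Id
FROM [Files1\file1.qvd] (qvd);
```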

- Marcus

