shumailh
Creator III

Insert Data to QVD files

I am using the code below to insert data from 30 QVD files into a new final QVD file; each QVD file contains 400,000 records. When I try to execute this code, my system's virtual memory exceeds the available limit at file number 14.

Can I update the QVD file by inserting new records within a loop, releasing the memory at each step of the loop? (A rough sketch of what I mean follows the script below.)

let path = 'D:\Test\QVDs\';
let Outputpath = 'D:\Test\QVDs\';

For each File in filelist ('$(path)AH*.QVD')
    AH:
    LOAD * FROM [$(File)] (qvd);      // each load auto-concatenates into AH
Next File;

store AH into [$(Outputpath)Final_AH.qvd] (qvd);
drop table AH;
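To illustrate what I mean, here is a rough sketch of the loop idea (the vFirstPass flag is just a placeholder name) - store after every file and re-load the consolidated QVD on the next pass:

let path = 'D:\Test\QVDs\';
let Outputpath = 'D:\Test\QVDs\';
let vFirstPass = 1;

For each File in filelist ('$(path)AH*.QVD')
    AH:
    LOAD * FROM [$(File)] (qvd);

    if $(vFirstPass) = 0 then
        // pull everything consolidated so far back in (the whole file is re-read on each pass)
        Concatenate (AH)
        LOAD * FROM [$(Outputpath)Final_AH.qvd] (qvd);
    end if

    store AH into [$(Outputpath)Final_AH.qvd] (qvd);
    drop table AH;                    // release the table before the next file
    let vFirstPass = 0;
Next File;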


4 Replies
Oleg_Troyansky
Partner Ambassador/MVP

No, QVD files cannot be appended - it's always a full re-write. I'm guessing that you are using a 32-bit machine... You might want to run the same script on a 64-bit machine, if you have access to one...

Out of curiosity - what's so bad about keeping your data in 30 files and loading them as needed? Why necessarily write them into one "final" file?

You know, of course, that in your QVW you can simply load them all based on the wildcard:

load * from $(path)AH*.QVD (QVD);

cheers,

Oleg

shumailh
Creator III
Author

Basically, we generate a QVD file daily and consolidate them at the end... (i.e. the same update problem).

Is there an alternative, or any other suggestion for the same hardware?

FYI: my system has 3 GB of RAM and an Intel Core 2 Duo processor @ 2.83 GHz.

Oleg_Troyansky
Partner Ambassador/MVP

Well, first of all, your hardware is pretty modest relative to the amount of data you mentioned earlier... I'd definitely look into the possibility of an upgrade - a 64-bit system with more RAM. Those are cheap today...

Other than that... when you read data and then write it into a QVD, the system typically needs twice as much memory for the short period while the data is being written. You might find that you can still load your data from multiple files into a QVW and finish your load script successfully without writing it back into a combined large file. If that's the case, just keep your daily files and load them all into your final document.
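Just to make that concrete, a minimal sketch of the load in the final document - the wildcard picks up all the daily files and there is no STORE of a combined QVD (the path variable is only an example):

let path = 'D:\Test\QVDs\';              // wherever the daily files live

AH:
LOAD * FROM [$(path)AH*.QVD] (qvd);      // loads and auto-concatenates all 30 daily files

// no STORE into a combined Final_AH.qvd, so the temporary doubling during the write never happens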

If you can't even load all your data, then your only choice is to upgrade your hardware and use the 64-bit version. Or find another way to reduce the amount of data that you load...

Oleg

shumailh
Creator III
Author

Thanks for the reply Oleg!

I think reading from the daily files into one QVW is the right approach, rather than consolidating into a single QVD on a daily basis. And I agree that an upgrade is required on my machine. Thanks.