I'm trying to load multiple (5) QVD files in my script. Each QVD has 15 to 20 million rows. After a while I get an out-of-memory exception. With my approach everything is loaded into the TRANSACTIONS table. Can it be done in any other way?
Directory;
TRANSACTIONS:
LOAD
Date,
Hour,
CUSTOMERID,
CVOID,
LINES,
SESSIONID,
SUPPLIERID,
SYSTEMUSED
FROM TRANSACTIONS_20*.QVD (qvd);
Thanks!
/magnus
I believe that 32-bit Windows 2003 Server will only allow a task up to 2 GB of memory. There is a trick to force it to use more: a /3GB boot switch, if I remember correctly, will give the server up to 3 GB, but this isn't really my area.
Regards,
Gordon
5 files, each with 15 to 20 million records, is quite a lot 🙂
How much RAM do you have?
Perhaps if the QVDs were loaded in distinct LOADs the problem would resolve itself? Try loading each in a separate LOAD statement (they will auto-concatenate if the structures are the same, of course).
If that works, you might want to create a loop containing a LOAD, something like

FOR each vFile in filelist ...

Regards,
Gordon
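A sketch of what Gordon's loop might look like, so each file is loaded in its own statement; the FileList mask and the vFile variable name here are just illustrative:

```
// Loop over each QVD matching the mask and load it in a separate
// statement; the loads auto-concatenate into TRANSACTIONS because
// the field lists are identical.
FOR EACH vFile IN FileList('TRANSACTIONS_20*.QVD')
    TRANSACTIONS:
    LOAD
        Date,
        Hour,
        CUSTOMERID,
        CVOID,
        LINES,
        SESSIONID,
        SUPPLIERID,
        SYSTEMUSED
    FROM [$(vFile)] (qvd);
NEXT vFile
```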
Are you getting optimized loads? That is, do you see the "optimized" message in the load progress window?
32bit or 64bit?
-Rob
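For context on Rob's question: a QVD load is "optimized" only when QlikView can stream the file without transformations, which is much faster and lighter on memory. As a rough illustration (file names here are just examples), a calculated field or most WHERE clauses will drop the load to standard mode:

```
// Optimized: plain field list, no expressions, no WHERE clause.
TRANSACTIONS:
LOAD Date, Hour, CUSTOMERID
FROM TRANSACTIONS_2009.QVD (qvd);

// Not optimized: the calculated field forces a standard (slower) load.
// WHERE Exists(...) is the usual exception that keeps a load optimized.
TRANSACTIONS2:
LOAD Date, Hour, CUSTOMERID,
     Date & '-' & Hour AS DateHour
FROM TRANSACTIONS_2009.QVD (qvd);
```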
I'll try that and see what comes out.
Thanks!
/magnus
For the moment it's a 32-bit Windows 2003 Server with only 4 GB of memory.
We will shortly have a 64-bit server with 10 GB of memory, so perhaps the problem will disappear for now. However, the data we'd like to display and analyze with QlikView is growing rapidly, so I need to find a solution to this anyway.
For sure, we can buy more hardware but it doesn't solve the core issue.
/magnus