I have a document that contains 24 input fields. I've been working with it for a while, and it recently started giving me an OUT OF VIRTUAL AND/OR LOGICAL MEMORY error when I try to save it. It's not a large report - less than 3MB - and there are no memory issues when loading the data. I get the error whether I save after reloading data or simply open the document and then try to save it.
It is related to the input fields somehow since it works ok if I comment out 6 of the InputField statements in the loadscript.
Has anyone seen that behavior before? Do you know of any way to avoid it?
Thanks for any response.
Are the input fields embedded in your main table structure? I have sometimes had issues when QV does the automatic association and attaches input fields to every table, often resulting in loop issues as well. You may have tried this already, but if you keep the input fields in their own table with a key field, and then sort out all the rest of your loads, it might help with the memory issue.
I have often found that this approach makes the load sequence a bit longer, but it allows a tidier finish.
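In case it helps, here is a minimal sketch of that idea - input fields isolated in their own table, linked to the rest of the model only through a key field. All table and field names below are invented for illustration:

```
// Main fact table - no InputFields attached here
Facts:
LOAD
    OrderID,        // key field linking to the input-field table
    Customer,
    Amount
FROM Orders.qvd (qvd);

// Declare the input fields BEFORE loading them,
// then keep them in their own table with just the key
InputField Comment, AdjustedAmount;

Inputs:
LOAD
    OrderID,
    '' as Comment,
    Amount as AdjustedAmount
RESIDENT Facts;
```

This keeps the editable fields out of the main association, so QV does not attach them to every table.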
I am getting the same trouble too - much bigger file though.
I load first a list of about 3 million row numbers - which for some reason makes a 95mb qvd
I then filter a larger qvd - again of about 3 million rows, but 250mb - against this. Memory usage goes up to 1.37gb
When this is done and the row numbers table is dropped, memory usage falls to 775mb
If I then step to Save, memory usage rises rapidly to 1.01gb and I get OUT OF VIRTUAL AND/OR LOGICAL MEMORY, wanting 162mb.
There seems to be plenty of memory around, the same file saved fine in the program that created it.
Your issue is different from the InputField issue presented earlier in the thread...
First of all, a 95MB QVD for 3 million rows is certainly not typical. I'd recommend analyzing the data in the QVD for unreasonably long or unnecessarily detailed fields. For example:
- full timestamps instead of dates (they take much more space than necessary, and the number of distinct values grows a lot)
- amounts with an unnecessarily high number of decimal digits (some systems return 20 or more decimals, while only the first few make any difference)
- any other fields with many distinct values that might not be needed for analysis.
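For instance, the first two points might be handled in the load script roughly like this (field and file names are hypothetical):

```
Trimmed:
LOAD
    // keep only the date part - far fewer distinct values than full timestamps
    Date(Floor(EventTimestamp)) as EventDate,
    // keep 2 decimals instead of 20 - far fewer distinct values to store
    Round(Amount, 0.01) as Amount
FROM Transactions.qvd (qvd);
```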
The best way to find the biggest memory consumers is the "QlikView Analyzer" application, which used to ship as a sample with QlikView 8.5. It will show you which fields need to be taken care of.
The second problem is "Out of Virtual and/or Logical Memory" at the very end... While there can be many different reasons for that, the two that I find most common are:
- building synthetic keys, or
- performing a join when the common key field is missing or misspelled (which causes a Cartesian product).
Not sure if this is your case...
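Both situations are easy to reproduce, for anyone who wants to check their own script against them (table and field names invented):

```
// 1) Synthetic key: these tables share TWO field names (Year, Month),
//    so QlikView builds a hidden $Syn table to associate them.
Budget:
LOAD Year, Month, BudgetAmount FROM Budget.qvd (qvd);
Actuals:
LOAD Year, Month, ActualAmount FROM Actuals.qvd (qvd);
// Fix: build one explicit key, e.g. Year & '-' & Month as Period,
// and drop Year/Month from one of the tables.

// 2) Accidental Cartesian product: the key is renamed on one side only,
//    so the join has no common field and every row pairs with every row.
Orders:
LOAD OrderID, Customer FROM Orders.qvd (qvd);
Left Join (Orders)
LOAD
    OrderID as Order_ID,   // misspelled/renamed key - nothing in common
    Amount
FROM OrderLines.qvd (qvd);
```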
Some food for thought there, but the first file - the 95mb one - has only one field, and that is a field filled with the value of RecNo() in the 350mb table - broadly:
EXT:
LOAD
    RecNo() as EXT_Record,
    EX_Exam_Unique_ID as EXT_Exam_Unique_ID
FROM [EX_Exams_GCSE_2009_2.qvd] (qvd)
WHERE EX_KS4_Result_Included_in_Calculations='1'
  and exists(JOIN_PU_EX)
  and EX_DISC3='0' and EX_DISC3B='0' and EX_DISC1='0'
  and not (EX_AMDFLAG='D' or EX_AMDFLAG='W' or EX_AMDFLAG='CL')
  and not (EX_Qualification='Key Stage 2' or EX_Qualification='Key Stage 3');

drop table PUT;

EXTT:
LOAD
    EXT_Record as EXTT_Record
// (source of this second load omitted in my post)
WHERE not exists(EXTT_Record, EXT_Record);
which makes 95mb very odd - you could store the data in a text file in 20mb.
The 350mb qvd is then loaded, dropping or renaming a few fields, with "where exists(EXTT_Record, RecNo())", and EXTT is dropped - QV is down to about 350mb in memory at this point, total usage 800mb of 2gb.
All fine so far
Then I save the 350mb file into a qvd - and almost immediately I get hit with the memory error.
Like you, I am well used to triggering this error when I have overcomplicated things or misprogrammed a join - but this is really very simple.
It's not as simple as my original program, which had none of the EXTT logic and just loaded the 350mb file with the filter applied to EXT above - but that failed on save too; EXT arose from an effort to split the process into smaller steps.
Yes, very strange indeed... One suggestion - send this scenario to QlikTech support; if it's a bug, it needs to be resolved. Another suggestion - move to a 64-bit environment. It's so cheap today that it's not worth your time trying to squeeze more throughput out of 2GB of RAM...
I have the same trouble too!
My document has only 1 million records and only one InputField, but its size is 18MB!
If I reload a little more data (1.2 million rows), the reload finishes without warning, but when I try to save I get "OUT OF VIRTUAL AND/OR LOGICAL MEMORY".
I think the problem is the InputField!
As far as I know, the InputField declaration creates a hidden variable for each of the field's original values, in which QlikView stores the manually entered replacements. These variables are probably the reason for the file's large size, and in turn for the "Out of Memory" warning - but there seems to be no way to "release" the variable values and flush the unnecessary memory.
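For anyone reading along, the mechanism being described starts from a declaration like this (names invented) - InputField must appear before the field is loaded, and each declared field value then gets its own hidden slot for a manually entered replacement:

```
InputField Forecast;

Data:
LOAD
    Region,
    Sales as Forecast
FROM Sales.qvd (qvd);
```

With a million distinct values, that hidden per-value storage adds up, which would fit the file-size symptom described above.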
Of course I tried "Reset Original Values" and also a macro (per the API Guide), but both attempts were useless!
Does anyone know more about the "secrets" of InputFields?