I have a document with a size of 600 MB. I am able to open this document on the server, but not on my local machine. It says "Out of virtual memory. Allocating 4 MB". I have enough RAM in my system, and I can even open applications of 900 MB. I don't know what is wrong with this particular application alone.
I checked the compression setting for this application and it is set to HIGH. Does anyone have any clue what is happening? How can I avoid this?
Regards,
Sharma
Sharma,
You need to pinpoint expressions that contain Aggr, nested IFs, or reused terms. You can review all expressions through the Expression Overview (Ctrl+Alt+E).
Aggr: Check whether they are really needed. Sometimes the calculation can be done at script level, and in other cases you can try using TOTAL instead.
Nested IFs: These can usually be moved to script level (unless they are applied on top of Aggr results).
Reused Expressions: Terms like current date, current year, or current month appear in almost all expressions. You can avoid recalculating them by substituting variables.
Set with Text: If you have a set such as product={'product_1'}, create a numeric key in the script (using AutoNumber()) and use it in set analysis as product={'1'}.
These are generic tips; there are more depending on context and requirements. Unless we understand the business needs, it is difficult to suggest more.
You can also look at the properties of the opening sheet and find out which object consumes the most memory. Share the expressions in that object and the data model so we can give better suggestions.
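To make the last two tips concrete, here is a sketch of what the script side could look like (the table and field names are assumptions, not taken from Sharma's app):

```
// Variables computed once at reload, so chart expressions
// don't recalculate these terms in every cell:
LET vToday       = Today();
LET vCurrentYear = Year(Today());

// Numeric surrogate key for set analysis instead of matching on text.
// AutoNumber() assigns sequential integers in load order, so the exact
// number for 'product_1' depends on the data:
Sales:
LOAD
    AutoNumber(Product) AS ProductKey,
    Product,
    Amount
FROM Sales.qvd (qvd);
```

A chart expression could then use Sum({&lt;ProductKey = {1}&gt;} Amount) instead of Sum({&lt;Product = {'product_1'}&gt;} Amount), which compares integers rather than strings.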
Kiran.
Can high compression be the cause of this?
I reduced the compression to "None", but I am still facing the same issue: I am not able to open it on the local system.
Sharma,
Document size is not a measure of the RAM required. Apart from the data size, each object and variable consumes memory, and how much depends on the complexity of the computation; for example, Aggr and nested IFs consume huge amounts of memory.
Since you are able to open the document on the server, check the memory of each object there via Sheet Properties -> Objects tab -> Memory.
As a workaround for ease of development, I usually do a data reduction via Section Access. As the data size goes down, the object memory goes down as well, so you can open the document. Once you can open it, check the objects, find out which ones are driving up the memory, and optimize the code.
A few tips: avoid Aggr, nested IFs, and set analysis on strings, and check whether you can replace redundant calculations with variables, e.g. a variable instead of Max(Year) for the most recent year (which is probably used everywhere).
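As a sketch of that last tip (table, field, and variable names are illustrative), the most recent year can be computed once in the script and reused everywhere:

```
// Compute Max(Year) once at reload instead of in every chart expression.
// Assumes a previously loaded table 'Sales' with a 'Year' field:
MaxYearTmp:
LOAD Max(Year) AS MaxYear
RESIDENT Sales;

LET vMaxYear = Peek('MaxYear', 0, 'MaxYearTmp');
DROP TABLE MaxYearTmp;
```

A chart can then use Sum({&lt;Year = {$(vMaxYear)}&gt;} Amount) rather than evaluating Max(Year) per cell.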
Hope this helps you,
Kiran.
Thanks Kiran. This was very helpful indeed, but I was not able to figure it out.
Is there any other suitable technique that could help?
Regards,
Sharma
Hi Sharma,
Have you tried opening the document without data? Try it, then reload the data again. Run (Win key + R) the sample below:
[PathExc]\qv.exe [PathFile]\Test.qvw /nodata
** [PathExc] for me is C:\Program Files (x86)\QlikView, so the full command begins with C:\Program Files (x86)\QlikView\qv.exe
Regards,
Sokkorn
Hi Sokkorn,
It is not that I am unable to open the file at all. I can open it on the server, but when opening it from my local system it shows me the virtual memory error. What can I do about this? Even if I copy the file from the server, I am not able to open it.
A better option for you is to optimize your application on the server, if possible.
Sounds good, but how do I know where optimization is needed? I mean, which expressions need to be checked?
I will give you a few points:
Check your data modeling.
Try to move expressions into the script if possible.
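For example, a nested IF that would otherwise run for every chart cell can be precomputed as a field during reload (the table and field names here are assumptions):

```
// Precompute the bucket once in the script instead of
// nesting IFs inside a chart expression:
Orders:
LOAD *,
    If(Amount >= 1000, 'Large',
       If(Amount >= 100, 'Medium', 'Small')) AS SizeBucket
FROM Orders.qvd (qvd);
```

Charts can then simply use SizeBucket as a dimension, with no per-cell IF evaluation.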