We are developing a data model in QlikView, using a star schema.
We have around 45 million records in the fact table, which results in a 725 MB QVD, and as a result we are facing performance issues.
When we filter the data down to about 4 million records (a 250 MB QVD), things work much better for us.
Is there any way to deal with this scenario?
-- Shree Angane
Hi Shree,
Yes and no. IT-specific actions aside, there is probably no way to solve your performance issue other than making your fact table smaller. Basically, that can only be done by reducing the number of fields you are loading (find out about unused fields using the DocumentAnalyzer, a qvw available here in the Community)
or by reducing the timeline (e.g. loading data for only the current year). A small script sketch of both ideas follows below.
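As a rough illustration, this is what both ideas can look like in the load script. The file and field names (FactTable.qvd, OrderDate and so on) are placeholders, not taken from your model:

// Load only the fields you actually use (instead of LOAD *)
// and restrict the timeline to the current year.
// Note: a WHERE clause like this disables the optimized QVD load,
// but the reload is still much cheaper because far fewer rows come in.
Fact:
LOAD
    OrderID,
    OrderDate,
    CustomerID,
    Amount
FROM FactTable.qvd (qvd)
WHERE Year(OrderDate) = Year(Today());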
HTH
Best regards,
DataNibbler
Hi,
I don't think 4 million records and a 250 MB QVD should, by themselves, create any big issue.
Just check that you are not using performance-affecting functions in your chart expressions, in particular if().
Avoid if() where you can.
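For example, a conditional sum written with if() is evaluated row by row, while set analysis restricts the record set before aggregating and is usually much faster. A small sketch with made-up field names (Country, Amount):

// Slow: if() is evaluated for every row
Sum(If(Country = 'US', Amount))

// Usually faster: set analysis filters the records up front
Sum({<Country = {'US'}>} Amount)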
Regards,
Thanks for your reply.
I will certainly go through the DocumentAnalyzer and try to figure out a way forward from there.
Best Regards,
Shree Angane
Hi,
Instead of reducing rows, try reducing columns. Check which columns can be disregarded, because besides the number of records, the number of columns also affects the size and performance of your data.
Also, if you're still experiencing performance issues after applying the usual optimizations (working with a 3-4 tier data architecture, using optimized QVD loads, avoiding nested ifs in expressions, avoiding calculated dimensions and instead precalculating as much as possible in the script, etc.; see the sketch below), then the next solution would be to increase the resources of your server (adding more RAM, for example).
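To give a rough idea of two of those points, here is a script sketch. FactTable.qvd, OrderDate, Amount and the 2024 date range are assumptions for illustration only:

// Precompute the set of dates to keep. Loading the big QVD with
// WHERE EXISTS() on a single field keeps the QVD load optimized,
// whereas any other WHERE clause would make it non-optimized.
KeepDates:
LOAD Date(MakeDate(2024) + RecNo() - 1) AS OrderDate
AUTOGENERATE 366;

Fact:
LOAD OrderID, OrderDate, CustomerID, Amount
FROM FactTable.qvd (qvd)
WHERE EXISTS(OrderDate);

DROP TABLE KeepDates;

// Precalculate a flag in the script instead of using a
// calculated dimension or nested if() in the chart.
Left Join (Fact)
LOAD OrderID,
     If(Amount > 1000, 1, 0) AS HighValueFlag
RESIDENT Fact;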
Regards,
Hello,
There are a few things you might want to check in order to optimize your document.
Regards,
Philippe