I am facing a lot of issues while developing my project. I am working on a daily portfolio analysis of my product which includes psychographic, demographic, and transactional data.
I have divided my development into various stages because the data I am getting is in a very raw form.
In the first step I parse all the raw data and store it into QVD files: one set for Financial & Demographic data and a separate QVD for Transactions. For the Financial data there is a daily QVD file, because there are more than 400,000 records per day, while there is one consolidated Transaction QVD that is generated/updated on a daily basis. So at month end I would have 31 or 32 files, as per the number of days in the month.
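For context, the daily "parse and store" step usually looks something like the sketch below in QlikView load script. The file names, paths, and the `vToday` variable are my assumptions, not your actual code:

```qlikview
// Assumed naming convention for the daily Financial extract.
LET vToday = Date(Today(), 'YYYYMMDD');

Financial:
LOAD *
FROM [RawData\Financial_$(vToday).csv]
(txt, codepage is 1252, embedded labels, delimiter is ',');

// Store today's parsed data as its own daily QVD, then free the RAM.
STORE Financial INTO [QVD\Financial_$(vToday).qvd] (qvd);
DROP TABLE Financial;
```

Keeping one QVD per day like this means each day's reload only touches that day's raw file.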
The second step is to update the final project file (QVW), where I read all the data from the QVDs. Currently I am at the stage of developing my dashboard and analysis sheets, but I am not able to update my objects and grids.
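In that second step, one thing worth checking is that the daily QVDs are read with an *optimized* QVD load (no WHERE clause or transformations in the LOAD), since that is dramatically faster. A minimal sketch, assuming the file names from above:

```qlikview
// Wildcard load picks up every daily Financial QVD for the month.
// A plain "LOAD *" with no WHERE/transformations keeps the load optimized.
Financial:
LOAD *
FROM [QVD\Financial_*.qvd] (qvd);

// The single consolidated transaction QVD.
Transactions:
LOAD *
FROM [QVD\Transactions.qvd] (qvd);
```

Any calculated field or filter in these LOAD statements breaks the optimized path; if transformations are needed, it is often better to do them once when the QVDs are created.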
I am stuck on the development of my dashboard. I have placed numerous objects, about 9 or 10, with set analysis, formulas, aggregations, etc. Now, whenever I try to move any object, the system starts processing and hangs for a long time. From Task Manager I have found that when I move an object the CPU usage increases but only 1 GB of RAM is utilized.
My system configuration is attached for your reference.
If only 1.5 GB is being used, memory is not your problem (at the moment, even though you are getting pretty close). Try looking into the following:
- Open Document Properties > Sheets and look at the sheet objects list at the bottom. Note the objects that take the most time/memory to recalculate and try to analyze what's especially heavy about those objects.
- Find the "QlikView Optimizer" (it should be available in "Share QlikViews") and load your memory statistics into it. Analyze your data and screen objects for the heaviest performance hits.
- One lesson I learned the hard way: when dealing with large data sets, QlikView seems to perform better with fewer objects on the screen (even if each object has more dimensions and expressions) than with lots of small objects (for example, the numerous text objects and small sparklines on the famous Financial Controlling Dashboard). If you can convert a number of smaller objects into a single bigger object, you might gain in performance.
- If you have any macros, try to avoid them at all costs. They hit performance very hard on a large data set.
- Watch out for heavy expressions, especially those involving IF statements. No matter what people might say on the forum, IF statements are prohibitively expensive for large data sets.
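To illustrate the last point, here is the kind of rewrite I mean. The field names (`Year`, `Amount`) are placeholders, not taken from your document:

```qlikview
// Row-by-row IF: the condition is evaluated for every single record,
// which gets very expensive on 400,000+ records per day.
Sum(If(Year = 2011, Amount))

// Equivalent set analysis: the selection is resolved once, up front,
// and the aggregation runs only over the matching records.
Sum({< Year = {2011} >} Amount)
```

Replacing conditional aggregations with set analysis like this is usually one of the biggest single wins on large data sets.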
I will try the QlikView Optimizer; attached are my document properties FYR. I couldn't find how to reduce the calculation time...
I am not using any macros. The objects I have finalized are as per my requirements and I can't reduce them. If you multiply the 400,000 records by 31 days, and there is also the transactional data, the amount of data here is huge. I know the data processing will take time, but it's hanging the system. Maybe your QlikView Optimizer will help in this case. I will get back to you once I have tried it.