I am looking for ways to make my QlikView app run faster while a user is browsing it. The app is hosted on a QlikView server.
Currently, the dataset is one big flattened table with 80 columns, and the transactions amount to about 20 GB in the QlikView file. The server has 64 GB RAM (~30 GB free) and 24 processors. The visualizations in the client's browser are based on that table. Whenever a user clicks on something, it takes 20 seconds or more for QlikView to show the result of the associations.
How can I improve the performance? Assume there are no If statements.
Is there a way to load and reload the data so that it stays in memory? There is a belief that the data didn't stick after loading.
There is an option in the QMC to have your app 'preloaded', i.e. it stays in memory and is not removed when idle. But this will not increase the performance once the app is loaded and opened.
You could also have a look at the settings that control when QV removes e.g. cached results.
But just to clarify, your data model is just one flat table, right? And what do your expressions look like? Do you make extensive use of e.g. advanced aggregations?
I would suggest looking into Rob Wunderlich's document analyzer to find out possible issues in the document.
Yes, it is one flattened table. Well, could you please elaborate on advanced aggregations? Will they help or not?
Just to clarify, browsing is slow from the latest IE/Chrome. If I open the app on the server itself, the response time is reasonable, a few seconds, whereas IE takes 20 seconds.
Is there a way to piecemeal this large chunk of data, gradually adding it to the "sheets" of the app in the browser? The thinking is that the first page of the dashboard renders first, then the subsequent sheets load next, and so on. Most importantly, would piecemealing speed up the user experience in the browser?
The advanced aggregation function aggr() will not improve performance; it's more the other way round.
Basically you should avoid heavy calculations, but that's hard to judge without knowing your exact requirements, data model, and current expressions.
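For context, aggr() performs a two-stage aggregation that QV has to recompute on every selection. A minimal sketch, with hypothetical field names Sales and Customer:

```
// aggr() first computes Sum(Sales) per Customer in a virtual table,
// then averages those per-customer totals - two passes over the data
Avg(Aggr(Sum(Sales), Customer))

// A single-pass expression like this is far cheaper, but it is only a
// valid replacement if it answers the same business question
Sum(Sales) / Count(DISTINCT Customer)
```

In a 20 GB model, a nested aggregation like the first one in a chart can easily dominate the response time.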
If the response time is much worse when using the Ajax client, could you describe how many data points need to be transmitted to the client? Are you looking at highly granular data, or are you aggregating your data to a low granularity?
QV should only calculate the objects that are shown, so it's good advice to minimize or hide objects that are not currently needed, or to put them on another sheet.
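One way to enforce that in QlikView is a calculation condition on the heavy chart objects, so a chart is only computed once the user has narrowed the data down. A sketch, assuming a hypothetical Year field:

```
// Chart Properties > General > Calculation Condition:
// only calculate this chart once at least one Year is selected
GetSelectedCount(Year) >= 1
```

Until the condition is true, QV shows a message instead of crunching the full 20 GB table for that object.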
Thanks for sharpening my thinking. You are very helpful.
The data points for the Ajax client come from the entire flattened table. Rumor has it the reason for the single flattened table is that the client wanted consistent results when slicing and dicing across different sheets and different apps. Yup, these brave people want this already very slow dataset to be binary loaded into other apps. (Haha, I got to adopt this problem child. :-)
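For reference, the binary load they are planning looks like this in the load script. It must be the very first statement, and the path here is of course hypothetical:

```
// Binary must be the first statement in the script; it copies the
// entire data model of the source .qvw into this app unchanged
Binary C:\QlikApps\BigFlatTable.qvw;
```

Note that a binary load duplicates the whole 20 GB model into each consuming app's memory footprint, which matters on a server with ~30 GB free.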
Since the idea of objects per sheet came up: would the Ajax/IE experience be faster if I spread the objects over more sheets with a lower objects-per-sheet ratio, say 4 charts and 3 list boxes per sheet, and stayed away from data grid tables?
I could go ahead and try it, but loading and reloading this behemoth dataset pushes me back to the drawing board very often. Coming up with a more thoughtful plan of attack may be worth it.