I am presently developing a QV dashboard which needs to load a huge amount of data.
It comes from a MySQL table with around 22 columns.
Data is added on a daily basis, approx. 400,000 rows per day.
I store this data in QVD files on a daily basis, and the QV dashboard reloads every day to pick up the new data.
The MySQL data is approx. 100 MB per day, while the QVD files occupy considerably less space, around 15-20 MB per day.
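For reference, a minimal sketch of how such a daily extract-and-store step might look in the load script. The table name, date column, and file naming pattern below are assumptions, not taken from the actual dashboard:

```qlikview
// Hypothetical incremental extract: pull only today's rows from MySQL
// and store them into a per-day QVD file.
LET vToday = Date(Today(), 'YYYYMMDD');

DailyData:
SQL SELECT *
FROM my_schema.my_table          // assumed table name
WHERE load_date = CURDATE();     // assumed date column

STORE DailyData INTO [data_$(vToday).qvd] (qvd);
DROP TABLE DailyData;
```

Loading from QVDs later is "optimized mode" as long as the LOAD has no transformations, which is one reason the per-day QVD pattern reloads quickly.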
The dashboard contains a number of pivot tables, each with 6 expressions and 2 to 3 dimensions.
There is also a multibox acting as a filter pane with 10-12 filters, so that users can do any sort of cross tabulation, plus a number of simple bar charts.
So, I have a few questions.
1. Within a few months, the data loaded into the dashboard will run into gigabytes. Will the performance of the dashboard be affected (at the user end)?
If there is going to be a dip in performance, how can we tackle it? i.e. after a certain point in time, should I archive the old data?
2. Some pivot tables have conditional visibility. So even though only one table is visible at a time, there are multiple tables on the sheet. Will this affect performance?
3. How can I do performance testing for this dashboard?
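On the archiving idea in question 1, one common approach is a rolling window: keep all daily QVDs on disk, but only load the most recent N days into the dashboard. A sketch, assuming the hypothetical `data_YYYYMMDD.qvd` naming from above and a 90-day window:

```qlikview
// Hypothetical rolling-window load: read only the last 90 daily QVDs,
// so older data stays archived on disk but out of the dashboard's RAM.
FOR i = 0 TO 89
    LET vDay = Date(Today() - i, 'YYYYMMDD');
    // FileSize() returns NULL when the file does not exist,
    // so days with no extract are skipped safely.
    IF Not IsNull(FileSize('data_$(vDay).qvd')) THEN
        Facts:
        LOAD * FROM [data_$(vDay).qvd] (qvd);
    END IF
NEXT i
```

Because each iteration LOADs into the same table (`Facts`) with identical columns, QlikView auto-concatenates the daily slices into one fact table.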
PFA: some guidelines collected from multiple sources and my own experience.