Four hours is unusually long for refreshing a .qvw.
1. How many QVD files are you using in your app, and what is the size of each file?
2. Do you have a QlikMart layer between your QVD source system and the application? If not, are you building the dashboard application directly from the QVD files, or from other source data such as Excel or a database?
Can you build a QlikMart from all your input QVDs prior to the application, and then do a binary load into your dashboard application? (A binary load is quick, and hopefully it will reduce the reload time.)
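A binary load has to be the very first statement in the dashboard script; a minimal sketch, assuming the mart document is named QlikMart.qvw and sits in the same folder as the dashboard:

```
// QlikMart.qvw (assumed name) loads and models all the input QVDs.
// The dashboard then pulls in the mart's whole data model in one go.
// BINARY must be the first statement in the script.
BINARY QlikMart.qvw;

// Any dashboard-only script (variables, small lookup tables) goes below.
```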
You could set the task properties in the Management Console to perform a partial reload instead of a full reload of the visualization QVW.
Additionally, you would have to edit your visualization load script to implement the logic for the partial reload (using the ADD prefix on LOAD statements).
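A minimal sketch of that partial-reload pattern (the table, field, and file names here are assumptions):

```
// During a FULL reload every statement runs; during a PARTIAL reload
// only statements prefixed with ADD or REPLACE run.
Transactions:
LOAD TransactionID, Date, Value
FROM History.qvd (qvd);

// ONLY means this statement is skipped during a full reload,
// so the new rows are appended only on partial reloads.
ADD ONLY LOAD TransactionID, Date, Value
FROM NewTransactions.qvd (qvd);
```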
hope this helps
The most critical thing you need to get right to ensure your front-end document loads quickly is making sure all QVD loads are optimised.
Loads will be optimised if you load directly from the QVD without changing the structure or content of the data (e.g. adding columns or running functions over fields). You will also need to ensure that the only WHERE clause you have is a single WHERE EXISTS. Things like joins and resident loads will not be optimised either.
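As an illustration, here is a sketch of a load that stays optimised (file and field names are assumptions): the field list is untouched and the only transformation is a single WHERE EXISTS against a field loaded earlier in the script.

```
// Load the selection field first, so EXISTS has something to test against.
SelectedDates:
LOAD Date
FROM SelectedDates.qvd (qvd);

// Optimised: LOAD *, no renames, no expressions, single WHERE EXISTS.
Facts:
LOAD *
FROM Transactions.qvd (qvd)
WHERE EXISTS (Date);
```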
Once you have all your QVD loads optimised, if things are still slow then you may want to look at reducing the size of your QVD files. Dropping unused columns and reducing the granularity of the data (by rounding numbers to fewer decimal places, for instance) will all help.
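That slimming is best done in the QVD generator, where breaking the optimised path doesn't matter; a sketch, with assumed field and file names:

```
// Rebuild a slimmer QVD: keep only the columns the front end uses
// and round Value to 2 decimal places to improve compression.
Slim:
LOAD
   TransactionID,
   Date,
   Round(Value, 0.01) as Value    // drops unneeded precision
FROM Transactions.qvd (qvd);      // unused wide columns are simply not listed

STORE Slim INTO Transactions_Slim.qvd (qvd);
DROP TABLE Slim;
```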
I have done a number of blog posts on optimising performance, I shall post them here but they may take a while to surface due to moderation.
If you Google "qlikview optimised qvd loads" you should find the most relevant article.
Hope that helps,
Articles that may help with optimising your re-load routines:
One further thought on this: could you perhaps aggregate your old QVDs so they are stored with far fewer rows? You could do this in the following way, using a very simple data model as an example:
Aggregated:
LOAD
   Date,
   0 as RowID,
   sum(Value) as Value
FROM DetailedQVD_200801.qvd (qvd)
GROUP BY Date;

STORE Aggregated INTO AggregatedQVD_200801.qvd (qvd);
In this example, where you may have had thousands of individual transaction IDs in your old QVDs, the data gets aggregated to one row per day in the new QVD. You could then blend detailed QVDs for recent months with more aggregated QVDs for older months, where the detail is no longer important. You may choose to drop narrative detail fields from old rows whilst keeping them for recent months, for example.
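The blend itself is just a pair of concatenated loads; a sketch with assumed file masks, following the naming from the example above:

```
// Older months: one row per day, from the aggregated QVDs.
Facts:
LOAD Date, RowID, Value
FROM AggregatedQVD_2008*.qvd (qvd);

// Recent months: full transaction detail. The rename keeps the field
// lists identical so the tables concatenate into one fact table
// (at the cost of the optimised path on this load).
CONCATENATE (Facts)
LOAD Date, TransactionID as RowID, Value
FROM DetailedQVD_2009*.qvd (qvd);
```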
Picking the fields you group on and which ones you sum is the key to getting this working well.
If the optimised QVD load works for you, though, this approach of ditching old data may not be required.
Thanks to all of you for your answers.
I'm using only QlikView to build the QVDs and dashboards. My QVDs are "partitioned" by month, and each one of them is almost 1 GB in size. Then I load all of them into my dashboard, and that is where my biggest problem is. Creating the QVDs takes about 15 minutes per month (about 1 minute per day of data), but loading them into the dashboard is what kills my process.
I will take a look at partial reload; this is totally new for me, but it sounds like a great way to load the data, especially the new data.
I have used a lot of resident loads, so I think I might need to optimize my script; combined with the other solutions, I might be on the right path with this.
I will follow your example and let you know.
Thanks to all.
From what you say regarding resident loads, it may be that you could use another layer of QVD production. When you are doing resident loads and joins across all of your historical files, they will take a long time, as a record from this month will be trying to match against records from last October.
If you process each of your existing QVDs into the correct format for loading into the front end in another QVD generator (or at the end of your current one), then you will be processing one month at a time and all functions will run much quicker. The goal is then to be able to do an optimised load in the front end.
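A sketch of that extra layer, looping over the monthly files one at a time (the file mask, field names, and output naming scheme are all assumptions):

```
// Transform each monthly QVD separately, so joins and functions
// only ever touch one month of data at a time.
FOR Each vFile in FileList('Detailed_*.qvd')

   Tmp:
   LOAD
      *,
      MonthName(Date) as YearMonth   // example of a derived field
   FROM [$(vFile)] (qvd);

   // Hypothetical naming; adjust the path handling to suit your folders.
   STORE Tmp INTO [FrontEnd_$(vFile)];
   DROP TABLE Tmp;

NEXT vFile
```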
To see what difference a single optimised load will make for you, time these two loads (run them in separate reloads, against the same QVDs):

NonOptimised:
LOAD
   *,
   1 as NonOptimised
FROM *.qvd (qvd);

Optimised:
LOAD *
FROM *.qvd (qvd);

The added "1 as NonOptimised" column is enough to force the first load down the slow, row-by-row path, while the bare LOAD * stays optimised.
You should see quite a difference.