Hi,
I'm wondering how we can benchmark the performance of a qvw?
I have 4 million rows in 2 flat tables, with sequential numbers as keys to link the 2 tables:
one for facts, one for dimensions.
When loading a chart with set analysis, I still see the hourglass. I want to remove the hourglass completely.
Does this mean that moving to a more powerful machine would help?
Could someone share your experience?
Thanks.
Possibly.
Just to make sure, you do see that the two tables are linked in the table viewer? If you see two separate tables without a line connecting them then the tables are not associated. In that case make sure the key fields that should link the tables get the same field name.
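Renaming the key fields during the load is usually enough to create the association. A minimal sketch (all table, field, and file names here are assumptions, not from the original post):

```
// Fact table: expose the key under a common name
Fact:
LOAD OrderID as %Key,
     ORDER_NUM,
     LATE_ORDER
FROM Fact.qvd (qvd);

// Dimension table: rename its key to the same field name,
// so QlikView associates the two tables automatically
Dim:
LOAD DimID as %Key,
     Region,
     Product
FROM Dim.qvd (qvd);
```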
Before thinking about a more powerful machine, you should rather optimize your load script: calculate most of your expressions there, or at least pre-calculate some expression parts, and restructure your data model. It can be helpful to join and/or concatenate your tables into one big fact table - calculations within a single table are most often faster than calculations across several tables. Sometimes you must choose between optimizing your script speed and your GUI speed ...
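Marcus's suggestion might look like this in the script - a sketch with hypothetical field names, joining the dimension columns into the fact table and pre-calculating one expression part as a simple flag:

```
Fact:
LOAD %Key,
     ORDER_NUM,
     LATE_ORDER,
     // pre-calculated expression part: a 0/1 flag that charts
     // can simply Sum() instead of evaluating a condition
     If(OELT_ORDER_DELAY_REASON_CS = 'Yes', 1, 0) as LateFlag
FROM Fact.qvd (qvd);

// pull the dimension columns into the fact table so chart
// calculations stay within a single table
LEFT JOIN (Fact)
LOAD %Key, Region, Product
FROM Dim.qvd (qvd);
```

Note the trade-off Marcus mentions: adding calculations to a QVD load breaks the optimized load, so the script runs longer in exchange for faster charts.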
- Marcus
4 million rows is a relatively small QV app and your data model sounds reasonable. I suspect your expression(s) can be optimized. Can you post your expressions?
Rob
Hi,
A more powerful machine will always help things.
Optimizing your expressions is often the best way to boost performance. What is the calculation in the chart that takes a long time?
You will get better performance at the front end if you have just a single table. You could look at joining your Dimensions table to your Fact table to achieve this.
There are lots of examples of ways you can optimize your documents in blogs and articles on-line. My own blog has a number of such articles.
Regards,
Steve
Thanks, gentlemen, for your comments.
The 2 tables are linked correctly.
40 columns for dimensions.
180 columns for facts.
Frequently used fields like months and years are already joined into the fact table.
We do a lot of rolling 12 months, and we use SET analysis.
Is there another way to get rolling 12 months other than the expression below?
(Sum({<YEAR=, TIME_MONTH_SHORT_LB=, TIME_YY_MM=,
      MONTH_SEQ_PIV={'$(=MONTH_SEQ_CURRENT-13)'}>} ORDER_NUM)
 -
 Sum({<YEAR=, TIME_MONTH_SHORT_LB=, TIME_YY_MM=,
      OELT_ORDER_DELAY_REASON_CS={'Yes'},
      MONTH_SEQ_PIV={'$(=MONTH_SEQ_CURRENT-13)'}>} LATE_ORDER))
/
Sum({<YEAR=, TIME_MONTH_SHORT_LB=, TIME_YY_MM=,
     MONTH_SEQ_PIV={'$(=MONTH_SEQ_CURRENT-13)'}>} ORDER_NUM)
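Note that each set modifier above selects a single MONTH_SEQ_PIV value. If the intent is to aggregate over the whole trailing 12-month window, one common alternative is a search-string range in the set modifier - a sketch, assuming MONTH_SEQ_PIV is a continuous month sequence number:

```
Sum({<YEAR=, TIME_MONTH_SHORT_LB=, TIME_YY_MM=,
     MONTH_SEQ_PIV={">=$(=MONTH_SEQ_CURRENT-11)<=$(=MONTH_SEQ_CURRENT)"}
    >} ORDER_NUM)
```

Another option is to pre-compute a rolling-window flag in the load script and include it in the set modifier, which keeps the chart expression a plain Sum().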
The whole qvw takes 18 secs to open, with a STAR schema.
After turning it into 1 fact and 1 dimension table, it still takes 18 secs. Is this long for 4 million rows, or could it be better?
The hourglass is much shorter when loading charts (around 1 sec),
but I am trying to eliminate it completely; I'm not sure if this is realistic?
Do you often see the hourglass in your qvw?
Some questions...
Thanks...
Gysbert published an excellent paper on rolling period analysis. I can't find it but Gysbert is on this thread so he can post a link.
Let me address just one of your questions: opening time. Opening time is made up of four components.
1. IO -- time to read from Disk.
2. De-compression. CPU time to decompress if the QVW was stored compressed (default).
3. Section Access dynamic data reduction.
4. Initial chart calculation.
As I said before, 4M is not a very big QV app, but you do have a significant number of columns. My first step would be to determine if you need all those columns. Use Document Analyzer (Downloads - Rob Wunderlich Qlikview Consulting) to identify unused fields and drop them if possible.
Some QVWs take a long time to de-compress. You may want to try it with compression=none (Document Properties, General pane) to see if that speeds things up.
In a server environment, you can neutralize the effects of steps 1 & 2 by using the Document Preload option.
Section Access reduction can take time. In my experience, when reducing on a large number of values, the time can be cut by selecting integer values (e.g. autonumber) rather than string values.
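Rob's integer-key point can be sketched like this (field names are hypothetical): build the reduction field with AutoNumber() so Section Access matches on integers instead of strings:

```
// Data model: integer surrogate for the reduction field
Fact:
LOAD *,
     AutoNumber(Region) as %REDUCTION
FROM Fact.qvd (qvd);

// The Section Access table then lists the same integer values
// in its %REDUCTION column. An added benefit: text values in
// Section Access must be upper case; integers avoid that pitfall.
```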
I would work on those items first and then use the chart calc time data in Sheet Properties to determine how much time is attributable to chart calc.
-Rob
You can find the document Rob mentioned here: Calculating rolling n-period totals, averages or other aggregations
@Rob: It looks like you bookmarked the document, so it should be in your bookmarks list.
Qlik Ranger wrote:
Thanks Gentlemen for your comments.
The link between 2 tables are linked correctly.
40 columns for dimensions.
180 columns for facts.
Frequently used fields like months, years, are already joined into the fact tables.
Ranger, so do you have 40 columns that you are joining your two tables on? Or is it just one common column in both tables, with the remaining 39 in the dimension table? This is important for QlikView performance. In other words, do you have any synthetic keys in your QlikView schema?
Thanks Rob for your advice.
Klicki, you can imagine the schema as having 2 boxes and one line.
No synthetic keys, no circular references.
I also saw your question in another post about why I am not joining them into a single table.
It's because I have a lot of columns, and joining might take up a lot of memory.
That's why I'm keeping one fact and one dimension table.
But I'm really interested to know, in general, how long QV Desktop takes to load a 2 TB qvw, or a qvw with 4 million rows. I need something as a benchmark. I understand it varies with the data model, the expressions, etc., but I just wanted a general idea, as I don't have a lot of big qvws.
Thanks.