Hi QV gurus,
I have a challenging problem, which I describe here and in the attached example document.
The goal is to calculate the solvability KPI across the dimensions Company and Period.
Solvability is defined as follows:
Solvability = Equity / (Equity + Debt)
The problem I have lies in the denominator of the expression. Some account groups may only be included in the denominator if the underlying accounts of that account group sum to more than zero.
In this simple example the expression is manageable. In reality, however, I have 25 account groups, which makes the expression too long and complex and causes poor performance; see the sketch below.
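To make that concrete, the long-hand denominator looks something like the following (a sketch only; the field names Amount and AccountGroup come from the example, group numbers beyond 1 are invented for illustration, and in the real document one such IF term would be repeated for each of the 25 account groups):

= NUM(
    SUM({$<AccountGroup={1}>} Amount)
    /
    ( IF(SUM({$<AccountGroup={1}>} Amount) > 0, SUM({$<AccountGroup={1}>} Amount), 0)
    + IF(SUM({$<AccountGroup={2}>} Amount) > 0, SUM({$<AccountGroup={2}>} Amount), 0)
    + IF(SUM({$<AccountGroup={3}>} Amount) > 0, SUM({$<AccountGroup={3}>} Amount), 0) ),
  '##0,0%')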
Question:
1) How to simplify the expression?
(In my opinion, calculating this in the script is not an option, because I need the flexibility of the Company and Period dimensions. That is, an account group can drop below zero when a certain company or period is selected.)
Hi,
you can use the function AGGR in your expression:
= NUM(
    SUM({$<AccountGroup={1}>} Amount)
    /
    SUM(IF(AGGR(SUM(Amount), AccountGroup) > 0,
           AGGR(SUM(Amount), AccountGroup))),
  '##0,0%')
This way you don't need to write out 25 separate terms, one per AccountGroup.
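A variant of the same idea (a sketch, assuming the same Amount and AccountGroup fields, not part of the original reply) moves the condition inside the AGGR, so the per-group sum is only written once:

= NUM(
    SUM({$<AccountGroup={1}>} Amount)
    /
    SUM(AGGR(IF(SUM(Amount) > 0, SUM(Amount)), AccountGroup)),
  '##0,0%')

In both forms, AGGR(SUM(Amount), AccountGroup) builds a virtual table with one total per account group, and the outer SUM adds up only the positive totals.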
Thank you very much Eduardo!
Please do complex calculations at script level. It improves performance at design level.
Sunil Jain wrote: Please do complex calculations at script level. It improves performance at design level.
Actually, I would strongly recommend AGAINST doing this calculation in the script except as a last resort if you have serious, serious performance problems that you simply cannot fix in any other way. I would recommend that data always be stored at as low a level as possible, and that calculations should only be done in the script if they can be done at that lowest level. This calculation is done at a group level, so I feel it would be inappropriate to do it in the script.
To pre-calculate this data is to "lock" yourself into a specific way of looking at your data, a specific hierarchy, similar to building data cubes. Today, we want a chart that aggregates by account group, company and period. So we build a data structure specific to that chart. Tomorrow we want a new chart that aggregates by document and year. So we build a second data structure specific to that new chart. After several months of adding new charts, your data model is in complete disarray, a monster of spaghetti that continually creates unintended connections that you have to stamp out every time you want to do something as simple as adding a new chart. Exaggeration? Sure. But being able to create charts on the fly WITHOUT creating specific data structures for each chart is a big part of WHY you use QlikView, and WHY it uses an in-memory data model.
Don't just sacrifice one of the main advantages of QlikView IN CASE you might someday have performance problems. QlikView makes on-the-fly aggregation not only possible, but very quick in most cases. Take advantage of that. Wait for performance problems to actually occur. Then try to fix the problems in some other way. Only do aggregations in the script as a last resort.
Edit: I forgot one of the biggest reasons to avoid aggregations in your script - those aggregations will not always be sensitive to user selections, since the users can select some but not all of the records that go into a particular aggregation. If you were using set analysis to IGNORE all selections, then this wouldn't apply to you, but typically you want your charts to respond correctly to user selections.
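For reference, ignoring selections with set analysis looks something like this (a sketch, assuming fields named Company and Period as in the question):

SUM({1} Amount)
SUM({$<Company=, Period=>} Amount)

The first form ignores all selections ({1} is the full data set); the second clears only any selections in the Company and Period fields. As noted above, though, you usually want charts to respond to selections, which the AGGR expression does.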