Hi, we are ready to build a sales performance analysis, but I have an issue regarding the script. I hope you can help me solve it.
Background:
We have an order detail table (6 million rows).
In business:
We need both the monthly order value and the period-to-date order value. For example, I need the total July order value and also the total order value from January to July, calculated when I load the order table.
Solution 1. Load the whole order detail table, map it to a master date table, and use the Sum() and RangeSum() functions in set expressions to calculate the values. But I have heard that RangeSum() is quite slow in set expressions.
Solution 2. Calculate in the script instead. In that case, how should I build the model? Should I create two tables, one for the detail and another for the period?
To create such views, no RangeSum() over multiple period results is necessary; a Sum() with a more or less advanced set analysis is enough, maybe like:
sum({< Month = {"<=$(=month(addmonths(today(), -1)))"}>} Value)
Depending on the real expressions and object dimensions, you may also need to set the Year and/or adjust the set to grab the current/previous month/week or whatever.
It's not really complicated. But it could of course be even easier if things like YTD, PYTD and MTD are flagged with 0/1 within the master calendar. Then the expressions may look like:
sum({< YTD = {1}>} Value) or also sum(Value) * YTD
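A minimal sketch of how such 0/1 flags could be built within a master calendar in the load script (the field names OrderDate, MinDate and the source table name detail_table are assumptions for illustration, not from the original script):

```
MasterCalendar:
LOAD
    TempDate as OrderDate,
    Year(TempDate) as Year,
    Month(TempDate) as Month,
    // 1 if the date falls within the current year up to today, else 0
    If(InYearToDate(TempDate, Today(), 0), 1, 0) as YTD,
    // 1 if the date falls within the previous year up to the same day, else 0
    If(InYearToDate(TempDate, Today(), -1), 1, 0) as PYTD,
    // 1 if the date falls within the current month up to today, else 0
    If(InMonthToDate(TempDate, Today(), 0), 1, 0) as MTD;
// generate one row per day between the earliest order date and today
LOAD
    Date(MinDate + IterNo() - 1) as TempDate
WHILE MinDate + IterNo() - 1 <= Today();
LOAD
    Min([your_date_field]) as MinDate
RESIDENT detail_table;
```

With these flags in place, a chart expression like sum({< YTD = {1}>} Value) needs no date arithmetic at all.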
There is a lot of useful information about this in How to use - Master-Calendar and Date-Values - Qlik Community - 1495741.
Such an approach is very fast and will work even with much larger data sets and low hardware resources. With only 6 million records, a normal office laptop should be sufficient. Therefore I wouldn't go with an aggregation within the script, and if I did, I would rather not create an extra fact table, even if it is handled as a dimension table; instead I would concatenate the aggregated records onto the original fact table to create a fact table with mixed granularity.
- Marcus
Hi, I would choose solution 2. You just load your detail table and can additionally add a 'Period' field (MonthEnd([your_date_field])), which can be used as a key. From that table you can aggregate your sales. So the script could look like this:
detail_table:
LOAD
*,
MonthEnd([your_date_field]) as Period
FROM ...;
aggregated_table:
LOAD
Period,
Sum([your_sales_field]) as AggrSales
RESIDENT detail_table
GROUP BY Period;
You can also implement RangeSum() on aggregated_table. Unfortunately, this solution will not be very 'friendly' if you need to filter by something like customer, shop etc., as per period you only have aggregated sales. So I would first try to use set analysis to sum everything from the detail table (without RangeSum()) to see whether it is really too hard for Qlik to calculate such data. For this, something like this could be used:
SUM({$<[your_date_field] = {'>=$(=YearStart(Today()))<=$(=MonthEnd(Max([your_date_field])))'}>} Sales)
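If you do decide to precalculate the period-to-date totals in the script as well, a running total can be built over the aggregated_table from the script above with Peek(). This is only a sketch under the assumption that the table and field names from that script are used, and that the cumulation should restart each year:

```
cumulated_table:
LOAD
    Period,
    AggrSales,
    // running total: add this period's sales to the previous row's cumulative
    // value; RangeSum() treats the null Peek() on the first row as 0, and the
    // Year comparison restarts the cumulation at each new year
    If(Year(Period) = Year(Peek('Period')),
       RangeSum(Peek('CumSales'), AggrSales),
       AggrSales) as CumSales
RESIDENT aggregated_table
ORDER BY Period;
```

The ORDER BY Period is essential here, since Peek() reads the previously loaded row.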
I really appreciate your answer. It enlightened me; I had not thought of using this method.