Qlik Sense Gurus,
I'm seeking advice on the following use case. Each row of my data has two dates, a start and an end, and a value. For each month in 2019, I need a cumulative sum of the values from the rows whose start dates fall on or before that month, limited to the rows whose end dates are less than or equal to the last day of that month.
Here's a quick example:
Start | End | Value |
2/1/2019 | 12/1/2019 | 25 |
2/1/2019 | NULL | 50 |
2/1/2019 | 2/1/2019 | 100 |
12/1/2018 | 3/1/2019 | 200 |
12/1/2018 | 1/1/2019 | 300 |
11/1/2018 | 1/1/2019 | 400 |
10/1/2018 | 1/1/2019 | 500 |
Based on this data excerpt, here are the sums I'm looking to get:
Jan 2019 - Sum all except the first four rows (1st-3rd: start date falls outside of the current month, 4th: end date is greater than the end of the current month)
Feb 2019 - Sum all except the first, second and fourth rows (1st & 4th: end date is greater than the end of the current month, 2nd: no end date)
Mar 2019 - Sum all except the first and second rows (1st: end date is greater than the end of the current month, 2nd: no end date)
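For concreteness, the sums implied by this sample would be:
Jan 2019: 300 + 400 + 500 = 1200
Feb 2019: 100 + 300 + 400 + 500 = 1300
Mar 2019: 100 + 200 + 300 + 400 + 500 = 1500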
So far, I've come up with an expression to calculate a cumulative sum, but I'm having a hard time figuring out how to implement a condition that limits that sum by the other date. I know that set expressions are calculated once for the entire data set, but my use case implies finding the max/last date of each month in 2019 to compare with the end dates of the rows being accumulated up to that month.
Sum( Aggr( RangeSum( Above( Sum(Value), 0, RowNo() ) ), (StartMonth, (NUMERIC, ASCENDING)) ) )
I'm still promoting the As-Of table. The table is a Cartesian join of the calendar to itself; you then clean up the records where the calendar dates are greater than the As-Of dates. You end up with a table of As-Of dates with a record for every calendar_day that is equal to or prior to the As-Of date.
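A minimal load-script sketch of that idea, assuming a Calendar table with a Month field already exists (the table and field names here are illustrative, not from the original post):

```qlik
// Hypothetical As-Of table sketch. Assumes an existing Calendar table
// with a Month field.

TmpAsOf:
LOAD DISTINCT Month AS AsOfMonth
RESIDENT Calendar;

// Cartesian join of the calendar months to themselves
JOIN (TmpAsOf)
LOAD DISTINCT Month
RESIDENT Calendar;

// Keep only the months at or before each As-Of month
AsOfTable:
NOCONCATENATE LOAD AsOfMonth, Month
RESIDENT TmpAsOf
WHERE Month <= AsOfMonth;

DROP TABLE TmpAsOf;
```

Each AsOfMonth then links to every Month at or before it, which is what makes cumulative expressions work without Above()/RangeSum().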
So Sum({<End=P(calendar_day)>} Value)
When you select the As-Of date Jan 2019, you should only get the last 3 rows
When you select the As-Of date Feb 2019, you should get the 3rd row and the last 3 rows
When you select the As-Of date Mar 2019, you should get the last 5 rows
I could be wrong.
It sounds like the end date is really the key. I like the As-Of Table described in the link.
Hi Jwjackso,
Thanks for referring to the As-Of table. I've given that a thought too but still couldn't see how to implement my use case with two dates, one of which is used to build a cumulative sum while the other limits that sum based on the max date of each period.
If you build the As-Of table off the end date, couldn't you just sum on the As-Of 2/1/2019 date and that would give you all the rows except for the second row.
Ah, I see where you are coming from. I provided a pretty limited data sample that gave a false impression of the end date being the key here. In fact, it's the start date that is always present, while the end date can be NULL (which signifies that an item is not closed yet) and can also point several months into the future. Please see the updated example for more details.
So I need a cumulative sum of all items that were started up to the current month and were ended no later than the end of the same month.
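One way to sketch that with an As-Of table linked to the facts on the start date (the field names AsOfMonth, Start, End, and Value are assumptions, not confirmed in this thread): with AsOfMonth as the chart dimension, the As-Of link already restricts each month's rows to those started on or before it, and a row-level If() can apply the end-date cap:

```qlik
// Hypothetical: As-Of table joined to the facts on the start date,
// AsOfMonth used as the chart dimension.
Sum( If( Len(End) > 0 and End <= MonthEnd(AsOfMonth), Value ) )
```

A NULL End fails the Len(End) > 0 test, so items that are not closed yet are excluded, matching the requirement above. This is a sketch, not a tested solution.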
You're making a good point, thank you!
It turned out to be a couple more variables in the equation I'm trying to solve, but it looks like an As-Of table can be a good enough solution.