I nested a large calculation into a variable to keep the calculation consistent across the app, and I'm running into an issue when using that variable as a dimension.
Example of the calculation in the variable:
(
if((If(Sum([Points Deducted])>0,Sum([Points Deducted]),'N/A')-Sum(Total_Addressed))=Validator_Image,1,0)
+
if(wildmatch(Estimating_3rdParty_Check,Validator_3rdPartyEstimating&'*') and Estimating_3rdParty_Check <> 'Mitchell',1,0)
+
if(OneManagement_Check = Validator_Management,1,0)
+
if(ThirdParty_Check = Validator_3rdParty,1,0)
+
if(WSBR_Reporting_Check = Validator_WSBRReporting,1,0)
...
In the table, I'm attempting to count the number of locations where the variable evaluates to >9, 10, 11, etc.
Example of the desired table output:
COUNT OF CERTIFICATIONS
                    >9 | 10 | 11 | 12
Count of Locations:  4 | 15 | 12 |
You used the aggr() tag on this post, but it doesn't appear in your example. To use this expression as a dimension, you need to wrap it in an Aggr() over the field you want it evaluated per, like:
-Aggr(yourcalc, Location)
-Rob
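A minimal sketch of what that could look like, assuming the calculation lives in a variable named vCertScore and the location field is named Location (both hypothetical names for this illustration):

```
// Calculated dimension: evaluate the score once per location.
// $(vCertScore) expands the variable's expression text before evaluation.
Aggr($(vCertScore), Location)

// Measure paired with that dimension: how many locations fall in each score.
Count(DISTINCT Location)
```

With Aggr() as the dimension and the count as the measure, the chart groups locations by their score (10, 11, 12, ...). A ">9" bucket would still need separate handling, for example a conditional inside the Aggr() that maps scores below 10 to a single '>9' label.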