I need to show on a graph both raw sales and sales adjusted for menu price changes over time. I can almost get this to work for a single location, but once I add a second (or more) location it breaks.
The final result will look something like this; in effect, it shows today's sales in yesterday's dollars.
There are several challenges I need to overcome.
I have attached a table of made-up data as an example. The formula I used to create the "Initial Menu Factor" column is:
Aggr(Min(TOTAL <[Unit]> MENUFACTOR), Unit, Period, FY)
This almost works, but note that the amount calculated for Location 1 is 2.7731 when it should be 2.7741. It returned 2.7731 because that was the lowest factor for that location in the selected time frame, which I am sure is a result of using Min inside the Aggr. For the other two locations the formula only works by coincidence, because their first date also happens to have the lowest factor. What I really want is the factor from the first period, not the minimum.
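One direction I have been wondering about (not sure it is the right function) is anchoring the baseline to the earliest period instead of the lowest value, for example with FirstSortedValue. Here PeriodStart is a hypothetical field standing in for whatever sorts the periods chronologically:

Aggr(FirstSortedValue(TOTAL <[Unit]> MENUFACTOR, PeriodStart), Unit, Period, FY)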
"Incremental Factor" is calculated as:
Aggr(Avg(MENUFACTOR), Unit, Period, FY) /
Aggr(Min(TOTAL <[Unit]> MENUFACTOR), Unit, Period, FY)
"Menu Adjusted Sales" is calculated as:
Sum(Net_Sales) /
(Aggr(Avg(MENUFACTOR), Unit, Period, FY) /
 Aggr(Min(TOTAL <[Unit]> MENUFACTOR), Unit, Period, FY))
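As a side note, since the Incremental Factor is just the average divided by the baseline, dividing by it is the same as multiplying by the baseline and dividing by the average, so the same measure can also be written as:

Sum(Net_Sales) *
Aggr(Min(TOTAL <[Unit]> MENUFACTOR), Unit, Period, FY) /
Aggr(Avg(MENUFACTOR), Unit, Period, FY)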
These work in a table (except for the Min issue) because the table evaluates row by row. But once I try to graph it and combine multiple locations, the Aggr expressions no longer work.
What I need it to do is calculate the Menu Adjusted Sales for each period for each location, and then combine them by year on the graph; a sketch of what I mean follows.
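If I understand the usual pattern, I suspect the answer involves wrapping the whole per-location, per-period calculation in an outer Sum over the Aggr, so that a chart dimensioned by year just adds up the pre-computed rows. This is only a sketch of the shape I mean, reusing my fields and keeping the Min baseline for now:

Sum(Aggr(
    Sum(Net_Sales) * Min(TOTAL <[Unit]> MENUFACTOR) / Avg(MENUFACTOR),
    Unit, Period, FY
))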
Any ideas or questions?