I've run into an interesting averaging problem, and any help from you guys would be highly appreciated.
Basically, I have five columns: Day, Date, Future Date, ID, and Service. For each ID we provide different services and record the date each service was provided. We usually provide services on consecutive dates, but sometimes the dates differ; we flag these as Day 1, Day 2, and so on.
Also, in the Future Date column, each ID is assigned a future date. I want to calculate the average for each ID on a per-day basis.
Let's have a look at the table first:
Now these are the future dates:
Days | Future Date | ID
These dates and IDs should appear in a straight table, and when no future date is selected, the overall average should be calculated:
But when the user selects 07/01/2017, since this date is assigned to Day 1, only the past Day 1 dates are used to calculate the average:
Average = (4 + 10) / 2 = 7
Similarly, if the user selects 15/02/2017, since this date is assigned to Day 2, only the Day 2 dates for this ID are used in the average calculation:
Average = (5 + 10) / 2 = 7.5
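To make the intended logic concrete, here is a minimal sketch of the calculation in Python. The record layout, field names, and sample values (4 and 10 for Day 1; 5 and 10 for Day 2, mirroring the averages above) are my assumptions for illustration, not the actual data model:

```python
# Hypothetical records for a single ID; "day" is the Day 1 / Day 2 flag
# and "value" is the measure being averaged.
records = [
    {"id": "A", "day": "Day 1", "value": 4},
    {"id": "A", "day": "Day 1", "value": 10},
    {"id": "A", "day": "Day 2", "value": 5},
    {"id": "A", "day": "Day 2", "value": 10},
]

def average_for(records, selected_day=None):
    """Average over all records, or only those flagged with selected_day
    when the user's future-date selection maps to a specific day."""
    vals = [r["value"] for r in records
            if selected_day is None or r["day"] == selected_day]
    return sum(vals) / len(vals)

print(average_for(records))           # no selection: overall average
print(average_for(records, "Day 1"))  # (4 + 10) / 2 = 7.0
print(average_for(records, "Day 2"))  # (5 + 10) / 2 = 7.5
```

In Qlik terms, I imagine this corresponds to a set-analysis expression along the lines of `Avg({<Day={'Day 1'}>} Value)`, with the field names again being assumptions; the part I can't work out is driving that day filter from the user's Future Date selection.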
I know it's a bit tricky, but this is where I need your help! I've attached both the app and the data file if anyone is willing to take a look. Thanks!