I have time-based data that contains a few key columns as follows:
Time Stamp, device_id, status, latency, region, duration. The duration here is in seconds.
The status of a device can be either in network or out of network, and the duration simply shows how long that device has been in its current status.
I'd like to create a chart showing the total duration (in hours) per device_id when the device is still considered in the network but the latency value is zero or blank. Since the data is reported periodically, the duration value keeps changing, so I'd like to take only the maximum duration for each device in each observation period. I tried set analysis like below with a max value, but it is not working.
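To make the intended calculation concrete, here is a minimal Python sketch (not Qlik, just modeling the logic the chart expression should implement; the sample rows and device names are made up for illustration): for each observation period, take the maximum duration per device among IN_NETWORK rows whose latency is zero or blank, sum those maxima, and convert seconds to hours.

```python
from collections import defaultdict

# Hypothetical sample rows (timestamp, device_id, status, latency, duration_secs);
# the values are invented for illustration only.
rows = [
    ("00:00", "dev1", "IN_NETWORK", 0,    900),
    ("00:05", "dev1", "IN_NETWORK", 0,    1845),
    ("00:00", "dev2", "IN_NETWORK", None, 15000),
    ("00:05", "dev2", "IN_NETWORK", None, 15477),
    ("00:05", "dev3", "OUT_OF_NETWORK", 0, 5000),   # excluded: not IN_NETWORK
    ("00:05", "dev4", "IN_NETWORK", 42,   7000),    # excluded: non-zero latency
]

def hours_per_period(rows):
    """For each observation period, take the max duration per device among
    IN_NETWORK rows with zero or blank latency, sum those maxima, and
    convert seconds to hours."""
    max_per = defaultdict(int)          # (period, device) -> max duration seen
    for ts, device, status, latency, duration in rows:
        if status == "IN_NETWORK" and latency in (None, "", 0):
            max_per[(ts, device)] = max(max_per[(ts, device)], duration)
    totals = defaultdict(int)           # period -> total seconds
    for (ts, _device), dur in max_per.items():
        totals[ts] += dur
    return {ts: secs / 3600.0 for ts, secs in totals.items()}

print({ts: round(h, 2) for ts, h in hours_per_period(rows).items()})
# {'00:00': 4.42, '00:05': 4.81}
```

This is the "sum of per-device maxima per period" shape the chart needs: an inner max grouped by device and period, then an outer sum over devices within each period.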
As you can see in the script, I have excluded everything other than the IN_NETWORK status from the duration calculation, and as you suggested I have removed the customer_id dimension. Now I think the chart sums the max values per customer for a timestamp dimension. I think you can take it from here.
Sharing some screenshots to support my point -
As you can see, I have selected 2 customers and also a specific timestamp just to verify the data, with latency 0 selected and duration > 0 selected; the status column confirms that only IN_NETWORK rows are included. This proves that the script is working fine.
Now here is the chart -
as per first image -
Max duration for the first customer - 1845 secs
Max duration for the second customer - 15477 secs
Sum of the maxima for that time period = 17322 secs, or 4.81 hrs; the chart shows the same amount.
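The arithmetic above checks out; a quick sanity check in plain Python, using the two per-customer maxima read off the screenshots:

```python
# Per-customer maxima from the screenshots above.
max_durations_sec = [1845, 15477]

total_sec = sum(max_durations_sec)
total_hrs = total_sec / 3600

print(total_sec, round(total_hrs, 2))  # 17322 4.81
```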