schmidtj
Creator II

Generate range data for minutes?

Hi,

I have a question/problem concerning range data.
I have the following test data:

[DATA]:
LOAD * INLINE [
'User', 'Zone', 'Datetime entry', 'Duration'
'User 1', 'Zone 1', '2020-01-31 00:26', '00:30'
'User 1', 'Zone 1', '2020-01-31 14:00', '01:10'
'User 1', 'Zone 1', '2020-01-31 18:15', '02:23'
'User 1', 'Zone 1', '2020-01-31 23:30', '00:10'
];

The entry of each user into a zone is logged.
The datetime of that entry is recorded to the minute, and in addition I have a duration for how long the user stayed in that zone.
The goal, in the end, is to select ranges in a line chart and count the distinct number of users in that time range.
I was wondering how to accomplish that with the data at hand, and one idea would be to expand the duration into one row per minute.
In other words, taking the first row as an example, I would generate 30 new rows, one for each minute after 00:26.

Is that a good idea, or does somebody have a better one? I imagine it is quite resource-consuming.
If it is the way to go, could somebody give me some advice on how best to do it?
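For reference, the per-minute expansion described above could be sketched in load-script terms like this (the table name PerMinute and the timestamp/interval format strings are assumptions based on the sample data):

PerMinute:
LOAD User,
     Zone,
     // One iteration per minute of the stay, starting at the entry minute.
     Timestamp(Timestamp#([Datetime entry], 'YYYY-MM-DD hh:mm')
               + (IterNo() - 1) / 1440) as Minute
RESIDENT DATA
WHILE IterNo() <= Round(Interval#(Duration, 'hh:mm') * 1440);

For the first sample row ('00:30' duration) this would produce 30 rows, one per minute from 00:26 onward.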

Thanks a lot!

1 Solution

Accepted Solutions
sunny_talwar

Instead of generating a row for each minute with all the data in it... you can look into performing an IntervalMatch.
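A minimal sketch of the IntervalMatch approach against the test data above (the Minutes table, the single-day AUTOGENERATE, and the format strings are my assumptions):

// Turn each logged entry into an interval: entry time plus duration.
Visits:
LOAD User,
     Zone,
     Timestamp#([Datetime entry], 'YYYY-MM-DD hh:mm') as EntryTime,
     Timestamp#([Datetime entry], 'YYYY-MM-DD hh:mm')
       + Interval#(Duration, 'hh:mm') as ExitTime
RESIDENT DATA;

// One candidate row per minute of the day in question.
Minutes:
LOAD Timestamp(MakeDate(2020, 1, 31) + (RecNo() - 1) / 1440) as Minute
AUTOGENERATE 1440;

// Link every minute to all intervals that contain it.
IntervalMatch (Minute)
LOAD EntryTime, ExitTime
RESIDENT Visits;

A chart with Minute as the dimension and Count(DISTINCT User) as the expression would then show the number of users present per minute.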


5 Replies

schmidtj
Creator II
Creator II
Author

Thanks for the reply.

I tried that, but it is quite heavy on server performance.

Is there maybe a lighter-weight way to do it?

sunny_talwar

As far as I know... I don't think there is any way that would give you good performance. What you are trying to do is resource-intensive, and I don't think you can do much about it. But let's see if @marcus_sommer has any suggestions.

marcus_sommer

I think Sunny is right: it is in general a rather resource-intensive transformation. But that doesn't necessarily mean that this kind of transformation, and the related calculations, couldn't work sufficiently well within an application.

As always, a well-performing application depends on a well-designed data model and effective calculations within the UI. Personally, I could imagine creating the needed table as a dimension table attached to the fact table (although, given the kind of data, it is really a fact table itself). Further, I would try to split the timestamp into a date field and a time field - it may not be trivial if entry + duration leads to a day shift - and would also use autonumber() for the key.
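A sketch of that split, assuming the expanded table is called Expanded and carries a per-minute timestamp field Minute (both names hypothetical):

SplitKeys:
LOAD AutoNumber(Minute)          as %MinuteKey,  // compact integer key
     Date(Floor(Minute))         as VisitDate,   // date part of the timestamp
     Time(Frac(Minute), 'hh:mm') as VisitTime    // time-of-day part
RESIDENT Expanded;

Separate date and time fields compress much better than a full timestamp, since each field has far fewer distinct values.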

Another way to reduce the needed dataset would be to cluster the times into buckets of maybe 30 minutes or something similar. For many views of the data, fully variable time frames would not add real value.
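That clustering could be sketched like this (the 30-minute width, the field name Slot30, and the format strings are assumptions):

Clustered:
LOAD User,
     Zone,
     // Floor to the nearest 30-minute slot: 1/48 of a day.
     Timestamp(Floor(Timestamp#([Datetime entry], 'YYYY-MM-DD hh:mm'), 1 / 48),
               'YYYY-MM-DD hh:mm') as Slot30
RESIDENT DATA;

This caps the number of time buckets at 48 per day regardless of how many entries there are.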

- Marcus

schmidtj
Creator II
Author

Ok, thanks a lot for the insight!