Not applicable

Open-ended periods vs. Calendar

Hello,

I'm fairly new to Qlik, and I'm working on designing my data model. I've been trying to set up a model where periods are linked to a master calendar table.

This worked fine for periods with an end date. However, when I included open-ended periods in my data set, I ran into an issue. Here is my code for creating the bridge table between the Transfers table (TRANS) and the Master Calendar table:

// ============ Create the Transfers_Calendar ============
// Preceding load, read from the bottom up: the bottom Load finds the
// overall date range in TRANS, the middle Load generates one row per
// day in that range, and the top Load derives the calendar fields.

Trans_Calendar:
Load
    FROM_DATE as Trans_DATE,
    Month(FROM_DATE) as Trans_MONTH;
Load
    Date(MinDate + IterNo()) as FROM_DATE
    While IterNo() <= MaxDate - MinDate;
Load
    Min(FROM_DATE) - 1 as MinDate,
    Max(TO_DATE) as MaxDate
Resident TRANS;

// ============ Create the bridge table between the Transfers and the Transfers_Calendar ============
// One row per TRANS_ID per day between FROM_DATE and TO_DATE.
// Left(..., 10) keeps only the first 10 characters of the formatted date.

Trans_x_Dates:
Load
    TRANS_ID,
    Left(Date(FROM_DATE + IterNo() - 1), 10) as Trans_DATE
Resident TRANS
While IterNo() <= TO_DATE - FROM_DATE + 1;

It's pretty obvious that this won't create a link when the end date is null.

I've looked into a solution where I create a fictional end date (like 2099-12-31), but that increases the size of my data by a very large factor.

This is the most common scenario for me, as I use the periods to show changes in a client's "Status" (Active, Inactive, Frozen), and most of the periods don't have an end date.

What is the best practice for handling open-ended periods?

Thanks in advance.

5 Replies
marcus_sommer

I don't understand what you want to do with the bridge table. Normally it isn't needed to link a fact table to a master calendar. For many different date fields you could use this approach: Canonical Date.

- Marcus
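
For context, the canonical-date pattern behind that link boils down to a small link table, roughly like this (a sketch with illustrative table and field names, not code from the thread):

// Hypothetical illustration of a canonical-date link table:
// one row per fact row per date role.
DateBridge:
Load TRANS_ID, FROM_DATE as CanonicalDate, 'From' as DateType
Resident TRANS;

Load TRANS_ID, TO_DATE as CanonicalDate, 'To' as DateType
Resident TRANS
Where Len(TO_DATE) > 0;  // open-ended rows have no TO_DATE

The two Loads concatenate automatically since the field names match; the master calendar then attaches to CanonicalDate, and DateType lets a chart restrict to the role a date plays.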

Not applicable
Author

Hey Marcus,

My data model contains a slowly changing dimension called "Transfers". It shows the periods during which a member belongs to a certain "Home Club" (a numerical value). It's a table with From_Date, To_Date, Person_ID, and Home_Club_ID.

What I'm trying to do is choose a point in time (a day) and count how many people are listed at each "Home Club" at that point in time.

I used code from the post by HIC:

Creating Reference Dates for Intervals

in order to link each day to the appropriate periods.

Is this the correct approach to what I'm trying to do?
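
Once such a link table exists, the point-in-time count described above should reduce to selecting a single date and counting, along these lines (a sketch using the field names from this thread):

// Chart dimension: Home_Club_ID; the user selects one value of Trans_DATE
Count(distinct Person_ID)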

marcus_sommer

I think these links fit your case better:

IntervalMatch

IntervalMatch and Slowly Changing Dimensions

- Marcus
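
For reference, a minimal sketch of what the extended IntervalMatch from those links could look like for this model (assuming TRANS holds FROM_DATE, TO_DATE, and Person_ID, and that a calendar field Trans_DATE is already loaded):

Bridge:
// Extended IntervalMatch: match each calendar date to the interval(s)
// it falls in, per person. Check the linked articles for how rows with
// a null TO_DATE are treated before relying on this.
IntervalMatch (Trans_DATE, Person_ID)
Load distinct FROM_DATE, TO_DATE, Person_ID
Resident TRANS;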

jonathandienst
Partner - Champion III

I think that is the correct article and approach. To handle the open-ended dates, I assume you mean that TO_DATE would be null. You could try something like:

While IterNo() <= Alt(TO_DATE - FROM_DATE + 1, 90);

Select a value that makes sense in your data set, or else try:

While IterNo() <= Alt(TO_DATE, Today()) - FROM_DATE + 1;

to use today's date as the 'fill-in' for null TO_DATEs.

Logic will get you from A to B. Imagination will take you everywhere. - A. Einstein
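
Folding that suggestion back into the bridge-table load from the original post would look something like this (a sketch; Today() stands in for the missing end date):

Trans_x_Dates:
Load
    TRANS_ID,
    Left(Date(FROM_DATE + IterNo() - 1), 10) as Trans_DATE
Resident TRANS
// Alt() returns its first argument with a valid number representation,
// so open periods expand up to today instead of producing no rows.
While IterNo() <= Alt(TO_DATE, Today()) - FROM_DATE + 1;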
Not applicable
Author

Thanks for your reply,

I thought about using a value relative to Today() as the end date (e.g. today + 5 years), but I was concerned about the efficiency of having to update all (or most of) my periods every day. I was hoping for a setup where I only import new data and changes to existing entries, instead of reloading my whole data set on a daily basis.

Also, this seems like a strange way to approach the problem. Eventually I will end up with a table that has an entry for each person for each day since they started, so the size is essentially people multiplied by days. With 2 million people over a period of 10 years, that sounds like a whole lot of data that should be optimized somehow.

I know that Qlik has some good methods to normalize and compact data. Do you know how feasible it is to work with a data set like that?