I don't understand what you want to do with the bridge table. Normally one isn't needed to link a fact table to a master calendar. If you have many different date fields, you could use this approach: Canonical Date.
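For reference, the Canonical Date pattern looks roughly like this (the table and field names below are illustrative, not from your model): a bridge table with one row per fact key per date role, all feeding a single master calendar:

```
// Hypothetical fact table "Orders" with two date fields.
// The bridge maps every date role onto one CanonicalDate field.
DateBridge:
LOAD OrderID, OrderDate AS CanonicalDate, 'Order' AS DateType RESIDENT Orders;
LOAD OrderID, ShipDate  AS CanonicalDate, 'Ship'  AS DateType RESIDENT Orders;
```

The master calendar then links on CanonicalDate, and the DateType field lets each chart select which date role it counts by.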
My data model contains a slowly changing dimension called "Transfers". It records the periods during which a member belongs to a certain "Home Club" (a numeric ID). It is a table with the fields: From_Date, To_Date, Person_ID, Home_Club_ID.
What I'm trying to do is choose a point in time (a day) and count how many people are listed at each "Home Club" on that date.
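To make it concrete, here is a minimal illustrative sample (the values are made up):

```
Transfers:
LOAD * INLINE [
From_Date, To_Date, Person_ID, Home_Club_ID
2015-01-01, 2015-12-31, 1, 10
2015-03-01, , 2, 20
];
```

Selecting e.g. 2015-06-15 should then give a count of 1 for club 10 and 1 for club 20 (the second period is still open, so it has no To_Date yet).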
I used the code from the post by HIC in:
in order to link each day to the appropriate periods.
Is this the correct approach to what I'm trying to do?
I think that is the correct article and approach. To handle the open-ended dates (I assume you mean that TO_DATE would be null), you could try something like:
While IterNo() <= Alt(TO_DATE - FROM_DATE + 1 , 90);
Pick a fallback value (here 90 days) that makes sense for your data set, or else try:
While IterNo() <= Alt(TO_DATE, Today()) - FROM_DATE + 1;
to use today's date as the fill-in for null TO_DATEs.
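Putting it together, a full sketch of the expansion load (assuming the source table is named Transfers and uses the uppercase field names above) would look something like:

```
// Expand each membership period into one row per day.
// Alt() falls back to Today() when TO_DATE is null (open-ended period).
DayByDay:
LOAD Person_ID,
     Home_Club_ID,
     Date(FROM_DATE + IterNo() - 1) AS Calendar_Date
RESIDENT Transfers
WHILE IterNo() <= Alt(TO_DATE, Today()) - FROM_DATE + 1;
```

Calendar_Date then links to your master calendar, and a chart with dimension Home_Club_ID and expression Count(DISTINCT Person_ID) gives the point-in-time headcount when a single date is selected.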
Thanks for your reply,
I thought about using a value relative to Today() as the end date (e.g. Today() + 5 years), but I was concerned about the efficiency of having to update all (or most of) my periods every day. I was hoping for a setup where I only import new data and changes to existing entries, instead of reloading my whole data set on a daily basis.
This still seems like a strange way to approach it. Eventually I will end up with a table that has one row per person per day since they joined, so its size is people multiplied by days: with 2 million people over a period of 10 years (roughly 3,650 days), that is on the order of 7 billion rows, which sounds like a whole lot of data that should be optimized somehow.
I know that Qlik has some good methods to normalize and compress data. Do you know how feasible it is to work with a data set of that size?