sridhar_sigired
Creator

Many to many relation data loading issue

Hi All,

I have a FACT table with the information below.

| DateUID | Sales |
|---------|-------|
| 1       | 1000  |
| 1       | 1000  |
| 1       | 156   |
| 1       | 26    |
| 1       | 258   |
| 2       | 285   |
| 3       | 100   |
| 3       | 1458  |
| 3       | 265   |
| 4       | 2558  |
| 4       | 1245  |
| 4       | 2655  |
| 4       | 254   |

The Date dimension table is below:

| DateUID | Date     | TimeSlotHour | Time Zone PST Date | Time Zone IST Date |
|---------|----------|--------------|--------------------|--------------------|
| 1       | 3/3/2016 | 1            | 3/3/2016           | 3/4/2016           |
| 2       | 3/3/2016 | 2            | 3/3/2016           | 3/4/2016           |
| 3       | 3/3/2016 | 3            | 3/3/2016           | 3/4/2016           |
| 4       | 3/3/2016 | 4            | 3/3/2016           | 3/4/2016           |
| 5       | 3/3/2016 | 5            | 3/3/2016           | 3/4/2016           |
| 6       | 3/3/2016 | 6            | 3/3/2016           | 3/4/2016           |
| 7       | 3/3/2016 | 7            | 3/3/2016           | 3/4/2016           |
| 8       | 3/3/2016 | 8            | 3/3/2016           | 3/4/2016           |
| 9       | 3/3/2016 | 9            | 3/3/2016           | 3/4/2016           |
| 10      | 3/3/2016 | 10           | 3/3/2016           | 3/4/2016           |
| 11      | 3/3/2016 | 11           | 3/3/2016           | 3/4/2016           |
| 12      | 3/3/2016 | 12           | 3/3/2016           | 3/4/2016           |
| 13      | 3/3/2016 | 13           | 3/3/2016           | 3/4/2016           |

I need to link on DateUID. When I link it, the relation is many-to-one (Fact to Dim); that is fine.

However, I am dividing the Date dimension table by time zone (e.g. PST and IST) and concatenating the parts, so that I can show all time zones in a list box for selection.
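The split-and-concatenate step described above might look like the following load-script sketch (the table name DateDim and the field CalendarDate/TimeZone are assumptions; the other fields come from the dimension table shown earlier):

```
// Build one calendar row per DateUID per time zone, so that a
// TimeZone field can be shown in a list box for selection.
// Assumes the joined dimension table is loaded as DateDim.
Calendar:
LOAD DateUID, Date, [Time Zone PST Date] AS CalendarDate, 'PST' AS TimeZone
RESIDENT DateDim;

Concatenate (Calendar)
LOAD DateUID, Date, [Time Zone IST Date] AS CalendarDate, 'IST' AS TimeZone
RESIDENT DateDim;

DROP TABLE DateDim;
```

This is exactly what produces the many-to-many relation: each DateUID now appears once per time zone in the calendar table.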

The problem is that this creates a many-to-many relation, and the load is taking a very long time; I don't even know whether it will work at all.

Can anyone suggest how we can fulfill this requirement?

Thanks in advance.   

1 Solution

Accepted Solutions
Anonymous
Not applicable

The attached qvw seems to work, but I am not sure about the combined calendar for all time zones that you mention.

Could you clarify what is not working for you?


10 Replies
Anonymous
Not applicable

I would suggest doing the grouping in the fact table and linking it to your DIM table, which would give a 1:1 relationship, like this:

FactTable:
LOAD
    DateUID,
    Sales
FROM FactTable;

NoConcatenate
FinalFact:
LOAD
    DateUID,
    Sum(Sales) AS Sales
RESIDENT FactTable
GROUP BY DateUID;

DROP TABLE FactTable;

Now FinalFact will link to the Dim table 1:1. Hope this helps!

sridhar_sigired
Creator
Creator
Author

Because I concatenate the dimension table once per time zone (there are 8 time zones, so I concatenate 8 copies), each UID appears 8 times, and the relation becomes 1:M.

Apart from grouping, is there any other suggestion? If I group, the majority of my KPIs would need to be modified, since they are based on Count().

Anonymous
Not applicable

It will be 8 times only, since each UID combined with a different time zone makes a unique combination, which I don't think is avoidable.

Let's wait for more opinions from other experts.

sridhar_sigired
Creator
Creator
Author

Can anyone advise me on this?

How can I achieve this requirement?

MayilVahanan

Hi,

Instead of concatenating per time zone, you could use a Join, right?

Thanks & Regards, Mayil Vahanan R
Please close the thread by marking correct answer & give likes if you like the post.
sridhar_sigired
Creator
Creator
Author

Hi,

The DateTime table is already joined; it is that joined table which we are dividing and concatenating.

If I use the joined table as it is, I cannot show the list of time zones through which the calendar needs to filter.

I want a combined calendar for all time zones, so that I can show one time zone with 'always one selected value' in a list box.

That way the end user can select whatever time zone he wants.

Anonymous
Not applicable

The attached qvw seems to work, but I am not sure about the combined calendar for all time zones that you mention.

Could you clarify what is not working for you?

sridhar_sigired
Creator
Creator
Author

Hi Bill,

I tested this logic with sample data before implementing it with production data.

Yes, it works.

I don't know why it is not working with the big data set.

Fact:
LOAD DateUID, ... ;

Left Keep
DIM:
LOAD DateUID, ... ;

I am using Left Keep on the key field, and after a few hours the reload fails.

The log shows a keep/join error.

My Date dimension extends to 31 Dec 2017, which is why I am using Left Keep, so that may be the reason.

How can I avoid this with huge data?

Thanks for your time, Bill.
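One common way to avoid a Keep/Join pass on a very large fact table is to load the fact first and then restrict the dimension load with Where Exists(). This is a sketch only, not from the thread; the QVD file names and extra fields are assumptions:

```
Fact:
LOAD DateUID, Sales
FROM Fact.qvd (qvd);

// Keep only calendar rows whose DateUID occurs in the fact table --
// the same reduction Left Keep performs, without the join pass.
DIM:
LOAD DateUID, Date
FROM DateDim.qvd (qvd)
WHERE Exists(DateUID);
```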

Anonymous
Not applicable

Could you share the log file from your reload?