johanfo2
Creator

Normalized vs. denormalized load

I have a brief question regarding the data structure of QlikView tables.

I'm accessing a data warehouse. In my load script I have joined all the dimensions into one BIG fact table for the load. It is about 120 million rows and has 30 columns.
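A minimal sketch of that kind of denormalizing join in a load script (table, field, and file names here are purely illustrative, not from the actual model):

```
// Load the fact table, then LEFT JOIN each dimension onto it
// so everything ends up in one wide table.
Facts:
LOAD OrderID, CustomerID, ProductID, Amount
FROM fact_orders.qvd (qvd);

LEFT JOIN (Facts)
LOAD CustomerID, CustomerName, Region
FROM dim_customer.qvd (qvd);

LEFT JOIN (Facts)
LOAD ProductID, ProductName, Category
FROM dim_product.qvd (qvd);
```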

My question is: is this an OK solution, or should I split it into several dimension tables? What are the advantages/disadvantages from a QlikView point of view?

How do you do it?

Thanks

1 Solution

Accepted Solutions
erichshiino
Partner - Master

Hi,

The big fact table will probably give you the best response time for selections, but it can be a little bigger (although QlikView handles duplicated values inside a field very well).

On the other hand, a snowflake schema may have a somewhat longer response time for selections, since they have to be propagated across the other tables.

You can also get good results with a star schema.
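A minimal sketch of a star-schema load, with illustrative table and field names. Instead of joining, you keep the dimensions as separate tables; QlikView associates them with the fact table automatically through the identically named key fields:

```
// Fact table and dimensions loaded separately.
// The shared field names (CustomerID, ProductID) become the links.
Facts:
LOAD OrderID, CustomerID, ProductID, Amount
FROM fact_orders.qvd (qvd);

Customers:
LOAD CustomerID, CustomerName, Region
FROM dim_customer.qvd (qvd);

Products:
LOAD ProductID, ProductName, Category
FROM dim_product.qvd (qvd);
```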

I'd say that the best choice will depend on your bottlenecks...

Regards,

Erich


2 Replies
shumailh
Creator III

Hi Johan, in a data warehouse the data is normally kept in denormalized form because of the huge volumes. As far as QlikView is concerned, the tool is very flexible, so you can use the denormalized form to get a fast response on large data sets.

Cheers

Shumail
