Big Data

Not applicable

Hi,

My source is a single table from a database with many columns and millions of records, and the data will increase on a weekly basis.

Can anyone please suggest the best data model to handle this data? Is it fine to split the table and create a link table? Would this approach be correct?

1 Solution

Accepted Solutions
jagan
Luminary Alumni

Hi,

If you don't have key fields, then don't split the table; keep it as a single table.

For example:

If you have fields like CountryID, Country, Date, Sales, then split CountryID and Country out into a Country dimension table. If your data is not in this kind of format, then stick with a single table. Don't complicate the model.

Regards,

Jagan.
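
A minimal sketch of that split in Qlik load script, assuming the source table is called Fact_Table and contains the four fields above (all names here are illustrative):

// Fact table: key field plus the measures
Fact:
LOAD CountryID, Date, Sales;
SQL SELECT CountryID, Date, Sales FROM Fact_Table;

// Country dimension: one row per CountryID
Country:
LOAD DISTINCT CountryID, Country;
SQL SELECT CountryID, Country FROM Fact_Table;

Qlik associates the two tables automatically on the shared CountryID field, so no explicit join or link table is needed.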


12 Replies
jagan
Luminary Alumni

Hi,

Try incremental loading:

Incremental Load


Regards,

Jagan.
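
A minimal incremental-load sketch, assuming the source table has a ModifiedDate column and a unique PrimaryKey field, and that vLastReloadDate was set earlier in the script to the timestamp of the previous successful reload (these names are assumptions, not from the original post):

// Pull only rows added or changed since the last reload
Increment:
LOAD *;
SQL SELECT * FROM Fact_Table
WHERE ModifiedDate >= '$(vLastReloadDate)';

// Append the previously stored history, skipping rows that were just reloaded
Concatenate (Increment)
LOAD * FROM Fact_Table.qvd (qvd)
WHERE NOT Exists(PrimaryKey);

STORE Increment INTO Fact_Table.qvd (qvd);

The exact date format in the SQL WHERE clause depends on the source database, and deleted rows need extra handling (for example an inner join against the current set of primary keys).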

Not applicable
Author

I can implement the incremental load later, but for now, while starting the data model, how should I approach it? Is it fine to divide the one source table into multiple dimension and fact tables and then use a link table? Can you please advise on the data model?

Anonymous
Not applicable
Author

A star schema is preferable for large amounts of data.

Not applicable
Author

So can I divide the table into dimensions and facts and then store all the keys in a link table? Is this approach good, and how will the performance be?

jagan
Luminary Alumni

Hi,

A star schema will have better performance than a single table, so if possible split it into dimension and fact tables.

Regards,

Jagan.

Not applicable
Author

If you have an insert/update flag and a DateModified field in your fact table, you can use an incremental load.

Otherwise, split the fact table into several QVDs by year-week, update only the latest week's QVD, and then concatenate all the QVDs into one and use that in your document.

So split into QVDs like:

Fact_Table_2014-01.qvd

Fact_Table_2014-02.qvd

Fact_Table_2014-03.qvd

... etc.

and update only the latest week's QVD:

// Reload only the current week from the source and overwrite that week's QVD
CurrentWeek:
LOAD * WHERE WeekYear(Date_field) = WeekYear(Today()) AND Week(Date_field) = Week(Today());
SQL SELECT * FROM Fact_Table;
STORE CurrentWeek INTO [Fact_Table_2014-03.qvd] (qvd);
DROP TABLE CurrentWeek;

Then concatenate all the QVDs into one:

// The wildcard load picks up every weekly QVD and auto-concatenates them
Fact_Table:
LOAD * FROM [Fact_Table_*.qvd] (qvd);

Regards,

Kabilan K.

Anonymous
Not applicable
Author

It is not necessary to use link tables in a star schema, though of course it depends on the model. A link table becomes a bottleneck for big fact tables.

Not applicable
Author

thank you so much...

Not applicable
Author

Yes, I will use a star schema. One more suggestion needed: while generating keys, which will be better, AutoNumber() or RowNo()? Please suggest.
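
For reference, a small sketch of both options in load script, reusing the illustrative CountryID/Date fields from the accepted solution. AutoNumber() derives a compact integer from the field values, so the same combination always gets the same key within a reload, while RowNo() simply numbers the output rows:

Fact:
LOAD
    // Same CountryID/Date combination always maps to the same integer key
    AutoNumber(CountryID & '|' & Date) AS CountryDateKey,
    // RowNo() just numbers the loaded rows 1, 2, 3, ... regardless of field values
    RowNo() AS FactRowID,
    CountryID, Country, Date, Sales;
SQL SELECT CountryID, Country, Date, Sales FROM Fact_Table;

AutoNumber() is the usual choice for keys shared between tables; note that its values are only stable within a single reload, which matters if keys are stored in QVDs across incremental loads.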