DataModel

Hi All,

Is anyone out there pulling large detail data into their QV models? By this I mean more than 350 million rows of data; a good example would be sales data at a granularity of one second [timestamp].

I'm interested to know how to build an optimized data model for data like this, and which approach is best for handling reloads at this volume.

For example: I have more than 45 fields in the table. Of those, 5 fields hold metric information, and 23 of the remaining fields I use as dimensions in the QV app.

The table has more than 350 million records for each year.

I would like to see the data at a half-hour level [30 min], i.e. aggregate the data to 30-minute buckets. How can I achieve this in the data model?
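Here is a minimal sketch of the kind of two-pass aggregation I have in mind (table and field names like SalesDetail, SaleTimestamp, StoreID, ProductID, SalesAmount and Quantity are just illustrative placeholders for my real ones):

// Pass 1: bucket the second-level timestamp into 30-minute intervals.
// Floor(SaleTimestamp, 1/48) rounds down to the nearest 1/48 of a day, i.e. 30 min.
Bucketed:
LOAD
    Timestamp(Floor(SaleTimestamp, 1/48)) AS HalfHour,
    StoreID,
    ProductID,
    SalesAmount,
    Quantity
RESIDENT SalesDetail;

DROP TABLE SalesDetail;

// Pass 2: roll the metrics up to the 30-minute bucket.
// (In my real table the GROUP BY would list all 23 dimension fields.)
SalesAgg:
LOAD
    HalfHour,
    StoreID,
    ProductID,
    Sum(SalesAmount) AS SalesAmount,
    Sum(Quantity)    AS Quantity
RESIDENT Bucketed
GROUP BY HalfHour, StoreID, ProductID;

DROP TABLE Bucketed;

Would something along those lines scale to 350 million rows per year, or is there a better pattern?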

I'm interested in hearing your stories here, particularly any tips and tricks for performance.

I have read many whitepapers on best-practice model design; however, I am hoping for a solution from someone with hands-on experience at this scale.

Many thanks in advance

John

4 Replies
MK_QSL
MVP

Read this thread... it is very well explained, down to a detailed level:

*** 6 Weeks in to QV Development, 30 Million Records QV Document and Help Needed!!! ****

You can also refer to the book Mastering QlikView!

Anonymous
Not applicable
Author


Hi,

Especially follow the above link, and divide your work into multiple levels:

extraction

modification

final insertion

Then make the final step a completely optimized load.

In your final app, don't use IF, WHERE, or loops.
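A minimal sketch of that three-level structure (paths, connection and field names are illustrative; the key point is that the final app's LOAD has no WHERE clause, no IF() and no derived fields, so QlikView can read the QVD in optimized mode):

// Level 1 - Extraction: pull the raw rows and store them to QVD unchanged.
// (Assumes a CONNECT statement to the source earlier in the script.)
SalesRaw:
SQL SELECT * FROM SalesDetail;
STORE SalesRaw INTO [..\QVD\Extract\SalesRaw.qvd] (qvd);
DROP TABLE SalesRaw;

// Level 2 - Modification: do every transformation here, in its own app.
// This load is unoptimized because of the derived field, which is fine at this level.
SalesClean:
LOAD
    *,
    Timestamp(Floor(SaleTimestamp, 1/48)) AS HalfHour
FROM [..\QVD\Extract\SalesRaw.qvd] (qvd);
STORE SalesClean INTO [..\QVD\Transform\SalesClean.qvd] (qvd);
DROP TABLE SalesClean;

// Level 3 - Final insertion: the front-end app does only a plain load,
// which runs as an optimized QVD load.
Sales:
LOAD * FROM [..\QVD\Transform\SalesClean.qvd] (qvd);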

qlikrambo1
Contributor III

John,

Try this: *** 6 Weeks in to QV Development, 30 Million Records QV Document and Help Needed!!! ****. It was very helpful; I used it myself when I was building my data model, and it helped me a lot.

All the best bud!!