Hi All
Is anyone out there pulling large detail data into their QV models? By this I mean more than 350 million rows of data; a good
example would be sales data at a granularity of one second [timestamp].
I'd like to know how to build an optimized data model on data like this, and which approach best handles reloading it efficiently.
For example: I have a table with more than 45 fields; 5 of them hold metric information, and of the remaining
fields I use 23 as dimensions in the QV app.
The table has more than 350 million records for each year.
I would like to see the data at the half-hour level [30 min], i.e. aggregate the data to 30-minute intervals.
How can I achieve this in the data model?
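To make the question concrete, here is roughly the kind of load-script aggregation I have in mind (table and field names are just examples, not my real model — and I don't know if grouping 350M resident rows this way is the right approach, which is why I'm asking):

```
// Sketch only - SalesDetail, SaleTimestamp etc. are made-up names.
// Round each second-level timestamp down to its 30-minute bucket
// (1/48 of a day), then sum the metrics per bucket and dimension.
SalesByHalfHour:
LOAD
    Timestamp(Floor(SaleTimestamp, 1/48)) AS HalfHourBucket,
    Date(Floor(SaleTimestamp))            AS SaleDate,
    ProductID,
    StoreID,
    Sum(SalesAmount)                      AS SalesAmount,
    Sum(Quantity)                         AS Quantity
RESIDENT SalesDetail
GROUP BY
    Timestamp(Floor(SaleTimestamp, 1/48)),
    Date(Floor(SaleTimestamp)),
    ProductID,
    StoreID;

DROP TABLE SalesDetail;  // keep only the aggregated table in the app
```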
I'm interested in hearing your stories here, particularly any tips and tricks for performance.
I have read plenty of whitepapers on best-practice model design; however, I'm hoping for a solution grounded in real experience.
Many thanks in advance
John
Read this thread... it's very well explained, right down to the details:
*** 6 Weeks in to QV Development, 30 Million Records QV Document and Help Needed!!! ****
You can also refer to the book Mastering QlikView!
Hi John,
Please have a look:
*** 6 Weeks in to QV Development, 30 Million Records QV Document and Help Needed!!! ****
Hi,
Especially follow the link above and divide your work into multiple layers:
extraction
transformation
final load
Then make the final load fully optimized (an optimized QVD load).
In your final app's load script, don't use IF(), WHERE clauses or loops.
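A rough sketch of the final-app side of this, assuming the earlier layers have already written your detail table to a QVD (paths and field names here are examples only, not from John's model):

```
// Example only - the QVD path and field names are assumptions.
// An optimized QVD load allows at most a WHERE Exists() filter;
// any expression, IF() or join drops back to a standard (slow) load.

// Build the list of dates we want to keep (last 90 days here):
RecentKeys:
LOAD Date(Today() - RecNo() + 1) AS SaleDate
AUTOGENERATE 90;

// This stays an optimized load because the only filter is Exists():
Sales:
LOAD *
FROM [..\QVD\SalesDetail.qvd] (qvd)
WHERE Exists(SaleDate);

DROP TABLE RecentKeys;  // helper table no longer needed
```

The point of WHERE Exists() is that it is the one filter QlikView can apply while still reading the QVD in optimized mode; anything else forces a row-by-row load, which is where reloads of 350M+ rows really hurt.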
John,
Try this: *** 6 Weeks in to QV Development, 30 Million Records QV Document and Help Needed!!! *** - it was very helpful. I used it myself when I was building my data model, and it helped me a lot.
All the best, bud!!