Hi, good afternoon everyone. I have a question about the volume of data QlikView can work with.
If I have a fact table with 12 fields and 600 million records, plus five dimension tables, each with two fields and 56 million records, can QlikView handle this volume of data and generate reports without problems? And roughly how much time might it take to load this information into the server's memory?
Thank you very much!!!
If your dimension tables only have two fields, consider using mapping tables with ApplyMap() to create a single-table data model.
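A rough sketch of what that could look like in the load script (the table, field and QVD names here are placeholders, not from your actual model):

Dim1Map:
MAPPING LOAD
    Dim1Key,
    Dim1Value
FROM Dim1.qvd (qvd);

Facts:
LOAD
    FactKey,
    ApplyMap('Dim1Map', Dim1Key, 'N/A') AS Dim1Value,  // dimension value mapped straight into the fact table
    Amount
    // ... remaining fact fields
FROM Facts.qvd (qvd);

Repeat one MAPPING LOAD per dimension table; the mapping tables are dropped automatically at the end of the script, leaving a single fact table in memory.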
With large tables like this, look at using an incremental load to improve load performance.
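A minimal sketch of an incremental load, assuming the fact table has a ModifiedDate field and a FactID primary key (both hypothetical names) and that vLastExecTime holds the time of the previous reload:

// Load only the new or changed records from the source database
Incremental:
LOAD FactID, ModifiedDate, Amount;
SQL SELECT FactID, ModifiedDate, Amount
FROM FactTable
WHERE ModifiedDate >= '$(vLastExecTime)';

// Append the previously stored records, skipping rows that were just reloaded
Concatenate (Incremental)
LOAD FactID, ModifiedDate, Amount
FROM Facts.qvd (qvd)
WHERE NOT Exists(FactID);

// Save the combined table back to QVD for the next run
STORE Incremental INTO Facts.qvd (qvd);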
A very important point with large datasets is the cardinality of the data, see: The Importance Of Being Distinct.
- Marcus
Some organisations use QlikView on terabyte-sized source tables. With enough RAM, QlikView can handle your data. Do look into using mapping tables to merge the values from your dimension tables into the fact table, as Colin Albert says, and read the blog post Marcus referred to. If you have datetime/timestamp fields, you can probably split them into separate date and time fields and save space and time.
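For example, splitting a hypothetical OrderTimestamp field could look like the snippet below. The date part has at most a few thousand distinct values and the time part at most 86,400, instead of potentially hundreds of millions of distinct timestamps, which shrinks QlikView's symbol tables considerably:

Facts:
LOAD
    FactKey,
    Date(Floor(OrderTimestamp)) AS OrderDate,  // whole-day part of the timestamp
    Time(Frac(OrderTimestamp))  AS OrderTime,  // time-of-day part of the timestamp
    Amount
FROM Facts.qvd (qvd);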
Thank you very much to everyone for your answers. The idea of transforming the data to save space and time is very good and innovative; I did not know about this before.