It is not possible to tell from these numbers alone, since I suspect they refer to the amount of data in the database, which usually includes database indexes that QlikView does not load. Also, you do not mention anything about which calculations the application will need to perform, or whether you want to partition the data into one application per month, as opposed to having the entire year in one file.
But let me make some rough estimates: First, QlikView compresses this data (420GB) and may need perhaps 4GB of RAM for the data model - or perhaps 40GB, 100GB, or more, if the cardinality of the data is high. Then each pivot table will need additional memory, so if you make pivot tables with many dimensions and complex calculations you can easily end up with an application that needs 500GB.
The best way for you to find out is to make a QlikView document and see how much RAM QlikView uses. From this you can extrapolate the basic memory need - if you double the amount of data, then the RAM usage will also double.
Then a rough rule of thumb is that you need to add 5-10% for each additional user. Example: If QlikView uses 10GB of RAM when the file is loaded, then 50 users will need approximately 25-50GB of additional RAM.
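The rule of thumb above can be sketched as a few lines of Python. The function name and parameters are illustrative, not anything QlikView-defined; the 5-10% per-user range is the one stated above.

```python
# Sketch of the rule of thumb: each additional user adds roughly
# 5-10% of the base RAM the loaded document already uses.

def extra_user_ram(base_ram_gb, users, per_user_ratio):
    """RAM needed on top of the base for a given number of users."""
    return base_ram_gb * per_user_ratio * users

base = 10.0  # GB used once the file is loaded
low = extra_user_ram(base, 50, 0.05)   # 5% per user
high = extra_user_ram(base, 50, 0.10)  # 10% per user
print(f"50 users need roughly {low:.0f}-{high:.0f} GB of additional RAM")
# -> 50 users need roughly 25-50 GB of additional RAM
```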
Thanks a lot, Henric, for your timely support.
Based on a few assumptions, I have done the calculation below:
RAM size is calculated based on data size and the number of concurrent users.
Reporting suite data size – the overall transformed data size would be around 32 Gig; in QlikView this comes to around 6.4 Gig of QVD files (multiple QVDs, assuming an 80% compression ratio).
Let's assume the QVW file size is also around 6.4 Gig (considering it will connect all these QVDs).
- Initial RAM = QVW size × File Size Multiplier – this is the initial RAM required for any application.
- RAM per user = Initial RAM × User RAM Ratio – this is the RAM each incremental user consumes.
- QVW size on disk = Source Data × (1 – Compression Ratio) – this is the size, on disk, of a QlikView file.
Initial RAM Size = 6.4 Gig * 4 = 25.6 Gig
RAM per user = 25.6 Gig * 10% = 2.56 Gig
For 50 users: 25.6 Gig + 50 * 2.56 Gig ≈ 153 Gig RAM is required.
User RAM Ratio – 10% (standard range – 1% to 10%)
File Size Multiplier – 4 (standard range – 2 to 10)
Compression Ratio – 80% (range – 0 to 90%)
Any thoughts on this...
Looks OK to me, but be aware that it is just a very rough estimate! And...
* Compression ratio is often >90% if the number is compared to the DB size
* QVDs take a lot more space than QVWs for the same data
* Is it 50 users or 50 concurrent users? 50 users (in my experience) correspond to 2-5 concurrent users.
These three bullets may indicate that 153 GB is an overestimation. On the other hand - one single pivot table could easily double the RAM need.
Further - this just estimates the RAM. To estimate the number of processors needed, you should make a test application. If you e.g. have a pivot table with approximately 1s calculation time using one core only, and your users click 6 times per minute, then with 50 concurrent users you need
50 * 6 * 1 = 300 CPU seconds per minute
I.e. with only 60 CPU seconds available per core per minute, you will need 300/60 = 5 cores just for this pivot table...
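The CPU estimate follows the same pattern as the RAM one: CPU-seconds generated per minute, divided by the 60 seconds of capacity each core provides per minute. A minimal sketch, with the numbers from the example above:

```python
import math

# CPU sizing: demand in CPU-seconds per minute vs. 60 s capacity per core.
concurrent_users = 50
clicks_per_min = 6
calc_seconds = 1.0  # single-core calculation time per click

cpu_seconds_per_min = concurrent_users * clicks_per_min * calc_seconds  # 300
cores_needed = math.ceil(cpu_seconds_per_min / 60)

print(f"{cores_needed} cores needed for this pivot table")
# -> 5 cores needed for this pivot table
```

Rounding up with `math.ceil` matters: 301 CPU-seconds per minute would already require a sixth core.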