You will need enough memory to hold all your data. You can see the memory used under Settings/Document Properties/Sheets.
You might be able to reduce the footprint by loading only a subset of fields into the app, or by pre-aggregating the data.
For larger date ranges it might be possible to create a separate data extract with less granularity (for example, monthly instead of daily records).
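The idea of reducing granularity can be sketched outside QlikView as well. The following is a minimal, hypothetical Python example (the function name and the (date, value) row shape are my own illustration, not anything from QlikView) showing how collapsing daily rows into monthly totals shrinks the row count that has to be held in memory:

```python
from collections import defaultdict
from datetime import date

def aggregate_to_monthly(rows):
    """Collapse (date, value) rows into one total per (year, month).

    Hypothetical helper: reducing granularity from daily to monthly
    means a long date range needs far fewer rows in memory.
    """
    totals = defaultdict(float)
    for d, value in rows:
        totals[(d.year, d.month)] += value
    return dict(totals)

# Three daily rows become two monthly rows.
daily = [
    (date(2011, 1, 1), 10.0),
    (date(2011, 1, 15), 5.0),
    (date(2011, 2, 3), 7.5),
]
print(aggregate_to_monthly(daily))
```

In a real QlikView load script the equivalent would be a GROUP BY in the extract layer, so the full-granularity data never enters the document at all.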
1) What is the data-size threshold handled in RAM, after which the data is moved to the hard disk?
2) When we say the data would be moved to the hard disk, does that mean there would be some impact on performance?
3) How does this tool behave when the data volume is huge? Please share your thoughts based on your real-world experience.
1) On a 32-bit OS you can access only 2 GB of RAM per process. For 64-bit I do not know, but I assume QlikView either does not use virtual memory (though I am not sure an application can even tell) or would become so slow that running it would not make sense. QlikView is an in-memory BI tool, and swapping will kill it.
2) Certainly; I assume it would be unusable.
3) What data sizes are you talking about? If your data is huge, your RAM needs to be huge as well. Depending on your data values, the RAM required by QlikView is roughly 10-15% of the CSV file size (about the size of the zipped information).
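The 10-15% rule of thumb above is easy to turn into a quick sizing estimate. A minimal sketch (the function name and the 50 GB example are mine, not from the thread):

```python
def estimated_qlikview_ram_gb(csv_size_gb, low=0.10, high=0.15):
    """Rule of thumb from the thread: QlikView needs roughly 10-15%
    of the raw CSV size in RAM (about the size of the zipped data)."""
    return csv_size_gb * low, csv_size_gb * high

# Example: a hypothetical 50 GB CSV extract.
low, high = estimated_qlikview_ram_gb(50)
print(f"Estimated RAM: {low:.1f}-{high:.1f} GB")  # prints "Estimated RAM: 5.0-7.5 GB"
```

This is only a back-of-the-envelope figure; the actual compression depends heavily on the cardinality of your field values.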
I also assume your tables and charts will take some time to update after changing selections; this depends, however, on the complexity of your calculations as well.
With enough resources (money), I estimate that this slowdown in response time, rather than the amount of RAM you can nowadays put in a box, will be the limiting factor.
I also assume this is not the exact answer you expected; maybe someone else has an opinion?