This is a very large file, and if I were to use it I would load it in optimized mode. There are two modes in which a QVD can be loaded:
1. Standard, which means you apply a transformation while loading the QVD, like Day(TempDate) as Day. Any transformation switches the load to standard mode, which is slower than optimized mode.
2. Optimized, which means you do a straight load from the QVD and nothing is changed.
If you have to transform the data, do it in the load that creates the QVD.
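To make the difference concrete, here is a minimal sketch (the file name BigData.qvd and the field TempDate are placeholders, not from the original question):

```
// Optimized load: a straight read of the QVD, nothing changed
Data:
LOAD *
FROM BigData.qvd (qvd);

// Standard (un-optimized) load: the Day(TempDate) expression forces
// QlikView to unpack and process every row, which is much slower
Data2:
LOAD *,
     Day(TempDate) as Day
FROM BigData.qvd (qvd);
```

The script log during reload tells you which mode you got: an optimized load is reported as "qvd optimized".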
I can understand why the question was asked in your interview:
it was meant to test your knowledge of the prerequisites for loading QVD files and how familiar you are with system configuration.
As per Evan's suggestion, one has to check the server RAM before loading 200 GB of data, and of course bring a lot of patience. Without a second thought, the only realistic option is an optimized load, which avoids putting extra processing burden on your system: in simple words, you dump the data from the QVD without applying any filters or transformations. You can still use a Where Exists() clause if required.
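For reference, Where Exists() on a single field is one of the few filters that keeps a QVD load optimized. A minimal sketch (the file names and the OrderID field are assumptions for illustration):

```
// Load the list of wanted keys first
WantedKeys:
LOAD OrderID
FROM WantedOrders.qvd (qvd);

// This stays optimized because the only condition is
// Exists() on a single field that is also being loaded
Orders:
LOAD *
FROM Orders.qvd (qvd)
WHERE Exists(OrderID);
```

Any other where clause, or Exists() with two parameters, drops the load back to standard mode.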
Also, please read the community articles on optimized and un-optimized loads!
Simple answer: it depends on the RAM; it should be able to hold a 200 GB QVD, which may be loaded as:
- Un-transformed: fully optimized, and you can still use a where clause (Where Exists()) if required.
- Transformed: it will be slow and will test your patience.
But a better approach is to avoid loading such QVDs all at once:
* For the performance of the application, pick only the information required by that particular application.
E.g. current-year and previous-year data.
* If a user requires the complete data, that can be handled through a separate, dynamically generated QlikView application.
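One way to sketch the "current and previous year only" idea while keeping the load optimized, assuming the big QVD already contains a Year field (the table and file names are made up):

```
// Generate the two allowed years: current and previous
AllowedYears:
LOAD Year(Today()) - RecNo() + 1 as Year
AUTOGENERATE 2;

// Exists() on the single Year field keeps this an optimized load
Facts:
LOAD *
FROM Facts.qvd (qvd)
WHERE Exists(Year);

DROP TABLE AllowedYears;
```

If the QVD only has a full date field, derive Year once in the load that creates the QVD, so this consuming load can stay optimized.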
QlikView doesn't impose a hard limit on the amount of data you can load; the limit is the RAM you have available.
As a rule of thumb, RAM consumption is roughly 4 times the on-disk data size, though it also depends on the compression ratio achieved, so a 200 GB QVD can demand considerably more memory than the file size suggests.
The optimal technique is an optimized QVD load, which means applying no transformations or constraints when reading the QVD.
The best solutions are to perform an incremental refresh of the data, and to spread the load across applications by using document distribution.
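A common incremental-refresh pattern looks roughly like this (table, field, and variable names are illustrative, and a database connection is assumed to be open already): fetch only the rows changed since the last reload, then concatenate the untouched history from the existing QVD.

```
// 1. Fetch only rows changed since the last reload
Increment:
SQL SELECT * FROM Orders
WHERE ModifiedDate >= '$(vLastReloadTime)';

// 2. Append historical rows that were not re-fetched;
//    NOT Exists(OrderID) skips rows already picked up in step 1
Concatenate (Increment)
LOAD *
FROM History.qvd (qvd)
WHERE NOT Exists(OrderID);

// 3. Store the combined result back for the next run
STORE Increment INTO History.qvd (qvd);
```

This way only the delta ever hits the source system, and the bulk of the 200 GB is read from the QVD.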
Section Access can also help, since it limits the data so that each user can access only the rows they are privileged to see.
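A minimal Section Access sketch with data reduction (the user names and the REGION field are invented for the example; the reduction field must exist, upper-cased, in the data model). Note that with Initial Data Reduction enabled, each user's open copy of the document only holds their own rows, which also trims the memory footprint:

```
Section Access;
LOAD * INLINE [
ACCESS, NTNAME, REGION
ADMIN, DOMAIN\ADMIN, *
USER, DOMAIN\ALICE, EU
USER, DOMAIN\BOB, US
];

Section Application;
// The Upper() rename is a transformation, so do it in the load
// that creates the QVD if you want the consuming load to stay optimized
Sales:
LOAD *,
     Upper(Region) as REGION
FROM Sales.qvd (qvd);
```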
Hope it helps!
Any data load process has some upper limit (even under ideal conditions); if your RAM can handle 200 GB of data, it can definitely be loaded.
Some best practices while loading data this big:
1) Make sure you are not creating too many joins and associations.
2) Watch the cardinality and sparsity ratio of your fields.
3) Make sure there are not too many UI objects in your QVW.
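On point 2, a classic trick for reducing cardinality is to split a high-cardinality timestamp into a date part and a time part, each of which has far fewer distinct values and so compresses much better (field and file names are placeholders). Since this is a transformation, do it in the load that creates the QVD, not when reading it:

```
Events:
LOAD
    Date(Floor(EventTimestamp)) as EventDate,  // few distinct values
    Time(Frac(EventTimestamp))  as EventTime,  // values repeat across days
    *
FROM Events_raw.qvd (qvd);

STORE Events INTO Events.qvd (qvd);
```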
Have patience, as data of this size is difficult to load in any BI tool.