I don't think you can calculate the reload memory in advance.
You can load a lot of data and then drop tables/columns before the end of the reload (and before the .qvw is saved to disk), so the peak is larger than the final file suggests.
Also, the size of the .qvw depends on the compression level (none, medium, high) you choose.
The best approach, imho, is to try the reload with a subset of real data.
You cannot easily calculate the memory usage during reload. If there are joins in the script, a lot more memory will be used than otherwise; the best way is to look in the Windows Task Manager. The only thing that will use extra memory during Save is the compression, so turn compression off if memory usage is a problem (Document Properties > General > Save Format).
The rule of "4 times the qvw size" is just a rough rule of thumb. The real memory usage is sometimes a lot more.
The memory statistics file described in Recipe for a Memory Statistics analysis will tell you the memory usage during analysis, not during the reload.
Our experience is the same - use Task Manager and manually monitor.
With the QVCONNECT exe you can also see some big differences depending on the ODBC/OLEDB driver you are using. In my experience, a database like MySQL can push nearly 2x the RAM of the final data during the reload.
Remember that the QDS service can also give RAM stats on reload tasks, so if you have a TEST server you can do a series of reloads during testing to estimate the RAM requirements.
If profiling individual document reloads (for example, as part of the final steps of a development track) is acceptable and you do not plan to repeat this job too often, you could use QV Desktop and Windows Performance Monitor to make a memory profile during reload.
AFAIK (but I'm not 100% sure) the reload process in QV Desktop isn't that much different from the one executed by QVB.exe. If this assumption holds, you could even simulate a QVB reload using the command-line options of QlikView Desktop: one of them opens a QVW document, executes the reload script, stores the resulting QVW and exits. Looks a lot like what QVB does...
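As a hedged sketch of that idea (the `/r` switch and the install path are assumptions; check both against your QlikView Desktop version's command-line help), such a QVB-style reload could be driven from a script like this:

```python
import subprocess

QV_EXE = r"C:\Program Files\QlikView\Qv.exe"  # assumed install path

def build_reload_command(qvw_path):
    # /r: open the document, execute the reload script, save and exit,
    # which is roughly what QVB.exe does. Verify the switch in your
    # QlikView Desktop version before relying on it.
    return [QV_EXE, "/r", qvw_path]

cmd = build_reload_command(r"C:\Apps\Sales.qvw")
print(cmd)
# subprocess.run(cmd, check=True)  # run only where QV Desktop is installed
```

The actual `subprocess.run` call is left commented out so the sketch can be read anywhere; on a machine with QV Desktop it performs the full open-reload-save-exit cycle.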
A command file for generating these profiles would contain things like:
- Figure out the document filename
- Create a configuration for windows performance monitor
- Start Windows performance monitor
- Wait for a baseline memory profile
- Reload the QVW document
- Store the Windows Performance monitor results
- Kill the monitor
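The step list above can be scripted around `logman`, Windows' command-line front end to Performance Monitor. This is only an outline under assumptions: the counter path `\Process(Qv)\Working Set`, the session name, and the QlikView `/r` switch are all things to verify on your system.

```python
import subprocess

SESSION = "qv_reload_profile"           # arbitrary collector name (assumption)
COUNTER = r"\Process(Qv)\Working Set"   # assumed counter path; adjust as needed
OUT_CSV = r"C:\Temp\qv_reload.csv"

def logman(*args):
    # Build a logman invocation as an argument list.
    return ["logman", *args]

def profile_reload(qvw_path, qv_exe=r"C:\Program Files\QlikView\Qv.exe"):
    # Returns the commands in order; on the test server each one would be
    # passed to subprocess.run(cmd, check=True).
    return [
        # create a counter collector sampling every second, CSV output
        logman("create", "counter", SESSION, "-c", COUNTER,
               "-si", "1", "-f", "csv", "-o", OUT_CSV),
        logman("start", SESSION),   # start Windows Performance Monitor
        [qv_exe, "/r", qvw_path],   # reload (and save) the QVW document
        logman("stop", SESSION),    # store the Performance Monitor results
        logman("delete", SESSION),  # remove ("kill") the monitor config
    ]

for step in profile_reload(r"C:\Apps\Sales.qvw"):
    print(subprocess.list2cmdline(step))
```

Leaving a few seconds between `start` and the reload gives you the baseline memory profile mentioned above.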
If you choose the correct export format, you can then open the resulting data in QlikView and calculate the maximum RAM usage level for that particular document.
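As a minimal sketch of that last calculation (the CSV snippet and its numbers are made up; a real Performance Monitor export has a timestamp column plus one column per counter), the peak can be pulled out of the export like this:

```python
import csv
import io

# Hypothetical stand-in for a Performance Monitor CSV export.
sample = """\
"(PDH-CSV 4.0)","\\\\HOST\\Process(Qv)\\Working Set"
"04/01/2024 10:00:01","104857600"
"04/01/2024 10:00:02","524288000"
"04/01/2024 10:00:03","209715200"
"""

def peak_working_set(csv_text, column=1):
    # Skip the header row, then take the maximum of the counter column,
    # ignoring any empty samples perfmon may have written.
    rows = csv.reader(io.StringIO(csv_text))
    next(rows)
    return max(float(r[column]) for r in rows if r[column].strip())

peak = peak_working_set(sample)
print(f"peak working set: {peak / 1024**2:.0f} MiB")
```

Of course, the same max() over the counter field is just as easy to compute inside QlikView after loading the CSV.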
I admit that I like PowerShell more; it would probably be able to do all the steps mentioned in one single run and just spit out a number at the end. YMMV