I have a QV Server installed on a Windows 2008 Server with 64 CPUs and 128 GB of RAM. There are 4 documents published, but just 2 are frequently used (preloaded), each about 1.7 GB on disk (compressed). The main table has 40 million records in both. With 3 of these documents open, the server memory load is about 23 GB.
There are currently 20 users; eventually there will be 100.
Sometimes the reports don't show the real data: a null record appears in every field I put in the report, but it's a phantom null record, because if I apply a filter it doesn't disappear. The only way to get rid of it is to restart the QVS service in order to clear the memory (cache?).
I had worked with the default working set limits until Friday, when I changed them to prevent another problem: sometimes the memory usage of qvs.exe grows until it saturates the free memory. So I changed the default parameters 70/90/10 to 35/95/45 (low/high/cache).
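For anyone wondering what those percentages mean in absolute terms on this box: as I understand it, Working Set Low and High are percentages of physical RAM (cache behaves similarly), so on 128 GB the 35/95 settings translate roughly as in this sketch (the function name is just for illustration):

```python
def working_set_bytes(ram_gb, low_pct, high_pct):
    """Convert QVS Working Set Low/High percentages into absolute
    sizes, assuming they are percentages of physical RAM."""
    ram = ram_gb * 1024**3
    return low_pct / 100 * ram, high_pct / 100 * ram

low, high = working_set_bytes(128, 35, 95)
print(low / 1024**3)   # 44.8  -> QVS starts trimming cache above ~44.8 GB
print(high / 1024**3)  # 121.6 -> hard ceiling at ~121.6 GB
```

So with 35/95 the server should begin releasing cached results much earlier than with the 70/90 defaults, while still being allowed to grow close to the physical limit under load.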
I have several servers with 256 GB of memory. I played a lot with the memory limit parameters (70/90/10) and noticed no difference.
Assuming that your QVW files are correct (no synonyms, working formulas in charts) and contain no useless columns or rows: if you expect 100 users, there is no point in waiting until somebody complains and then restarting the service manually.
My solution is to automatically kill and restart QVS when it uses too much memory. You can measure statistically the memory usage above which QVS no longer works / does strange things, as in your case.
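A minimal sketch of that watchdog idea, run from a scheduled task: read the qvs.exe working set from `tasklist` CSV output and restart the service when it crosses a threshold. The 100 GB limit and the `"QlikView Server"` service name are assumptions here; check the actual service name on your machine with `sc query` and pick the threshold from your own measurements.

```python
import csv
import io
import subprocess

MEM_LIMIT_KB = 100 * 1024 * 1024  # hypothetical 100 GB threshold, tune to your server

def qvs_memory_kb(tasklist_csv):
    """Parse the output of: tasklist /FI "IMAGENAME eq qvs.exe" /FO CSV
    and return the reported memory usage in KB (0 if not running)."""
    for row in csv.reader(io.StringIO(tasklist_csv)):
        if row and row[0].lower() == "qvs.exe":
            # The "Mem Usage" column looks like: 94,371,840 K
            return int(row[4].replace(",", "").rstrip(" K"))
    return 0

def restart_needed(mem_kb, limit_kb=MEM_LIMIT_KB):
    return mem_kb > limit_kb

# On the server itself (Windows), scheduled every few minutes:
# out = subprocess.check_output(
#     'tasklist /FI "IMAGENAME eq qvs.exe" /FO CSV', text=True)
# if restart_needed(qvs_memory_kb(out)):
#     subprocess.check_call('net stop "QlikView Server"', shell=True)
#     subprocess.check_call('net start "QlikView Server"', shell=True)
```

Running the restart off-hours or only when no sessions are active is kinder to users; the point is simply not to wait for the phantom nulls to show up before clearing the cache.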