Open up your biggest application in the QV desktop client, go to Task Manager and check how much RAM it is using. Now multiply that by your total number of CALs. This gives a rough guide to the absolute maximum RAM you will need. You can then add 10-20% for growth of your user base or application, or deduct 10-20% for budget constraints.
As a real-world example, I have a 5 million row document across a star schema of 12 tables that consumes 400 MB of RAM.
Your server could handle 30 instances of that document easily, with overhead left for the OS and a couple of reload engines.
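The rule of thumb above can be sketched as a quick calculation. This is just illustrative arithmetic, not anything measured from QlikView itself; the growth factor and OS overhead are assumptions you should adjust for your environment.

```python
# Rough server RAM sizing based on the rule of thumb above.
# growth_factor and os_overhead_gb are illustrative assumptions.

def estimate_server_ram_gb(doc_ram_mb, cal_count,
                           growth_factor=1.2, os_overhead_gb=4):
    """Worst case: assume every CAL holder opens the document at once."""
    doc_total_gb = doc_ram_mb * cal_count / 1024
    return doc_total_gb * growth_factor + os_overhead_gb

# Example: the 400 MB document above with 30 users
print(round(estimate_server_ram_gb(400, 30), 1))  # ~18.1 GB
```

In practice not every user is active at once, so this worst-case figure is an upper bound rather than a target.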
Bear in mind this is for an AccessPoint distribution setup; if all your users open documents via the QV desktop client, then it's the RAM on their local machines that counts.
More RAM does not greatly increase reload speed once you are above 4 GB; beyond that, the limiting factors are generally drive speed, CPU architecture and the software engine. My dev machine is an 8 GB i5-650, and with QV Desktop it reloads 2-3 times faster than our production server, a 4x 6-core Opteron with 128 GB running QV Server 10 SR2. However, when our server was running QV 9 SR5, both machines reloaded the same document in the same time.
It will largely depend on your application as well as the hardware and software. Pivot tables with millions of rows and calculated dimensions may crash the application, while with a few hundred rows they may just run a bit slower than usual. High timeouts will have an effect too, as will several reloads running at the same time, browser versions, the number of licenses, OS updates/patches...
Some logs from the server, in addition to the events recorded in the Windows event log, would be useful to trace the issue and suggest how to improve things.
For better understanding, I am attaching a PDF on how to gauge how much RAM is used.
Hope this will help you