It is not possible to give an exact answer. Two things are important: the RAM that QV will need and the CPUs. You need CPU power during calculations and as more and more concurrent users come online. A rule of thumb is about 10% more RAM per concurrent user (a rough sketch of this follows at the end of this post). Your RAM depends only on your data: its structure, its density, its cardinality. So for me the most important thing is the scalability of your system. The worst case is that the requirements keep growing and you cannot extend your system.
With this in mind I think you can start with either of the HP machines. But in any case, start with Windows 2008 64-bit (!). With a 32-bit OS you can address a maximum of 4 GB, and in reality the OS itself needs RAM, I/O buffering, and so on. And I wouldn't share the server with any other server software, especially a database server; they need a lot of RAM as well.
Last but not least, contact your local sales rep from QT or from your QT partner.
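To make the 10% rule of thumb concrete, here is a minimal back-of-the-envelope sketch in Python. All the numbers are placeholders, not QlikView internals: the model's in-memory footprint has to be measured on your own data (it depends on structure, density, and cardinality, as above), and the per-user factor and OS overhead are just the rule-of-thumb assumptions from this thread.

```python
def estimate_server_ram_gb(model_ram_gb, concurrent_users,
                           per_user_factor=0.10, os_overhead_gb=2.0):
    """Back-of-the-envelope RAM estimate for a QV server.

    model_ram_gb     -- RAM the loaded data model occupies (measure it!)
    concurrent_users -- peak number of simultaneous users
    per_user_factor  -- rule of thumb: each user adds ~10% of the model size
    os_overhead_gb   -- headroom for the OS, I/O buffering, etc. (assumed)
    """
    per_user_gb = model_ram_gb * per_user_factor
    return model_ram_gb + concurrent_users * per_user_gb + os_overhead_gb


# Example: a 2 GB data model with 10 concurrent users
# -> 2 + 10 * 0.2 + 2 = 6 GB, already beyond a 32-bit OS's 4 GB limit.
print(estimate_server_ram_gb(model_ram_gb=2.0, concurrent_users=10))
```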
We too are about to deploy QV and are trying to define the server CPU & RAM specs.
Could I ask you to clarify your comments? You say "about 10% more RAM per concurrent user" and go on to say "RAM depends only on your data: its structure, its density, its cardinality."
Do you mean we need to add 10% of 4 GB (400 MB) for each user?
Could you give some indication of how much RAM would be required to analyse a 20 GB data warehouse?
The data model takes up a certain amount of RAM, which depends on what Roland mentions. Then, on top of that, each concurrent user will use a certain additional percentage of RAM for the charts they generate.
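To illustrate with made-up numbers (the compression factor varies enormously with cardinality, so treat this purely as arithmetic, not as a sizing answer): if the 10% is read as 10% of the data model's in-memory footprint rather than of the 4 GB machine limit, and if a 20 GB warehouse happened to compress to, say, 2 GB in memory, then each concurrent user would add roughly 200 MB, and 10 concurrent users would put you near 2 + 10 × 0.2 = 4 GB before OS overhead.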
For an estimate, download the Excel file in the following post: