As far as I know, there is no maximum memory limit for QV. QV will take all available memory regardless of how much you have. It depends on the size of your data, the number of users, server status, etc. So it is best practice to test with tools like Document Analyzer/Optimizer on the Access Point to estimate how much RAM and processing power you will need.
In reality, QVS does not use all available memory. Memory consumption is governed by the QVS Working Set limit settings. They default to Low/High/Cache = 70/90/10, which are pretty decent values for most configurations - but not for larger ones.
The values are percentages and represent limits on how far QVS can "go" in memory. 70%/90% of 256 GB RAM is ~180 GB/~230 GB. That means that QVS will not normally use more than 180 GB, and that it will definitely never use more than 230 GB (note that this is a *very* cut-down explanation of the memory management in QVS). Windows will very seldom need more than, at most, 2 GB for its own processes, and since it is a dedicated QVS machine, there are probably no other major processes that need that otherwise unused memory.
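The arithmetic above can be sketched in a few lines. This is just an illustration of how the percentage settings map to absolute limits; the function name is made up for this example, and the 70/90 defaults and 256 GB figure come from the discussion above.

```python
# Translate QVS Working Set "Low" and "High" percentages into absolute
# limits for a given amount of installed RAM (illustrative only).

def working_set_limits(total_ram_gb, low_pct=70, high_pct=90):
    """Return (low_gb, high_gb) absolute working set limits in GB."""
    low_gb = total_ram_gb * low_pct / 100.0
    high_gb = total_ram_gb * high_pct / 100.0
    return low_gb, high_gb

low, high = working_set_limits(256)
print(f"Low limit:  ~{low:.0f} GB")   # soft limit: QVS starts trimming cache here
print(f"High limit: ~{high:.0f} GB")  # hard ceiling: QVS should never exceed this
```

Running this for a 256 GB server gives roughly 179 GB and 230 GB, matching the ~180/~230 figures above.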
You can raise the Working Set limits to make use of more memory on a system like that.
If you are having trouble with a tiny data set of a couple of million rows (in a QlikView world, that is truly tiny) on a 256 GB system, then the problem is likely in your data model or elsewhere in the system - not in QlikView's ability to use memory.