This totally depends on the amount of data in the applications.
If you have a few million rows of data per application it will be no problem at all.
If you have many billions of rows of data and very complex data models, it might be a serious problem.
Ask your boss if you can attend a "Performance and Scalability" training at Qlik to learn the basics and how to set up tests.
I guess Go means GB? (Go is the French abbreviation for gigabyte.)
Does 6GB mean a 6GB QVF file size? The important thing is how much it turns into when loaded into memory. From that you can extrapolate, adding roughly 10% overhead per concurrent user. Assuming a 6GB file size, it could expand to 4 to 10 times its size on disc, making it in the worst case around 60GB in memory. Assuming a concurrency ratio of 10 (so 5 of your 50 users active at once), that adds 50% on top of the base, which amounts to 90GB.
If all 50 users were running concurrently, the overhead alone would multiply the footprint several times over, landing around 300GB. Then again, the memory footprint might not be 60GB at all; it could be only 24GB, and with 50 concurrent users that would be around 120GB.
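To make the back-of-the-envelope math above explicit, here is a minimal sketch. The function name, the 4x/10x expansion factors, and the 10% per-concurrent-user overhead are the assumptions from this thread, not Qlik-published sizing rules; measure your real app instead of trusting these numbers.

```python
# Rough RAM estimate for a Qlik Sense app, based on rule-of-thumb
# assumptions from this thread (not official Qlik sizing guidance).

def estimate_ram_gb(qvf_size_gb, expansion_factor, concurrent_users,
                    per_user_overhead=0.10):
    """Base in-memory footprint = file size * expansion factor.
    Each concurrent user is assumed to add ~10% of the base on top."""
    base = qvf_size_gb * expansion_factor
    return base * (1 + concurrent_users * per_user_overhead)

# Worst case: 6GB QVF, 10x expansion, 5 concurrent users (ratio 10 of 50)
print(estimate_ram_gb(6, 10, 5))  # 90.0 GB, matching the estimate above

# Best case: same app at only 4x expansion
print(estimate_ram_gb(6, 4, 5))   # 36.0 GB
```

Treat the output as an order-of-magnitude starting point; only a load test tells you the real expansion factor and per-user cost.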
I am making a lot of assumptions here, and doing real-world tests is really the best approach, as Onno suggests.