First, create one or more documents that ETL this data into something useful. The amount of data will probably shrink, though it may grow again because you will end up with not one but many documents.
Then try to estimate how many users will want to open these documents simultaneously.
Only then can we make an educated guess about how much memory your server will need to serve QlikView users comfortably.
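The ETL step above could look something like the following load script. This is only a sketch; the table, field, and file names are placeholder assumptions, not anything from your environment:

```qlikview
// Hypothetical extract/reduce step. All names here are assumptions.
Sales:
LOAD OrderID,
     CustomerID,
     OrderDate,
     Amount
FROM [SourceData.qvd] (qvd)
WHERE Year(OrderDate) >= 2010;   // keep only the rows the documents actually need

// Write a smaller QVD that the user-facing documents load from.
STORE Sales INTO [Sales_Reduced.qvd] (qvd);
DROP TABLE Sales;
```

Splitting the work this way is also what makes the data volume per document smaller, even if the total across many documents grows.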
Another approach would be to configure an extensible system (such as a VM, or a machine with a few empty RAM slots), start with 24 GB (the minimum), and budget for more RAM in about 12 months.
I would say there are three different types of memory allocation that you need to keep in mind:
1. Memory used during application reload, for processing and storing the loaded data.
2. Memory used when the application is opened. This is roughly equal to the size of the data model on disk.
3. Memory used during application usage, for calculations and cached results.
Type 1 will be larger than type 2, since there are calculations done during reload that will not be part of the final data model. Type 3 does not really have an upper limit, as cached results will fill memory until the working set limit is reached. The memory used in type 3 is flushed when the application is reloaded, since the cached results then become obsolete. For type 3, memory consumption also depends heavily on the aggregations you do in the application.
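As a rough back-of-envelope illustration of how the three types add up, you could sketch the estimate with script variables. Every number below is a hypothetical assumption for illustration only, not official sizing guidance:

```qlikview
// Illustrative sizing sketch; all figures are assumptions, not Qlik guidance.
LET vModelSizeGB  = 2;                    // type 2: app footprint when opened (~size on disk)
LET vReloadPeakGB = vModelSizeGB * 2;     // type 1: reload peak, assumed ~2x the model
LET vUsers        = 20;                   // assumed concurrent users
LET vCachePerUser = 0.2;                  // type 3: assumed GB of cached results per user
LET vEstimateGB   = vModelSizeGB + (vUsers * vCachePerUser);
TRACE Estimated serving working set: $(vEstimateGB) GB, reload peak: $(vReloadPeakGB) GB;
```

The point is only the structure of the estimate: the serving footprint grows with users and caching, while the reload peak is a separate, temporary spike.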
To get a proper estimate, you are best off trying it out. For the data model allocation, you can load one field at a time to see how much each of them allocates.
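The field-at-a-time measurement might look like this; the QVD and field names are placeholder assumptions:

```qlikview
// Hypothetical sketch: reload with a single field to gauge its footprint.
// [BigTable.qvd] and CustomerID are placeholders for your own source and fields.
OneField:
LOAD CustomerID          // swap in one field at a time between reloads
FROM [BigTable.qvd] (qvd);

// After each reload, compare the app's size on disk and the qv.exe memory
// usage to see roughly how much that one field contributes.
```

Repeating this for the wider fields quickly shows which ones dominate the data model size.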