If you plug 10TB as a raw figure into a formula to calculate RAM needs, you will most likely grossly overestimate your true RAM requirements for Qlik.
It's important to think about data requirements in terms of a limited data model, not a raw database. That doesn't come across in the documentation, where we need to do a better job.
A bit more: a single QlikView app is supported by a single data model that will cover a series of dimensions and calculate a series of KPIs. It will answer a 'family' of questions about that data model. It will give a user access to any intersection or possible summary in that data, and it's very easy to overload a user with information.
If you are familiar with BO, Cognos, or any other enterprise reporting platform, a customer will likely have more than one data model per database to support different families of questions. Building a giant model (in any tool) off a large database makes it harder and harder to serve all needs, and ultimately becomes an exercise in compromising everything from agility to performance. It's better, and in QlikView mandatory, to address a sizable chunk of data in pieces that users can manage.
In Qlik, for a 10TB database you will likely need more than one data model to 'package' all the information into a format that a user can actually consume. That means carving out 'pieces' of the database for each QlikView app and sizing those pieces (probably just the first piece to begin with), not applying a blanket rule like 10TB disk = 4TB RAM.
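To make the point concrete, here is a minimal sizing sketch. All the ratios in it (the flat disk-to-RAM factor, the app's source-data slice, the compression and concurrency figures) are illustrative assumptions for demonstration only, not official Qlik guidance:

```python
# Illustrative sizing sketch -- every ratio below is an assumption
# chosen for demonstration, not official Qlik guidance.

RAW_DB_TB = 10.0

# Naive approach: apply a flat disk-to-RAM ratio to the whole database,
# e.g. "10TB disk = 4TB RAM".
naive_ratio = 0.4
naive_ram_tb = RAW_DB_TB * naive_ratio  # grossly overestimates real need

# Model-based approach: size only the slice the first app's data model
# actually pulls from the database (hypothetical first app).
app_source_tb = 0.5           # source data the first app consumes (assumed)
compression = 0.25            # in-memory footprint vs source size (assumed)
concurrency_overhead = 2.0    # headroom for sessions and cache (assumed)

app_ram_tb = app_source_tb * compression * concurrency_overhead

print(f"naive estimate:     {naive_ram_tb:.2f} TB")
print(f"first-app estimate: {app_ram_tb:.2f} TB")
```

Under these made-up numbers the per-app estimate comes out an order of magnitude below the naive one, which is the gap the raw-figure formula hides.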
I'm not saying there aren't odd use cases that need a very deep (many rows) data set. They exist. But they aren't the norm, hence my comment.