The application simply will not run, and will crash, if the data size is larger than the available RAM.
Using an SSD has no impact except for the initial pull of data into RAM; once the application is loaded with data, there is no further impact when users perform analysis or make selections.
PS: For very large data volumes, you can look into Direct Discovery.
If the size of your QlikView application is bigger than your RAM, it won't work. Even if it's only slightly smaller, it won't make much sense, because you need additional RAM to perform calculations and to cache their results. The important point here is the (uncompressed) size of the QVW, not the size of your raw data. The reason is that Qlik uses a quite effective way of storing data, which often results in compression rates of 80-90%, depending essentially on the cardinality of the fields.
There is no practical way to pre-calculate the required RAM consumption from the raw data size or the number of records alone; it will always depend on the data itself and on how the data model is designed, and RAM consumption is in particular not linear with the amount of data.
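To see why cardinality, not raw size, drives the footprint, here is a minimal back-of-the-envelope sketch in Python. It assumes a simplified columnar model (a symbol table of distinct values per field, plus a bit-packed index per row); this is only an illustrative approximation, not the actual engine internals, and the function name and parameters are invented for the example.

```python
import math

def estimate_field_bytes(num_rows, cardinality, avg_value_bytes):
    """Rough estimate for one field under a simplified columnar model:
    - a symbol table storing each distinct value once
    - one bit-packed pointer per row into that table
    Illustrative only; real QlikView storage internals differ."""
    symbol_table = cardinality * avg_value_bytes
    # bits needed to address `cardinality` distinct symbols
    bits_per_row = max(1, math.ceil(math.log2(max(cardinality, 2))))
    row_pointers = num_rows * bits_per_row / 8
    return symbol_table + row_pointers

rows = 10_000_000
raw = rows * 8  # 8 bytes per value in the source, ~80 MB raw

low_card  = estimate_field_bytes(rows, 100, 8)   # e.g. a country field
high_card = estimate_field_bytes(rows, rows, 8)  # e.g. a unique key field
print(f"raw: {raw/1e6:.0f} MB, "
      f"low-cardinality: {low_card/1e6:.1f} MB, "
      f"high-cardinality: {high_card/1e6:.1f} MB")
```

Under this model the low-cardinality field shrinks to a fraction of its raw size, while the unique-key field can exceed it, which is why two tables with the same row count can have wildly different RAM footprints.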