Yes and no. IT-specific actions (i.e. more hardware) aside, there is probably no way to solve your performance issue other than making your fact table smaller, which can basically only be done by either reducing the number of fields you are loading (find out about unused fields using the Document Analyzer, a QVW available here in the Community) or by reducing the timeline (e.g. loading data for only the current year).
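Restricting the timeline is just a WHERE clause in the load. As a sketch (Fact.qvd and OrderDate are made-up names for your fact table and its date field):

```
// Keep only the current year's rows
Fact:
LOAD *
FROM Fact.qvd (qvd)
WHERE Year(OrderDate) = Year(Today());
```

Note that a plain WHERE clause turns an optimized QVD load into a standard one; if the load itself becomes too slow, a single-field WHERE Exists() condition is the only filter that keeps it optimized.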
Instead of reducing rows, you can also reduce columns. Check which columns can be disregarded: besides the number of records, the columns you load are the other main factor in the size and performance of your data.
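In the script this means either listing only the fields you need in the LOAD, or dropping fields afterwards. For example (field names here are purely illustrative):

```
// Load only the columns you actually use ...
Fact:
LOAD OrderID, CustomerID, OrderDate, Amount
FROM Fact.qvd (qvd);

// ... or drop columns the Document Analyzer reports as unused
DROP FIELDS SessionGUID, RowModifiedTimestamp;
```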
Also, if you're still experiencing performance issues after doing the usual optimizations (working with a 3- or 4-tier data architecture, using optimized QVD loads, avoiding nested IFs in expressions, avoiding calculated dimensions and instead precalculating as much as possible in the script, etc.), then the next solution would be to increase the resources of your server (adding more RAM, for example).
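On the "optimized QVDs" point: a QVD load is only optimized when it is a plain field list (renames are fine) with at most a single-field WHERE Exists() condition; any transformation or other WHERE clause drops it back to standard speed. A minimal sketch (table and field names are made up):

```
// Optimized: straight field list, renames allowed
Orders:
LOAD OrderID,
     Amount AS OrderAmount
FROM Orders.qvd (qvd);

// Still optimized: single-field Exists() filter
OrderLines:
LOAD *
FROM OrderLines.qvd (qvd)
WHERE Exists(OrderID);

// NOT optimized: any transformation or other WHERE condition, e.g.
// LOAD OrderID, Amount * 1.2 AS GrossAmount FROM Orders.qvd (qvd);
```

The load statistics in the script execution window tell you whether a given QVD was read optimized or not.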
There are a few things you might want to check in order to help you optimize your document:
Eliminate unnecessary fields with unique values such as unused keys or time-stamps
If time-stamps are absolutely necessary, split them into several fields in order to avoid large indexes on those fields. The more distinct values a field has, the bigger the size of its index and hence the bigger your QlikView document. You may opt to break a time-stamp down into date, hours, minutes, seconds, and milliseconds; at that point you will often realize that seconds and milliseconds are not actually needed.
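A sketch of such a split, assuming a hypothetical TimeStamp field: Floor() strips the time part, and Floor(..., 1/24/60) rounds the time of day down to whole minutes, which drastically cuts the number of distinct values.

```
Fact:
LOAD
    Date(Floor(TimeStamp)) AS EventDate,
    // time of day truncated to the minute; drop this line
    // entirely if you do not need the time part at all
    Time(Floor(Frac(TimeStamp), 1/24/60), 'hh:mm') AS EventTime
FROM Fact.qvd (qvd);
```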
Add counter fields instead of using Count(DISTINCT ...). By adding a field with the value 1 to your table, you can use the Sum() function instead, which is a lot faster.
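For example (assuming each row of the hypothetical Orders table is one order, so the distinct count equals the row count):

```
// In the script: add a constant 1 per row
Orders:
LOAD OrderID,
     CustomerID,
     1 AS OrderCounter
FROM Orders.qvd (qvd);
```

In a chart you can then use Sum(OrderCounter) instead of Count(DISTINCT OrderID). If OrderID can repeat across rows, the two are no longer equivalent and you would need the DISTINCT count after all.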