Thanks for the answer. The 100 million really is the result set after aggregation (there are far more rows in the actual tables), and all aggregation is done in the DB. I am just loading the aggregated result from the DB into QlikView and then into a QVD. And still I am not able to accomplish this in any reasonable amount of time.
How long does it generally take to read say 10 million rows of 20 columns into QV?
It could depend on many factors: your RAM size, CPU, DB source (location), network traffic, and calculation complexity in the script, to mention a few. It could be as fast as 2 minutes; it could be never-ending (as in your case). Therefore, you should probably look into those factors and try to improve there.
Yes, the SELECT query is aggregating. The script is something like the following:
GROUP BY x, y
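For context, the full script is roughly of this shape (a sketch; the table and column names source_table, measure, and the QVD path are placeholders, not my real names):

```
Result:
LOAD *;
SQL SELECT x, y, SUM(measure) AS total
FROM source_table
GROUP BY x, y;

STORE Result INTO result.qvd (qvd);
```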
But it is my understanding that the SELECT is handled at the database and QlikView just loads the results. Please correct me if I am mistaken.
Also, some other BI tools offer an option of dynamic querying: the reports have a prompt page, and depending on the values selected we can filter the tables in the query. In other words, we never query the entire table, only a subset of it. Can we do something similar in QV?
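To illustrate what I mean, something along these lines in the load script, where a variable stands in for the prompt value (a hedged sketch; vRegion, big_table, and region are hypothetical names, and the variable would presumably be set from an input box or a prior selection rather than hard-coded):

```
// hypothetical: vRegion would come from a prompt/input box
LET vRegion = 'EMEA';

Facts:
LOAD *;
SQL SELECT x, y, SUM(measure) AS total
FROM big_table
WHERE region = '$(vRegion)'
GROUP BY x, y;
```

My understanding is that the dollar-sign expansion $(vRegion) is substituted into the SQL pass-through before it is sent to the database, so only the filtered subset ever leaves the DB. Is that the recommended pattern?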