That seems like a lot of information to display in any object, especially if you try to show those 4 million rows from a laptop. I'd first try a smaller number of rows, say 100 K and then 500 K, and see whether you still get that error.
I'd also try it on a server with Excel 2007 or later installed.
Hope that helps.
When I'm not trying to display all the rows, everything works fine. Also, I am able to display and export the entire dataset when I'm not using a pivot table, so it seems that the pivot table uses a lot more virtual memory; is that the case?
The reason I chose QlikView is that I was told it would handle these large datasets better than, for instance, Microsoft Access, which has a 2 GB limit. As I said, the complete dataset will be more than double the size of the current one, around 10 million lines; will I be able to use QlikView at all with datasets that size? Also, a dedicated server is out of the question, I'm afraid; everything has to run on a single computer. The data, stored in .QVD files, will be copied from the network to the computer that then runs my QlikView "interface" to extract the wanted data.
Indeed, QlikView can handle large datasets in memory; you can read about real cases handling several hundred million and even billions of rows on one or several servers.
But being in memory means you need enough memory to hold the entire associative model, and here you are right: pivot tables are not the best objects in terms of performance. A pivot table must be able to compute aggregations for every combination of its dimensions, which takes extra memory and time compared with a straight table.
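To illustrate the difference (this is a scaled-down pandas sketch with made-up numbers, not QlikView internals): a straight/long table only stores the rows that were actually observed, while a cross-tab must materialize a cell for every dimension combination, filled or not.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
hours, customers, n_obs = 1000, 500, 5000   # only ~1% of hour x customer pairs observed

# Straight-table form: one row per observation
long_df = pd.DataFrame({
    "hour": rng.integers(0, hours, n_obs),
    "customer": rng.integers(0, customers, n_obs),
    "value": rng.random(n_obs),
})

# Pivot form: one cell for every observed hour x customer combination
wide = long_df.pivot_table(index="hour", columns="customer",
                           values="value", aggfunc="sum")

print(long_df.shape)   # (5000, 3) -- 15 000 stored values
print(wide.shape)      # close to (1000, 500) -- up to 500 000 cells, mostly empty
```

The cell count of the pivot grows multiplicatively with each extra dimension, which is why a pivot over millions of rows needs far more memory than the same data shown as a straight table.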
Anyway, QlikView is not a reporting or visualization tool but an analytical tool, so displaying millions of rows in one pivot table would not make much sense.
The model you lay out using QVDs is the right one, and an application with 10 million rows should run just fine on a laptop or desktop computer. Actually, dozens of millions of rows work fine in my virtual machine.
Hope that makes sense.
Ok, then at least I know my efforts will not be in vain! Thank you!
Now, however, if we disregard viewing the pivot table and only try to export it as a .csv file, which is my end goal, I still get the error "Internal inconsistency, type D, detected." I really want to be able to extract the chosen data as a pivot table (or some other way that achieves the end result shown in my original post). Is this possible, or does QlikView still have to hold the table in memory when exporting it, thereby hitting the same problem as when viewing it?
Do you know of another way to extract the data in the needed format so it can later be imported into Excel? I think that if the exported file is in this form, the number of rows and columns should be below what Excel can handle: roughly 30000 rows (hours of the day) and 12000 columns (customers). However, if the exported file turns out to be too large for Excel, is there a way to warn the user before exporting? I'm thinking of a "warning light" connected to a macro that lights up when the data to be exported exceeds a certain size.
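For the size check itself (outside QlikView), here is a minimal Python sketch that scans an exported CSV and compares it against the Excel 2007+ worksheet limits, which are fixed at 1,048,576 rows by 16,384 columns; the function name and the idea of a pre-import check are my own, not a QlikView feature:

```python
import csv

# Excel 2007+ worksheet limits (fixed by the .xlsx file format)
EXCEL_MAX_ROWS = 1_048_576
EXCEL_MAX_COLS = 16_384

def fits_in_excel(path):
    """Return (fits, n_rows, max_cols) for a CSV export."""
    n_rows, max_cols = 0, 0
    with open(path, newline="") as f:
        for row in csv.reader(f):
            n_rows += 1
            max_cols = max(max_cols, len(row))
    fits = n_rows <= EXCEL_MAX_ROWS and max_cols <= EXCEL_MAX_COLS
    return fits, n_rows, max_cols
```

By these limits, the roughly 30000 x 12000 export described above would fit in a single Excel 2007+ worksheet; a check like this could drive the "warning light" whenever either dimension gets close to the maximum.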
Again, thank you for your help!