4 Replies Latest reply: Jul 29, 2013 8:19 AM by niklashjern RSS

    Problem exporting large datasets as .csv

      Hi!

       

      I've been having some problems exporting files from QlikView to a .csv file. My project is to build a simple interface where the end user can easily select which data to export based on a number of preferences, such as where a customer lives, how big their house is, etc. The data in question is daily measurements over three years, so each customer has one measurement for every date. The exported file should be a table where one "axis" holds the customer IDs and the other "axis" holds the dates. I found a way for QlikView to create this table in the form of a pivot table, so far so good.

       

      Example:

       

      Date / CustomerID   111113   111114   111115
      20120101                 4        5        6
      20120102                12       12       12
      20120103                15       15       15
      20120104                 1        1        1

       

      The problem comes when I de-select every preference and try to display the pivot table with all values for every customer. QlikView then hangs and displays the error message:

       

           OUT OF VIRTUAL AND/OR LOGICAL MEMORY, allocating     2 MB

       

      When I have the table minimized and try to export it by right-clicking and choosing export as .csv, I get the message:

       

           Internal inconsistency, type D, detected.

       

      So, my question is: is there a way to export a pivot table with all the data, or do I have to find another way of creating the needed output? The real problem is that the dataset I'm currently working with is just a small part of the total set, so the solution should work with files of pretty much arbitrary length.

       

      The data consists of one table holding the measured values, dates and customer IDs (roughly 4,000,000 lines) and another table containing customer information (roughly 12,000 lines). The two are linked together with the customer ID as a key.
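      If QlikView's pivot-table object keeps failing, one workaround is to do the pivoting outside QlikView entirely. Below is a minimal sketch in Python, assuming the long-format measurement table has first been dumped to a CSV with CustomerID, Date and Value columns (the file names and column names here are hypothetical, not something QlikView produces by default):

```python
import csv
from collections import defaultdict

def pivot_to_csv(source, target):
    """Pivot long-format rows (CustomerID, Date, Value) into a
    dates-by-customers table and write it out as CSV."""
    values = defaultdict(dict)   # values[date][customer_id] = value
    customers = set()
    with open(source, newline="") as f:
        for row in csv.DictReader(f):
            values[row["Date"]][row["CustomerID"]] = row["Value"]
            customers.add(row["CustomerID"])
    cols = sorted(customers)
    with open(target, "w", newline="") as f:
        writer = csv.writer(f)
        # Header row: one column per customer ID.
        writer.writerow(["Date"] + cols)
        # One output row per date; blank cell if a measurement is missing.
        for date in sorted(values):
            writer.writerow([date] + [values[date].get(c, "") for c in cols])

# Example (hypothetical paths):
# pivot_to_csv("measurements.csv", "pivoted.csv")
```

      Because this only holds a value dictionary in memory and streams one output row per date, a few million input rows should be no problem on an ordinary desktop machine.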

       

      I might also add that I am completely new to QlikView and to databases in general; I do, however, otherwise have pretty good computer/programming knowledge.

       

       

      Would really appreciate some help and please let me know if there is some vital information missing from my post!

        • Re: Problem exporting large datasets as .csv
          Miguel Angel Baeyens de Arce

          Hi,

           

          That seems like a lot of information to display in any object, especially if you try to show those 4 million rows on a laptop. I'd first try with a smaller number of rows, say 100 K, then 500 K, and see whether you keep getting that error.

           

          I'd also try it on a server with Excel 2007 or above installed.

           

          Hope that helps.

           

          Miguel

            • Re: Problem exporting large datasets as .csv

              When I'm not trying to display all the rows, everything works fine. Also, I am able to display and export the entire dataset when I'm not using a pivot table, so it seems that the pivot table uses a lot more virtual memory. Is this the case?

               

              The reason I chose QlikView is that I was told it would handle these large datasets better than, for instance, Microsoft Access, which has a 2 GB limit. As I said, the complete dataset will be more than double the size of the current one, around 10 million lines; will I be able to use QlikView at all with these datasets? Also, a dedicated server is out of the question, I'm afraid: everything has to run on a single computer. The data, located in .QVD files, will be copied from a network to the computer that will then run my QlikView "interface" to extract the wanted data.

               

              Niklas

                • Re: Problem exporting large datasets as .csv
                  Miguel Angel Baeyens de Arce

                  Hi Niklas,

                   

                  Indeed, QlikView can handle large datasets in memory; there are real cases of applications handling hundreds of millions, even billions, of rows on one or several servers.

                   

                  But being in memory means that you need enough memory to keep the whole associative model in it, and here you are right: pivot tables are not the best objects in terms of performance. Note that pivot tables need to be able to do all aggregations for every combination of dimensions, and that takes extra memory and time compared to straight tables.

                   

                  Anyway, QlikView is not a reporting or visualization tool but rather an analytical tool, so displaying millions of rows in one pivot table would not make much sense.

                   

                  The model you lay out using QVDs is the right one, and an application with 10 million rows should run just fine on a laptop or desktop computer. Actually, dozens of millions of rows work all right in my virtual machine.

                   

                  Hope that makes sense.

                   

                  Miguel

                    • Re: Problem exporting large datasets as .csv

                      Ok, then at least I know my efforts will not be in vain! Thank you!

                       

                      Now, however, if we disregard trying to view the pivot table and only try to extract it as a .csv file, which is my end goal, I still get the error "Internal inconsistency, type D, detected." I really want to be able to extract the chosen data as a pivot table (or in some other way that gets the end result seen in my original post). Is this possible, or does QlikView still have to have the table "in memory" when extracting it, thereby hitting the same problem as when trying to view it?

                       

                      Do you know of another way to extract the data in the needed format that can later be imported to Excel? I think that if the exported file is in this form, the number of rows and columns should be below what Excel can handle: roughly 30,000 rows as hours of the day and 12,000 columns as customers. However, if it turns out that the size of the exported file is too large for Excel, is there a way to warn the user before exporting? I'm thinking of a "warning light" connected to a macro that lights up when the data to be exported is above a certain size.
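                      The check behind such a warning light is just arithmetic against Excel's worksheet limits. A sketch of the logic (in Python rather than a QlikView macro, purely to show the calculation; the limits shown are for Excel 2007 and later — older versions allow only 65,536 rows and 256 columns, so 12,000 customer columns would not fit there):

```python
# Excel 2007+ worksheet limits.
EXCEL_MAX_ROWS = 1_048_576
EXCEL_MAX_COLS = 16_384

def fits_in_excel(n_dates, n_customers):
    """Return True if a dates-by-customers export (plus one header row
    and one date-label column) fits on a single Excel worksheet."""
    return (n_dates + 1) <= EXCEL_MAX_ROWS and (n_customers + 1) <= EXCEL_MAX_COLS

# The sizes mentioned above: ~30,000 date rows x ~12,000 customer columns.
# fits_in_excel(30000, 12000) evaluates to True, so that export should fit.
```

                      In QlikView the same counts could come from the current selection state, so the macro would only need to compare them against these constants before allowing the export.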

                       

                      Again, thank you for your help!

                       

                      Niklas