6 Replies Latest reply: Mar 31, 2015 11:28 AM by Peter Turner

    Real QVD Dynamic Generation.

    Antonio Caria

      Hi,

       

      Has anyone generated QVDs with more than 100 million rows and 10 columns of char(50) type, with almost all values distinct in each column?

       

      Using:


      Eclipse Java 1.8  EE

      Laptop i7 with 32 GB RAM and an SSD disk

       

      Input data from any table in any database (JDBC).

       

      In some cases I have used encrypted columns (AE), but that is another discussion.

       

      I need to compare performance with other installations.

       

      Tkx

        • Re: Real QVD Dynamic Generation.
          Onno van Knotsenburg

          Okay, you got me curious:

          What performance aspect are you trying to measure?

           

          Knowing that might result in alternative solutions.

          • Re: Real QVD Dynamic Generation.

            I doubt you'll be able to do this on this machine. I've tried to load ~400 million rows with a distinct key, and that alone consumed about 8-10 GB of RAM. Hence, 10 nearly-distinct char fields will be very tricky. As soon as you start getting rid of distinct values, multiple hundreds of millions of records won't be a problem anymore (memory-wise; QV only stores each value once and then uses bit-stuffed pointers to reference it in each record). What do you need 100 million distinct values for, anyway?
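            A back-of-envelope check of that symbol-table model (a rough sketch; the `estimateBytes` helper and its constants are my own assumptions, not QlikView internals):

```java
public class QvMemoryEstimate {
    // Rough model: each distinct value is stored once in a per-field symbol table,
    // and every record stores a bit-stuffed pointer into that table.
    static long estimateBytes(long rows, int fields, long distinctPerField, int avgValueBytes) {
        long symbolBytes = (long) fields * distinctPerField * avgValueBytes;
        // Pointer width: ceil(log2(distinct count)) bits.
        int bits = 64 - Long.numberOfLeadingZeros(Math.max(1, distinctPerField - 1));
        long pointerBytes = rows * fields * bits / 8;
        return symbolBytes + pointerBytes;
    }

    public static void main(String[] args) {
        // 100M rows, 10 char(50) fields, nearly every value distinct:
        System.out.println(estimateBytes(100_000_000L, 10, 100_000_000L, 50) / (1L << 30) + " GiB");
        // Same table if each field has only 10,000 distinct values:
        System.out.println(estimateBytes(100_000_000L, 10, 10_000L, 50) / (1L << 30) + " GiB");
    }
}
```

            On these assumptions the fully-distinct case comes out near 50 GiB for the symbol tables alone, well past 32 GB of RAM, while the deduplicated case fits comfortably.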

             

            I reckon your QV hanging is caused by the system starting to swap to disk -- check your min and max memory settings in QV.

            • Re: Real QVD Dynamic Generation.
              Marcus Sommer

              I think this would be too much for your RAM. Besides asking whether you really need all of these columns, you should try to reduce the number of distinct values, maybe by splitting fields or replacing field contents with some logic. It's more difficult with strings than with numbers, as shown here: The Importance Of Being Distinct - but with your resources I believe you have no option other than a workaround like the one mentioned above.
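              A tiny illustration of the splitting idea (hypothetical data, just to show the effect on distinct counts): one field holding 14,400 day-and-minute combinations collapses into 10 distinct days plus 1,440 distinct minutes once it is split.

```java
import java.util.HashSet;
import java.util.Set;

public class SplitDistinct {
    // Count distinct values before and after splitting one combined field.
    static int[] counts() {
        Set<String> combined = new HashSet<>();
        Set<Integer> days = new HashSet<>(), minutes = new HashSet<>();
        // Hypothetical data: one event per minute over 10 days.
        for (int i = 0; i < 10 * 1440; i++) {
            int day = i / 1440, minute = i % 1440;
            combined.add(day + "T" + minute);  // one field: every value is distinct
            days.add(day);                     // split fields: tiny symbol tables
            minutes.add(minute);
        }
        return new int[]{combined.size(), days.size(), minutes.size()};
    }

    public static void main(String[] args) {
        int[] c = counts();
        System.out.println(c[0] + " distinct combined values vs " + c[1] + " + " + c[2] + " after splitting");
    }
}
```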

               

              - Marcus

              • Re: Real QVD Dynamic Generation.
                Peter Turner

                I'm not sure there is an actual limit on what a QVD can contain; it's more a limit of the PC that has to generate the table in RAM before it can be stored to disk.

                Qlik support might be able to tell you more, or you could always create a massive Amazon cloud instance of QlikView to run this test.

                 

                Otherwise, maybe look at segmenting the large QVD into smaller (but still large) chunks of data, maybe 10 million rows each?

                This could be useful when loading the data back in, as you might be able to select which files to load instead of loading the single huge QVD and filtering it for the data you want.
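                A sketch of that segmenting idea (the 10-million slice size and the file-name pattern are just placeholders, not anything QlikView prescribes):

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkPlanner {
    // Split [0, totalRows) into fixed-size slices; each slice becomes its own QVD file.
    static List<long[]> slices(long totalRows, long sliceSize) {
        List<long[]> out = new ArrayList<>();
        for (long start = 0; start < totalRows; start += sliceSize) {
            out.add(new long[]{start, Math.min(start + sliceSize, totalRows)});
        }
        return out;
    }

    public static void main(String[] args) {
        for (long[] s : slices(100_000_000L, 10_000_000L)) {
            // Hypothetical naming scheme; a later load can pick only the slices it needs.
            System.out.println("BigTable_" + s[0] + "_" + s[1] + ".qvd");
        }
    }
}
```

                In practice you would slice on a business key such as a date range rather than a raw row offset, so a later load can pick out exactly the files matching the user's selection.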