Anonymous
Not applicable

Real QVD Dynamic Generation.

Hi,

Has anyone generated QVDs with more than 100 million rows and 10 columns of type char(50), with almost all distinct values in each column?

Using:


Eclipse Java EE with Java 1.8

Laptop: i7 with 32 GB RAM and an SSD

Input data from any table in any database (JDBC).

In some cases I have used encrypted columns (AE), but that is another discussion.

I need to compare performance with other installations.
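For context, the JDBC input side looks roughly like the sketch below. The connection URL, table, and column names are placeholders, and the QVD-writing step itself is omitted; the key point is streaming with a fetch size so the driver doesn't buffer all 100 M rows in the heap.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class QvdExport {
    public static void main(String[] args) throws Exception {
        // Placeholder URL/credentials; any JDBC source works the same way.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost/db", "user", "pass")) {
            con.setAutoCommit(false);        // some drivers only stream outside auto-commit
            try (PreparedStatement ps = con.prepareStatement(
                    "SELECT c1, c2, c3 FROM big_table")) {
                ps.setFetchSize(50_000);     // stream rows instead of buffering them all
                try (ResultSet rs = ps.executeQuery()) {
                    long rows = 0;
                    while (rs.next()) {
                        // hand each row to the QVD writer here (omitted)
                        rows++;
                    }
                    System.out.println("read " + rows + " rows");
                }
            }
        }
    }
}
```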

Thanks

6 Replies
oknotsen
Master III

Okay, you got me curious:

What performance aspect are you trying to measure?

Knowing that might result in alternative solutions.

May you live in interesting times!
Anonymous
Not applicable
Author

* Execution time (see the sketch below this list).

* RAM used. Of course, you can force it to use all available RAM.

* CPU % is not important.

For 99% of users, QV Desktop will hang when generating a file with 100 M rows.
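For illustration, a minimal way to capture both numbers in Java; the heap figure from Runtime is approximate, but it's good enough for comparing installations:

```java
public class PerfProbe {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long t0 = System.nanoTime();

        // ... run the QVD generation here ...

        long elapsedMs = (System.nanoTime() - t0) / 1_000_000;
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        System.out.println("elapsed: " + elapsedMs + " ms, heap in use: " + usedMb + " MB");
    }
}
```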

nafi_qlikview
Partner - Contributor II

I doubt you'll be able to do this on this machine. I've tried to load ~400 million rows with a distinct key, and that alone consumed about 8-10 GB of RAM. Hence, 10 distinct char fields will be very tricky. As soon as you start getting rid of distinct values, several hundred million records won't be a problem anymore (memory-wise; QV only stores each value once and then uses bit-stuffed pointers to reference it in each record). What do you need 100 million distinct values for anyway?
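To illustrate the pointer point, here's a toy sketch of dictionary encoding. It is not QV's actual implementation, but it shows why 100 million distinct char(50) values blow up memory while repeated values barely cost anything:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DictionaryColumn {
    private final Map<String, Integer> index = new HashMap<>();
    private final List<String> symbols = new ArrayList<>(); // each distinct value stored once
    private final List<Integer> rows = new ArrayList<>();   // per-row pointer into symbols

    public void add(String value) {
        Integer id = index.get(value);
        if (id == null) {                 // first time we see this value
            id = symbols.size();
            symbols.add(value);
            index.put(value, id);
        }
        rows.add(id);                     // a repeated value costs one pointer, not one string
    }

    public int distinctCount() { return symbols.size(); }
}
```

In this toy version each pointer is a full int; QV packs the pointers into just enough bits to address the symbol table, which is why low-cardinality fields are so cheap.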

I reckon your QV hanging is caused by the system starting to swap to disk. Check your min and max memory values in QV's settings.

marcus_sommer

I think this would be too much for your RAM. Besides considering whether you really need all of these columns, you should try to reduce the number of distinct values, perhaps by splitting fields or replacing field content with some logic. It's more difficult with strings than with numbers, as described here: The Importance Of Being Distinct. But with your resources, I believe you have no option other than a workaround like the one mentioned above; a sketch of the splitting idea follows below.
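A toy sketch of the splitting idea, with a made-up value: one long string field becomes two shorter fields, and if the halves repeat across rows, each half's symbol table is much smaller than the original one:

```java
public class FieldSplit {
    // Split one char(50)-style value into two shorter fields.
    public static String[] split(String value) {
        int mid = value.length() / 2;
        return new String[] { value.substring(0, mid), value.substring(mid) };
    }

    public static void main(String[] args) {
        String[] parts = split("ABCDEFGH12345678"); // made-up sample value
        System.out.println(parts[0] + " | " + parts[1]);
    }
}
```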

- Marcus

Anonymous
Not applicable
Author

I'm trying to push the QVD format to its limits.

10 GB of RAM is good for 400 million rows.

Of course, we can create n QVDs (where n is the number of columns) from just one table, each with 2 columns: id (row number) and the column value.
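Something like this sketch; the table name is a placeholder and the per-column QVD writing is omitted:

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.sql.Statement;

public class ColumnSplitter {
    // Walk the source once and emit (rowId, value) pairs, one stream per column,
    // so each column can become its own two-column QVD.
    public static void export(Connection con, String table) throws SQLException {
        try (Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("SELECT * FROM " + table)) {
            ResultSetMetaData md = rs.getMetaData();
            int cols = md.getColumnCount();
            long rowId = 0;
            while (rs.next()) {
                rowId++;
                for (int c = 1; c <= cols; c++) {
                    String value = rs.getString(c);
                    // write (rowId, value) to the QVD for column c here (omitted)
                }
            }
        }
    }
}
```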

peter_turner
Partner - Specialist

I'm not sure there is an actual limit to what a QVD can contain; it's more a limit of the PC that has to build the table in RAM before it can be stored to disk.

Qlik support might be able to tell you more, or you could always create a massive Amazon cloud instance of QlikView to run this test.

Otherwise, maybe look at segmenting the large QVD into smaller (but still large) chunks of data, maybe 10 million rows each?

This could be useful when loading the data back in, as you might be able to select which files to load instead of loading the single huge QVD and filtering it for the data you want.
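As a rough sketch in Java (since that's what the files are being generated with), the rollover can be as simple as mapping each row index to a chunk file; the name pattern and chunk size here are just examples:

```java
public class Chunker {
    static final long CHUNK = 10_000_000L;       // rows per output file

    // Map a global row index to the chunk file it belongs in.
    public static String fileFor(long rowIndex) {
        long part = rowIndex / CHUNK;            // 0-based chunk number
        return String.format("big_table_%03d.qvd", part);
    }

    public static void main(String[] args) {
        System.out.println(fileFor(0));              // big_table_000.qvd
        System.out.println(fileFor(25_000_000L));    // big_table_002.qvd
    }
}
```

When loading back, a wildcard load over the chunk files (or a load of just the chunks you need) then replaces loading one huge QVD and filtering it.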