mmkhan12
Partner - Contributor II

How to handle large data sets while loading in QlikView

Hi All,


We are facing difficulty loading a very large volume of data in QlikView.


In our data model there is one large fact table with ~15 million rows added each day, roughly 480-500 million rows per month (41 fields).

The rest of the tables together contain ~1 million rows, for a total of about 501 million records.


The time to load from the database is more than 3 hours.

I even stored the same data in QVDs and loaded everything from the QVDs; that still took around 2 hours to complete.

  1. How should we handle a large data set with the number of rows described above?
  2. What procedure should we follow to load this type of data set?
  3. Can the load be run at a frequency of every 15 or 30 minutes?
  4. Is any third-party integration required to handle such an enormous volume of data?
  5. What are the infrastructure requirements to handle such data sets?
  • RAM size
  • Disk size
  • CPU

Your assistance will be much appreciated.

Mustafa

2 Replies
Digvijay_Singh

Or
MVP

One important note - if loading from QVD takes that long compared to loading from the DB, something is not right with the world. Reading from a properly-prepared QVD should be significantly quicker than you describe, even with a large number of rows, assuming your server has enough RAM to handle the data. If it is going as slowly as you describe, there are two likely explanations:

A. You are not correctly using the QVD process:

1) Load your data from the database.

2) Make any modifications to the data necessary. In particular, any elimination of data (WHERE conditions) or grouping (GROUP BY) should happen here.

3) Save the fully-prepared data to QVD.

4) When reading the QVD back, do so without adding any new filtering or grouping conditions, so that QlikView can use its fast "optimized" QVD read (see the sketch after these steps).
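
For illustration, here is a minimal sketch of that pattern in load script. The table name (Orders), field names, date filter, and file paths are all hypothetical - substitute your own:

// Steps 1-2: load from the database, applying all row elimination
// (WHERE) and grouping (GROUP BY) at this stage
Orders:
LOAD OrderID,
     OrderDate,
     Amount;
SQL SELECT OrderID, OrderDate, Amount
FROM dbo.Orders
WHERE OrderDate >= '2016-01-01';

// Step 3: save the fully-prepared table to QVD
STORE Orders INTO [..\QVD\Orders.qvd] (qvd);
DROP TABLE Orders;

// Step 4: read it back with a plain LOAD * and no WHERE/GROUP BY,
// which keeps the read an "optimized" QVD load
Orders:
LOAD * FROM [..\QVD\Orders.qvd] (qvd);

In practice, steps 1-3 would typically live in a separate extractor document running on a schedule, and step 4 in the application your users actually open.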

Note that it may be better for you to use incremental reload techniques: save each new increment of data into a new QVD file and, when loading, use a script loop to read all of these files one at a time, along the lines of the sketch below.
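
A minimal sketch of that incremental pattern, again with hypothetical names (an Orders table with a LoadDate timestamp marking when each row arrived) - how you identify "new" rows in practice depends on your source database:

// Pull only rows added since the previous run; ReloadTime() returns
// the timestamp of the previous data load of this document
LET vLastReload = Timestamp(ReloadTime(), 'YYYY-MM-DD hh:mm:ss');
LET vToday = Date(Today(), 'YYYYMMDD');

Increment:
LOAD OrderID, LoadDate, Amount;
SQL SELECT OrderID, LoadDate, Amount
FROM dbo.Orders
WHERE LoadDate >= '$(vLastReload)';

// Store each day's increment in its own dated QVD
STORE Increment INTO [..\QVD\Orders_$(vToday).qvd] (qvd);
DROP TABLE Increment;

// Loop through all the daily QVDs; identical field lists make QlikView
// auto-concatenate them into one table, and each read stays optimized
FOR EACH vFile IN FileList('..\QVD\Orders_*.qvd')
    Orders:
    LOAD * FROM [$(vFile)] (qvd);
NEXT vFile

The benefit is that only the ~15 million new rows hit the database each day, while the historical QVDs are only ever read, never rebuilt.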

B. Your machine does not have enough RAM to handle the data volume, and the additional time is being used for swap-writing to your hard drive. You should easily be able to detect this by watching the memory usage during loading - if it's constantly at or close to 100%, this is likely your problem. If this is the case, try adding RAM. Switching to faster (SSD) disks might also help to some extent, and will also reduce some of the time required to save your document.