aniruddhyadutta
Creator

Using partial load instead of incremental load

Hi All,

I have a table where I have designed an incremental load. The table has a volume of around 10 million records. I have insert/update logic running in my incremental load, so it picks up new/modified records and concatenates them into the history QVD every 5 minutes. This way my QVD file is growing in size every hour.
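For illustration, the pattern is roughly this (a minimal sketch; the connection and the field names TransactionID and ModifiedDate are placeholders, not my real ones):

// 1. Pull only new/changed rows from the source (assumes a prior LIB CONNECT)
Transactions:
SQL SELECT * FROM SourceTable
WHERE ModifiedDate >= '$(vLastExecTime)';

// 2. Append the unchanged history, skipping keys that were just reloaded
//    (the "update" part). Note: NOT Exists() can break the optimized load.
Concatenate (Transactions)
LOAD * FROM [lib://Data/History.qvd] (qvd)
WHERE NOT Exists(TransactionID);

// 3. Overwrite the history QVD with the merged result
STORE Transactions INTO [lib://Data/History.qvd] (qvd);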

Once the incremental load finishes, it triggers my dashboard. Now the problem is that every time my dashboard runs, it has to read through all the days of data, i.e. the history data in the QVD along with the new records that came in. It's taking so long that the business can't afford it.

Can anyone suggest an optimization solution?

While researching I found that we can do a partial reload, but somehow I cannot relate it to my case, since all the examples talk about reading a new table. What if I could use the partial reload option in my application so that it doesn't have to re-read the records it has already gone through?
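From the QlikView examples, I understand the pattern would be something like this (a sketch with placeholder names; "only" makes the statement run solely during a partial reload, while tables with no prefix are kept in memory untouched):

Transactions:
Add only LOAD * FROM [lib://Data/Increment.qvd] (qvd)
WHERE NOT Exists(TransactionID);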

I am using Qlik Sense 3.2. (I think the partial reload option is not native there, but that's a different question.)

3 Replies
rwunderlich
Partner Ambassador/MVP

Can you post the script you are using to load the QVDs in your final dashboard?  Using an optimized load, loading 10M rows from multiple QVDs shouldn't take very long.
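For reference, an optimized load is one where the LOAD statement from the QVD has no transformations; only field renames and a single-field WHERE Exists() are allowed, otherwise the engine falls back to a much slower standard load. A minimal sketch with placeholder names:

// Optimized: straight LOAD from the QVD, no expressions or calculated fields
Facts:
LOAD * FROM [lib://Data/History.qvd] (qvd);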

-Rob

http://masterssummit.com

http://qlikviewcookbook.com

aniruddhyadutta
Creator
Author

Hi Rob,

Thanks a lot for taking the time to reply.


I am attaching the script log. I have an issue in a certain area where I am doing a GROUP BY. I have created a calendar table to build intervals of 5 min, 15 min, and 60 min, and then I join the intervals with the transaction data that falls under them. I have removed some irrelevant parts from the script and also changed the business column names for confidentiality, but you should get the general data flow.
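I was also wondering whether deriving the buckets directly with Floor() instead of joining against the interval table would help. A sketch of what I mean (placeholder names; the added expressions make this a non-optimized load):

// 1/288 of a day = 5 min, 1/96 = 15 min, 1/24 = 60 min
Bucketed:
LOAD *,
     Timestamp(Floor(TransactionTime, 1/288)) AS Interval5Min,
     Timestamp(Floor(TransactionTime, 1/96))  AS Interval15Min,
     Timestamp(Floor(TransactionTime, 1/24))  AS Interval60Min
FROM [lib://Data/History.qvd] (qvd);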


Please let me know your suggestions on how I can optimize. Also, I am using the incremental load to concatenate new/updated data into the transaction QVD and then reading the entire QVD into my application at each run. But if any other approach is more efficient, like a binary load, for an application that has to handle 15-20M records, please let me know.
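For the binary load idea, my understanding is that it would look something like this as the very first statement of the consuming app, cloning the whole data model so the app never re-reads the QVDs (the path is a placeholder; in Qlik Sense the target is referenced by app ID rather than a .qvw path):

Binary [lib://Apps/TransactionModel.qvw];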

rwunderlich
Partner Ambassador/MVP

Group By can be slow.  Because you have inserts & updates, and I assume you have to redo the group by over the entire set, I don't see any quick solutions.
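If your updates only ever touch a bounded recent window (a big if), one pattern worth testing is to persist the older aggregates and redo the group by over just that window. A rough sketch with placeholder names:

// Re-aggregate only the recent window
RecentAgg:
LOAD Interval5Min,
     Sum(Amount) AS TotalAmount
FROM [lib://Data/History.qvd] (qvd)
WHERE TransactionDate >= '$(vWindowStart)'
GROUP BY Interval5Min;

// Keep previously stored aggregates for everything outside the window
Concatenate (RecentAgg)
LOAD Interval5Min, TotalAmount
FROM [lib://Data/Aggregates.qvd] (qvd)
WHERE NOT Exists(Interval5Min);

STORE RecentAgg INTO [lib://Data/Aggregates.qvd] (qvd);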

-Rob