
Store new rows into QVD without loading QVD? (Qlik Sense)


I am wondering if, in Qlik Sense, there is a way to store new rows into a QVD without needing to load the QVD first.

For example:

  • I am doing an incremental load off of a date, and there are 100 new rows of data.
  • The QVD that I need to store those 100 new rows into contains over 60 million rows.

Is there a way to store those 100 new rows into the QVD with 60 million rows without having to load the 60 million rows?


Current Script

Incremental_Table:
Load *
Resident Table1
Where IsNull(deletedDateTime); // Only loading rows that do not have a deleted timestamp

Drop Table Table1;


// QVD with over 60 million rows
Concatenate (Incremental_Table)
Load *
FROM [lib://QlikData/Fact.qvd] (qvd);


Store Incremental_Table into [lib://QlikData/Fact.qvd] (qvd);

Drop Table Incremental_Table;


Thank you!

3 Replies
Creator II


Sorry for my English. Maybe this is helpful.

This article is about QlikView, but you can do the same in Qlik Sense without any problems.



Thanks for the article. I am currently doing an incremental load.

In the example below (lines 34-41), they are concatenating the existing QVD to the new rows. My issue is that it is a large QVD (over 60 million rows), and I was wondering whether there is a way to store the new rows into the QVD without having to load the existing QVD into the app. I think the answer is no, but I figured I would ask.



MVP & Luminary

It's not possible to just append records to a QVD without loading it, because Qlik doesn't use linear row storage. Instead, a QVD uses a column-based storage format that stores only the distinct values of each field and creates a bit-stuffed pointer table to link those values to the records. This logic/structure needs to be re-created on each change.

Usually this isn't a problem, because an optimized load from a QVD is very fast even with large datasets: no extra/new processing is needed to load the data, it is just transferred from the storage/network into RAM. How long it takes therefore depends entirely on the performance of the storage and network.
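
Roughly, the difference looks like this (just a sketch; the commented-out field names are only placeholders). The rule of thumb is: no transformations, no calculated fields, at most a single-parameter Where Exists().

// Optimized: the data is streamed straight from the file into RAM
Fact:
Load *
FROM [lib://QlikData/Fact.qvd] (qvd);

// Not optimized: any expression forces row-by-row processing, e.g.
// Load *, Year(OrderDate) as OrderYear FROM [lib://QlikData/Fact.qvd] (qvd);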

Besides this, I suggest reversing the order within your incremental approach: load the larger historical QVD data first and then add the new records.
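
For example, roughly (a sketch; Table1 and the QVD path are from your script, the Fact table name is just a placeholder):

// Load the big historical QVD first - this stays an optimized load
Fact:
Load *
FROM [lib://QlikData/Fact.qvd] (qvd);

// Then append only the new records from the staging table
Concatenate (Fact)
Load *
Resident Table1
Where IsNull(deletedDateTime);

Drop Table Table1;

Store Fact into [lib://QlikData/Fact.qvd] (qvd);
Drop Table Fact;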

- Marcus