Qlik Community

App Development

Discussion board where members can learn more about Qlik Sense App Development and Usage.

leonaschaaf
Contributor III

concatenate QVD with Snowflake data - memory error

I have a Qlik Sense app which needs to load a massive amount of data.
Originally the data was all coming from Snowflake. To make the app load faster, I decided to create a QVD from one of the tables, which holds static data.

So first I load the QVD and then I concatenate the "new" Snowflake data. The tables don't have the same number of columns, but that always worked fine when I just used the Snowflake tables and concatenated those.
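For reference, a forced concatenate of a QVD with fresh database rows would look roughly like this (all table, field, and connection names here are made up for illustration, not taken from the actual app):

```qlik
// Load the historic, static data from the QVD (optimized QVD load)
Facts:
LOAD *
FROM [lib://DataFiles/Facts_static.qvd] (qvd);

// Force-append the new Snowflake rows; fields missing on either side
// are simply filled with NULL in the combined table
Concatenate (Facts)
LOAD OrderID, OrderDate, Amount, NewFlag;
SQL SELECT ORDERID, ORDERDATE, AMOUNT, NEWFLAG
FROM MYDB.PUBLIC.ORDERS;
```

The explicit `Concatenate` prefix avoids Qlik's auto-concatenation rules (which only kick in when field sets match exactly).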

But when I do it now, it's giving me the "famous" -129 memory error.
I don't get why this happens. When I run this in QlikView it works without issues.

Why is this happening? 

Even if I first create a temp table out of the Snowflake table and then do the concatenate, I still get the error.
Any suggestions/solutions?

3 Replies
marcus_sommer
MVP & Luminary

I suggest looking into the document logs to see where it breaks and how many records were loaded. Maybe some filter didn't work as expected, or tables were not dropped correctly and the system now runs out of RAM, or afterwards it creates multiple synthetic keys or similar.
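For example, intermediate tables that are no longer needed can be dropped explicitly so they don't stay in RAM or link into synthetic keys (table names here are hypothetical):

```qlik
// Release intermediate tables from RAM once the final table is built;
// leftover tables with shared field names also cause synthetic keys
DROP TABLES TmpOrders, TmpCustomers;
```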

- Marcus

leonaschaaf
Contributor III
Author

It breaks after it has loaded the columns from my Snowflake table, so it seems to break on the concatenate part.
I'm now trying to first load the Snowflake data and store that in a QVD as well, to see if that helps. Not how I want it to work (since that takes extra time), but if it works it's still a whole lot faster than loading everything directly from Snowflake.

marcus_sommer
MVP & Luminary

With just this information it's difficult to guess what happens, but the most likely explanation is still that you really run out of RAM. Did you monitor RAM and CPU during the load?

Besides this, are you sure that your incremental approach worked as expected? Especially when querying databases it can become difficult, because Qlik doesn't execute any SQL itself; it just passes the statement through the driver to the database and receives the data back the same way. This means that for a real incremental approach, the WHERE clause in the SQL must already restrict the data to the new/changed records. Any filtering within the Qlik load means that all records need to be transferred first.
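As a sketch of that point: the restriction has to live in the SQL statement, not in the Qlik LOAD's Where clause (field and table names are illustrative only):

```qlik
// Find the newest record already stored in the QVD
MaxDate:
LOAD Max(OrderDate) AS MaxOrderDate
FROM [lib://DataFiles/Facts_static.qvd] (qvd);

LET vMaxDate = Peek('MaxOrderDate', 0, 'MaxDate');
DROP TABLE MaxDate;

// The WHERE below is executed by Snowflake, so only new rows
// cross the wire to Qlik
Facts:
LOAD * FROM [lib://DataFiles/Facts_static.qvd] (qvd);

Concatenate (Facts)
LOAD OrderID, OrderDate, Amount;
SQL SELECT ORDERID, ORDERDATE, AMOUNT
FROM MYDB.PUBLIC.ORDERS
WHERE ORDERDATE > '$(vMaxDate)';
```

Putting the same condition in a Qlik-side `Where` instead would force the driver to transfer the entire table before any filtering happens.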

- Marcus