leonaschaaf
Contributor III

concatenate QVD with Snowflake data - memory error

I have a Qlik Sense app which needs to load a massive amount of data.
Originally the data was all coming from Snowflake. To make the app load faster, I decided to create a QVD from one of the tables, which holds static data.

So first I load the QVD and then I concatenate the "new" Snowflake data. They don't have the same number of columns, but that always worked fine when I just used the Snowflake tables and concatenated those.
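Simplified, the pattern looks roughly like this (the connection, table and field names are just placeholders, not my real ones):

StaticData:
LOAD *
FROM [lib://QVDs/StaticData.qvd] (qvd);   // static history, loaded from the QVD

LIB CONNECT TO 'MySnowflakeConnection';

// force the Snowflake rows into the same table although the column sets differ
Concatenate (StaticData)
LOAD *;
SQL SELECT *
FROM "MY_DB"."MY_SCHEMA"."NEW_DATA";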

But when I do it now, it's giving me the "famous" -129 memory issue.
I don't get why this happens.  When I run this in QlikView it works without issues.

Why is this happening? 

Even if I try creating a temp table out of the Snowflake table first and then do a concatenate, I'm still getting the error.
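The temp-table variant I tried looks something like this (again simplified, names are placeholders):

TmpSnowflake:
NoConcatenate
LOAD *;
SQL SELECT *
FROM "MY_DB"."MY_SCHEMA"."NEW_DATA";

Concatenate (StaticData)
LOAD * Resident TmpSnowflake;

DROP TABLE TmpSnowflake;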
Any suggestions/solutions?

3 Replies
marcus_sommer

I suggest looking into the document logs to see where it breaks and how many records were loaded. Maybe some filter didn't work as expected, or tables were not dropped correctly and the system now runs out of RAM, and/or it afterwards creates multiple synthetic keys or similar.
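As a rough sketch (table names are only examples): a trace after each load step shows in the document log how far the script gets, and intermediate tables should be dropped as soon as they are no longer needed.

LET vRows = NoOfRows('StaticData');
TRACE StaticData now has $(vRows) rows;

DROP TABLE TmpTable;   // drop intermediate tables so they don't stay in RAM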

- Marcus

leonaschaaf
Contributor III
Author

It breaks after it has loaded the columns from my Snowflake table, so it seems to break on the concatenate part.
I'm trying to first load the Snowflake data and store that in a QVD as well, to see if that helps. Not how I want it to work (since that takes extra time), but if this works it's already a whole lot faster than loading everything directly from Snowflake.
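Roughly what I'm trying now (paths and names are simplified):

SnowflakeData:
LOAD *;
SQL SELECT *
FROM "MY_DB"."MY_SCHEMA"."NEW_DATA";

STORE SnowflakeData INTO [lib://QVDs/SnowflakeData.qvd] (qvd);
DROP TABLE SnowflakeData;

// in a second step both QVDs are loaded and concatenated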

marcus_sommer

With just this information it's difficult to guess what's happening, but the most likely explanation is still that you are really running out of RAM. Did you monitor RAM + CPU during the load?

Besides this, are you sure that your incremental approach worked as expected? Especially when querying databases it can become tricky, because Qlik doesn't execute any SQL itself; it just passes the statement via the driver to the database and gets the data returned the same way. This means that for a real incremental approach, the WHERE clause in the SQL must already restrict the data to the new/changed records. Any filtering within the Qlik load means that all records need to be transferred first.
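As a rough sketch of what I mean (field, table and path names are only examples, and the date literal must be formatted the way Snowflake expects it):

MaxDate:
LOAD Max(LOAD_DATE) AS MaxLoadDate
FROM [lib://QVDs/StaticData.qvd] (qvd);

LET vMaxLoadDate = Peek('MaxLoadDate', 0, 'MaxDate');
DROP TABLE MaxDate;

NewData:
LOAD *;
SQL SELECT *
FROM "MY_DB"."MY_SCHEMA"."NEW_DATA"
WHERE LOAD_DATE > '$(vMaxLoadDate)';   // the restriction happens in Snowflake, not after the transfer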

- Marcus