Anonymous
Not applicable

tMap Out of Memory with Small Lookup Table?

I have 200K rows (15 columns) coming from HBase in my Main row, and only 6K rows (5 columns) from HBase in my Lookup row, yet I'm getting an Out of Memory error in my job. I was under the impression that only the 6K rows from my lookup would be stored in memory. Is this correct?
I have Join Model set to Inner Join, Match Model set to Unique match, Lookup model set to Load Once, and Store Temp data set to false. I have two output rows, one being Catch Inner Join Rejects.
I know that the common advice in this situation is to set Store Temp Data to true, but I'm confused as to why that is necessary given that I have only 6000 rows (with 5 columns) as my lookup. Moreover, I did try it out, and I still received this error.
Also, in a previous job I pull the same 200K rows from HBase without any memory errors. The big difference is that the previous job has no tMap, whereas this one does.
Does setting the Match Model to Unique Match cause an increased burden on the memory relative to All Matches? My join key is guaranteed to be 1:1, so Unique Match or All Matches would make no difference in my output.
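The in-memory lookup model the question assumes can be sketched in plain Java (Talend jobs compile to Java). This is a minimal, hypothetical illustration, not tMap's generated code: the small lookup flow is indexed into a hash map (one entry per key, like "Unique match"), and the main flow is streamed against it row by row.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of a tMap-style inner join: only the lookup rows are held in
// memory; the main flow is processed one row at a time.
// Row layouts here are hypothetical (row[0] = join key, row[1] = payload).
public class LookupJoinSketch {

    // Index the (small) lookup flow: one entry per key, mirroring the
    // "Unique match" model from the post.
    static Map<String, String> indexLookup(List<String[]> lookupRows) {
        Map<String, String> index = new HashMap<>();
        for (String[] row : lookupRows) {
            index.put(row[0], row[1]); // key -> lookup payload
        }
        return index;
    }

    // Stream the main flow; only matched rows are emitted (inner join).
    // Unmatched rows would go to a "Catch Inner Join Rejects" output.
    static List<String> innerJoin(List<String[]> mainRows, Map<String, String> index) {
        List<String> out = new ArrayList<>();
        for (String[] row : mainRows) {
            String match = index.get(row[0]);
            if (match != null) {
                out.add(row[0] + "," + row[1] + "," + match);
            }
        }
        return out;
    }
}
```

Under this model, 6K lookup rows with 5 columns would indeed be a tiny index, which is why the error is surprising; the reply below points at where the memory actually goes.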
1 Reply
Anonymous
Not applicable
Author

Hi
By default, all of the data (both the main and the lookup flows) is read into memory. For an OutOfMemory error, the usual fixes are to optimize the job design and to allocate more memory to the job execution; please refer to this KB article, and let me know if you still have the problem.
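Before raising the job's memory, it can help to see how much heap the JVM actually has; a Talend job runs on the JVM, so a default or small `-Xmx` cap can trigger OutOfMemory well before 200K rows should. This is a generic JVM check, not Talend-specific code:

```java
// Print the JVM's maximum heap. If this number is small, the heap cap
// (typically set via -Xms/-Xmx JVM arguments in the job's run
// configuration) is a likely culprit for the OutOfMemory error.
public class HeapCheck {
    public static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("Max heap: " + maxHeapMb() + " MB");
    }
}
```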
Regards
Shong