Anonymous
Not applicable

tDenormalize causes OutOfMemory in combination with large files.

Hello All,
I have created a DI Job that loads 4 million rows from an ASCII text file. The data flows through tMap and then on to a tDenormalize component. The memory consumption of this job is extraordinarily high. I created a Java heap dump and can see one culprit among others: the tDenormalize component, or rather its objects, occupies a huge share of the heap. Does anyone have an idea how to avoid tDenormalize or make it more efficient? The number of HashMap objects is also very high, and these actually take up most of the heap.
(screenshots of the heap dump analysis attached)
Is it true that Talend makes no attempt to save memory and instead assumes an unlimited amount is available?
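One idea I am considering is to sort the input on the group key first (for example with tSortRow) and then denormalize it in a streaming way, so that only one group has to be held in memory at a time. A rough, untested sketch in plain Java; the file names and the simplified key;value layout are placeholders, not my real schema:

// Streaming denormalize over an input that is already sorted on the group key.
// Each input line is assumed to be well-formed as "key;value"; values of the
// same key are concatenated with "," into a single output line per key.
import java.io.*;

public class StreamingDenormalize {
    public static void main(String[] args) throws IOException {
        try (BufferedReader in = new BufferedReader(new FileReader("sorted_input.txt"));
             PrintWriter out = new PrintWriter(new FileWriter("denormalized.txt"))) {
            String currentKey = null;
            StringBuilder values = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                String[] cols = line.split(";", 2); // cols[0] = key, cols[1] = value
                if (currentKey != null && !currentKey.equals(cols[0])) {
                    // Key changed: the previous group is complete, write it out.
                    out.println(currentKey + ";" + values);
                    values.setLength(0);
                }
                if (values.length() > 0) {
                    values.append(",");
                }
                values.append(cols[1]);
                currentKey = cols[0];
            }
            // Flush the last group.
            if (currentKey != null) {
                out.println(currentKey + ";" + values);
            }
        }
    }
}

This way the heap only has to hold the values of the current group instead of a HashMap over all 4 million rows, but it obviously depends on being able to sort the input first.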
Kind regards
Hilderich
2 Replies
chafer
Contributor

I know that if the "lookup" files are large, like thousands of rows, those values will be placed into memory. You may have done this already, but if not, increase the JVM memory for the job. This can be changed on the "Run" tab: select "Advanced settings" on the left, check "Use specific JVM arguments", and double-click each argument to raise the memory the job is allowed to use. The maximum heap is the bottom one; I usually cap it at 4 GB, e.g. "-Xmx4096M".
(screenshot of the Advanced settings panel attached)
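As a quick sanity check that the new arguments are actually picked up at runtime, a small snippet like this (my own sketch; you could drop it into a tJava component, for example) prints the effective heap limits:

// Prints the heap limits the JVM was actually started with, so you can
// confirm the -Xms / -Xmx values from the Advanced settings took effect.
public class HeapCheck {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        long currentMb = Runtime.getRuntime().totalMemory() / (1024 * 1024);
        System.out.println("Max heap (-Xmx): ~" + maxMb + " MB");
        System.out.println("Current heap size: ~" + currentMb + " MB");
    }
}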
Anonymous
Not applicable
Author

Hi,
Have you checked the KB article TalendHelpCenter:ExceptionOutOfMemory?
Best regards
Sabrina