Anonymous
Not applicable

GC out of memory error

Hi,

 

I created multiple outputs, and after running the job I got an out of memory error. I have configured my JVM arguments to 1024-9162. I hope I can find a permanent solution for this error. Another thing: when I run multiple outputs in one job, it is always very slow.

 

Thanks

6 Replies
Anonymous
Not applicable
Author

@szhou1 

 

Make sure you go into the Run window in the Designer view and change your Advanced Settings (JVM Settings) based on the memory configuration of your machine. The default is -Xms256M -Xmx1024M. I have 8 GB of RAM on my machine, so my settings are in the screenshot below.

 


[screenshot of the JVM settings in the Run tab]
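For reference only, purely illustrative numbers rather than the values from the screenshot or an official recommendation: on an 8 GB machine the JVM arguments often end up somewhere in this range, leaving headroom for the OS and anything else running on the box:

-Xms1024M
-Xmx4096M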

Anonymous
Not applicable
Author

You can find the right values online if you search for JVM settings for Talend based on memory. You can also split the input file into multiple files. If you're using a lookup, make sure you're only bringing in the lookup values you need, and make sure you filter any input data before bringing it into a tMap component.

 

 

You can store data in a tHashOutput (in-memory table) if your lookup is quite large.
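Conceptually, the in-memory lookup behaves roughly like the plain Java sketch below. This is not the code Talend generates, and the keys and field names are invented; it just shows why trimming the lookup to the key/value pairs you actually need before loading it matters for memory:

import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the idea behind an in-memory lookup table
// (what a hash-based lookup gives you inside a job). Names are made up.
public class LookupSketch {
    public static void main(String[] args) {
        // Load only the lookup columns you actually need, and only the rows
        // that can possibly match, before the join happens.
        Map<String, String> lookup = new HashMap<>();
        lookup.put("CUST-001", "Acme Corp");   // key -> the one value the join needs
        lookup.put("CUST-002", "Globex");

        // Main flow: each incoming row is matched against the in-memory table.
        String customerId = "CUST-001";
        String customerName = lookup.getOrDefault(customerId, "UNKNOWN");
        System.out.println(customerId + " -> " + customerName);
    }
}

Every extra column or unfiltered row in the lookup sits in that map for the whole run, which is where the memory goes.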

 

Let me know if this helps

 

Andrew

Anonymous
Not applicable
Author

@szhou1 May I know how many rows you have in the source, and how many in the lookups?

JBristow
Creator

The issue can be addressed, as many have responded, with JVM settings appropriate for the job. This can be done in the job itself while running in the Studio by selecting the "Run" tab and applying the appropriate memory allocation, and you can always apply those settings in the TAC when deploying the job.

 

I've also had to consider memory issues when designing my data flow, based on the volume of data being processed. I have one job where I was forced to split the input file into smaller units as a first step, and then iterate through those smaller units.
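A rough sketch of that splitting step in plain Java (file names and chunk size are made up; this isn't the actual job, just the idea of cutting the input down before processing it):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Split a large input file into fixed-size chunks, then process each chunk
// in its own iteration so only one chunk's worth of rows is live at a time.
public class SplitFileSketch {
    public static void main(String[] args) throws IOException {
        Path input = Paths.get("big_input.csv");   // hypothetical large source file
        int linesPerChunk = 50_000;                 // arbitrary chunk size

        int chunkNo = 0;
        int lineInChunk = 0;
        BufferedWriter out = null;
        try (BufferedReader in = Files.newBufferedReader(input)) {
            String line;
            while ((line = in.readLine()) != null) {
                // Open a new chunk file when the current one is full (or on the first row).
                if (out == null || lineInChunk == linesPerChunk) {
                    if (out != null) out.close();
                    out = Files.newBufferedWriter(Paths.get("chunk_" + (chunkNo++) + ".csv"));
                    lineInChunk = 0;
                }
                out.write(line);
                out.newLine();
                lineInChunk++;
            }
        } finally {
            if (out != null) out.close();
        }
        // Each chunk_N.csv is then fed through the job one at a time.
    }
}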

 

So basically, there's no "one solution" to this, because it depends on a number of factors; but you have options to manage it.

 

Good luck!

Anonymous
Not applicable
Author

Main is around 200,000 rows.
Lookup is around 400,000 rows.

Thanks
Anonymous
Not applicable
Author

1) Verify the RAM on the TAC server; this is important because when running on TAC you may be running multiple jobs at once, so check it to make sure this job isn't affected by the others.

2) Another option: at tMap you can generate intermediate files and then read from those files as input; you can follow the link below (a rough sketch of the idea is shown after this list).

https://community.talend.com/t5/Design-and-Development/GC-overhead-limit-error-when-running-Jobs-in-...

3) If you use tHash components, they can consume more memory within the job.
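A rough sketch of the intermediate-file idea from point 2 in plain Java (file names are made up and this is not Talend-generated code): spill rows to a temporary file as they are produced, then stream the file back instead of keeping everything in memory.

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Write intermediate rows to disk, then read them back as the next step's input,
// so the whole intermediate dataset never has to sit in memory at once.
public class IntermediateFileSketch {
    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("intermediate", ".csv");

        // Step 1: write transformed rows out as they are produced.
        try (BufferedWriter out = Files.newBufferedWriter(tmp)) {
            for (int i = 0; i < 3; i++) {
                out.write("row-" + i + ";some-transformed-value");
                out.newLine();
            }
        }

        // Step 2: a later step streams the file back, one row at a time.
        try (BufferedReader in = Files.newBufferedReader(tmp)) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println("processing " + line);
            }
        }
        Files.deleteIfExists(tmp);
    }
}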

 

 

These memory issues can come in many forms.