Hi,
I created multiple outputs, and when I ran the job I got an out of memory error. I have configured my JVM arguments to 1024-9162. I hope I can find a permanent solution for this error. Another thing: when I run multiple outputs in one job, it is always very slow.
Thanks
Make sure you go into the Run view in the Designer and change your Advanced Settings (JVM Settings) based on the memory configuration of your machine. The default is -Xms256M -Xmx1024M. I have 8 GB of RAM on my machine, so my settings are in the screenshot below.
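For reference, the JVM arguments take the familiar heap flags; on an 8 GB machine something like the following is common (the exact values below are illustrative, not a recommendation — leave headroom for the OS and other jobs):

```
-Xms1024M
-Xmx4096M
```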
You can find the right values online if you search for Talend JVM settings based on memory. You can also split the input file into multiple files. If you're using a lookup, make sure you're only bringing in the lookup values you need, and filter any input data before it reaches a tMap component.
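Splitting the input file can also be done outside the job before processing. A minimal Java sketch of the idea (the class and part-file naming are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FileSplitter {
    /** Split a text file into chunks of at most maxLines lines each,
        writing part files next to the input. Each chunk can then be
        fed through the job separately, keeping heap usage bounded. */
    static List<Path> split(Path input, int maxLines) throws IOException {
        List<Path> parts = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(input)) {
            BufferedWriter writer = null;
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                if (writer == null || count == maxLines) {
                    if (writer != null) writer.close();
                    // part files named input.partN, N = 0, 1, 2, ...
                    Path part = input.resolveSibling(
                            input.getFileName() + ".part" + parts.size());
                    writer = Files.newBufferedWriter(part);
                    parts.add(part);
                    count = 0;
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) writer.close();
        }
        return parts;
    }
}
```

Iterating over the resulting part files is then a matter of running the same flow once per chunk.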
You can store data in a tHashOutput (an in-memory table) if your lookup is quite large.
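Conceptually, a tHashOutput builds a hash table on the JVM heap keyed on the join column. A sketch of that idea in plain Java (class and column layout are made up for illustration) — note that trimming to only the columns you need is exactly why the filtering advice above matters:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class HashLookup {
    /** Build an in-memory lookup table, keeping only the columns the
        join actually needs. row[0] is the join key, row[1] the single
        value required downstream; dropping unused columns here is what
        keeps heap usage down. */
    static Map<String, String> buildLookup(List<String[]> rows) {
        Map<String, String> lookup = new HashMap<>(rows.size() * 2);
        for (String[] row : rows) {
            lookup.put(row[0], row[1]);
        }
        return lookup;
    }
}
```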
Let me know if this helps
Andrew
@szhou1 May I know how many rows you have in the source, and how many in the lookups?
I've also had to consider memory issues when designing my data flows, based on the volume of data being processed. I have one job where I was forced to split the input file into smaller units as a first step, and then iterate through those smaller units.
So basically - there's no "one solution" to this - because it depends on a number of factors; but you have options to manage it.
Good luck!
1) Verify the RAM on the TAC server; this is important, because on TAC you may be running multiple jobs at the same time, so check it so they don't affect each other.
2) Another option is to have tMap generate intermediate files (store temp data on disk), as in the link below, and then use that file as input.
3) If you use a tHash component, it can consume a lot of memory within the job.
These memory issues can come in many forms.
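The intermediate-file option in point 2 boils down to spilling rows to disk and streaming them back, instead of holding everything on the heap. A rough Java sketch of that pattern (class and file names are made up for illustration, not Talend's actual implementation):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class TempSpill {
    /** Spill rows to a temp file on disk. */
    static Path spill(List<String> rows) throws IOException {
        Path tmp = Files.createTempFile("tmap_spill", ".tmp");
        try (BufferedWriter w = Files.newBufferedWriter(tmp)) {
            for (String row : rows) {
                w.write(row);
                w.newLine();
            }
        }
        return tmp;
    }

    /** Stream the spilled rows back one at a time; only a single
        row is ever on the heap, regardless of file size. */
    static long streamCount(Path tmp) throws IOException {
        long n = 0;
        try (BufferedReader r = Files.newBufferedReader(tmp)) {
            while (r.readLine() != null) {
                n++; // process one row at a time here
            }
        }
        return n;
    }
}
```

The trade-off is the usual one: disk I/O instead of heap, so the job runs slower but no longer risks an out of memory error.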