Hello,
I get an OutOfMemoryError exception when processing a table with 1,000,000 rows. Could you help me?
The configuration of my job is: tOracleInput -> tMap -> tOracleOutput
I am new to TOS, and sorry for my English because I am French.
Thank you
Guen
There can be many reasons for memory issues. Try increasing -Xmx to 4g or more, depending on your available RAM.
Refer to the links below for more info.
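For example (just a sketch, with placeholder values you should adjust to your RAM and Studio version), you can set the JVM arguments in the Run view, Advanced settings tab:

    -Xms1024M
    -Xmx4096M

If you run an exported job, the same flags can be edited in its launcher script, e.g. java -Xms1024M -Xmx4096M -cp ... your_project.your_job_0_1.your_job (the class and path names here are placeholders, not your actual job).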
You have a couple of choices for solving this. The first is to focus on the memory of the tMap (https://help.talend.com/reader/EJfmjmfWqXUp5sadUwoGBA/J4xg5kxhK1afr7i7rFA65w). The second is to adjust the memory assigned to the job (https://help.talend.com/reader/93olCfmQi615MRwYBjy30g/chVspuZjtzXb9Q9HRZKjtw).
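To illustrate why a tMap can exhaust the heap when a lookup input is large (this is only a simplified sketch in plain Java, not the code Talend actually generates): by default all lookup rows are held in an in-memory map, so a lookup table of around 1,000,000 rows can blow past the default heap unless you either raise -Xmx or enable the tMap option to store temp data on disk.

import java.util.HashMap;
import java.util.Map;

// Simplified illustration (not Talend's generated code): a tMap lookup
// keeps every lookup row in memory, keyed by the join column.
public class LookupSketch {
    public static void main(String[] args) {
        Map<Long, String> lookup = new HashMap<>();
        for (long id = 0; id < 1_000_000L; id++) {
            // With ~1,000,000 entries this map alone can use well over 100 MB of heap,
            // which is why more heap (-Xmx) or on-disk temp data is needed.
            lookup.put(id, "row-" + id);
        }
        System.out.println("Loaded " + lookup.size() + " lookup rows into memory");
    }
}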
Hello @rhall_2_0, thank you for your help.
I opted for the first choice: I focused on the memory of the tMap and did what was asked. Now I have this error, and I do not know how to fix it.
This is strange. When you were having memory issues, I assume the job was running and got through a number of rows before it failed? Does the job run at all now, or is this error shown straight away? Also, could you show a screenshot of your job?
By the way, thanks for the screenshot of the errors, but it would be much easier to work with if you included errors as text. You can copy and paste from the output window 🙂
Yes, the job went through a certain number of rows before it failed. It blocks while loading "STG_OT". Here is the screenshot of my job. Thank you
OK, but does it just error straight away now, or does it error while it is loading data from STG_OT? Also, can you run it again and copy (as text) the whole error stack?