Soumya_M
Contributor II

[resolved] java.lang.OutOfMemoryError: Java heap space

Hi all,

My Talend job looks like this: tPostgresqlInput - tMap - tPostgresqlOutput

The input table has 6 million+ records; I need to extract 2 columns from the input table and send them to the output table.

When I run the job, I get an error like this:

Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "Thread-0"

Exception in thread "main" java.lang.OutOfMemoryError: Java heap space

This happens even though I've set the JVM settings -Xms and -Xmx to 4096m and 8192m respectively.

Can anyone please help me rectify this error?

6 Replies
Anonymous
Not applicable

Hello,

Please have a look at this KB article about the OutOfMemory exception:

https://community.talend.com/s/article/OutOfMemory-Exception-WmtmQ

Hope it will be helpful for your use case.

Best regards

Sabrina

Amanda2569
Contributor

The solution worked for me. Thanks to the community and its members for the solution.

 

Anonymous
Not applicable

Hello,

Great that it helps.

Feel free to let us know if there is any further help we can give.

Best regards

Sabrina

Soumya_M
Contributor II
Author

Thank you!!

that was helpful 🙂

Anonymous
Not applicable

Hello,

 

I've seen this error many times when using an existing connection with auto-commit enabled.

Try the following:

  • Make sure the fetch size is enabled under the Advanced settings.
  • Use a built-in connection, or an existing connection without auto-commit (see the JDBC sketch below).
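
For context: with the PostgreSQL JDBC driver, the fetch size only takes effect (rows are streamed through a cursor instead of being loaded all at once) when auto-commit is off, which is why the two points above go together. A minimal plain-JDBC sketch of that behaviour, with placeholder connection details and table/column names:

import java.sql.*;

public class CursorFetchSketch {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection details -- adjust for your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "secret")) {

            // With auto-commit ON, the PostgreSQL driver ignores the fetch size
            // and materialises the whole result set in the heap.
            conn.setAutoCommit(false);

            try (Statement stmt = conn.createStatement()) {
                stmt.setFetchSize(10_000); // stream rows in batches of 10k via a cursor

                try (ResultSet rs = stmt.executeQuery(
                        "SELECT col_a, col_b FROM big_table")) {
                    while (rs.next()) {
                        // handle one row at a time instead of holding 6M+ rows in memory
                    }
                }
            }
            conn.commit();
        }
    }
}
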
Jackson0123
Contributor

The java.lang.OutOfMemoryError: Java heap space error tells you that the Talend job you described is trying to use more memory than the JVM heap can provide, even though you have set higher values with -Xms4096m and -Xmx8192m. This often happens in data integration jobs that process large datasets, as in your case with 6 million+ records, especially when components like tMap are configured to load entire datasets into memory.

Here are a few tuning steps that might help you resolve the OutOfMemoryError you are facing:

Try enabling row-by-row processing wherever possible instead of loading all data at once. In tMap, disable the option “Load lookup flow before starting” unless it is necessary.

If possible, process data in smaller chunks using LIMIT and OFFSET queries in your tPostgresqlInput (see the sketch below).
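
As a rough illustration of the chunking idea, here is a plain-JDBC sketch; the connection details, table, columns and chunk size are placeholders, and in the Studio you would typically drive the same pattern with context variables in the tPostgresqlInput query:

import java.sql.*;

public class ChunkedReadSketch {
    private static final int CHUNK_SIZE = 100_000; // illustrative page size

    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "secret");
             PreparedStatement ps = conn.prepareStatement(
                 // ORDER BY a stable key so pages do not overlap or skip rows
                 "SELECT col_a, col_b FROM big_table ORDER BY id LIMIT ? OFFSET ?")) {

            for (long offset = 0; ; offset += CHUNK_SIZE) {
                ps.setInt(1, CHUNK_SIZE);
                ps.setLong(2, offset);

                int rowsInChunk = 0;
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        rowsInChunk++;
                        // write col_a / col_b to the output table here
                    }
                }
                if (rowsInChunk < CHUNK_SIZE) {
                    break; // last (possibly partial) chunk processed
                }
            }
        }
    }
}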

Review the buffer size and memory settings in Talend Studio preferences. Increase the JVM memory there too if required.

Ensure unnecessary columns are not selected or carried through your flow — only fetch and map what’s strictly needed.

If you’re using lookup tables, switch the lookup model from “Load once” to “Reload at each row” if the lookup is large.

Lastly, monitor the job execution via a memory profiler or Talend’s built-in monitoring tools to pinpoint any component causing excessive memory use.
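
If a full profiler is overkill, even a small logging snippet dropped into a tJava component at key points of the job can show how heap usage grows; the lines below are only an illustrative sketch, not a Talend-specific API:

// Log current heap usage; could be placed in a tJava component between steps of the job.
Runtime rt = Runtime.getRuntime();
long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
long maxMb  = rt.maxMemory() / (1024 * 1024);
System.out.println("Heap used: " + usedMb + " MB of max " + maxMb + " MB");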


Implementing these changes should help you prevent the OutOfMemoryError and allow your job to run smoothly with large datasets.