
talendsai
Contributor

Getting java.lang.OutOfMemoryError: Java heap space error

Hi All,
I am getting a java.lang.OutOfMemoryError: Java heap space error when I run the job below.

My job only reads data from an Oracle database and loads it into Redshift without any transformations, but it has to move 26 million records; the job aborts with the heap space error after moving about 5 million.
I have set the JVM arguments -Xmx1024M and -Xms256M as shown in the screenshot above, but I still get the same error.
Our Runtime and TAC servers run 64-bit Ubuntu with 8 GB of memory.
Can someone explain the reason for the heap space error, since I am not using any transformations or other components that buffer data in memory, and suggest solutions to fix this issue?
Here is some more info on the Redshift configuration:
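A common cause of this error is the source or target side accumulating rows in heap instead of streaming them, so memory usage grows with row count rather than staying flat. The sketch below is a minimal, self-contained illustration of the fixed-size-batch pattern that avoids this; the row source, the 10,000-row batch size, and the flush are hypothetical stand-ins for illustration, not actual Talend components:

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class BatchDemo {
    // Simulates a database cursor: rows are produced lazily,
    // so the full result set is never held in heap at once.
    static Iterator<String> rowCursor(int totalRows) {
        return new Iterator<>() {
            int i = 0;
            public boolean hasNext() { return i < totalRows; }
            public String next() { return "row-" + i++; }
        };
    }

    // Copies rows in fixed-size batches; returns the number of flushes.
    public static int copyInBatches(int totalRows, int batchSize) {
        Iterator<String> cursor = rowCursor(totalRows);
        List<String> batch = new ArrayList<>(batchSize);
        int flushes = 0;
        while (cursor.hasNext()) {
            batch.add(cursor.next());
            if (batch.size() == batchSize) {
                batch.clear();   // stand-in for a commit/flush to the target
                flushes++;
            }
        }
        if (!batch.isEmpty()) {
            batch.clear();
            flushes++;
        }
        return flushes;
    }

    public static void main(String[] args) {
        // 26 million rows in 10,000-row batches: memory stays roughly constant.
        System.out.println(copyInBatches(26_000_000, 10_000)); // prints 2600
    }
}
```

With a cursor/fetch size enabled on the Oracle input and a commit interval on the Redshift output, the job should only ever hold one batch in memory at a time, regardless of the total row count.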

Thanks & Regards,
SaiSridhar N
3 Replies
Anonymous
Not applicable

Hi,
Thank you for your post! We can't see the screenshot on our side. Could you attach it to the forum, please? Would you also mind posting screenshots of your current job design?
Best regards
Sabrina
talendsai
Contributor
Author

Hi Sabrina,
Attached are the screenshots of the job design and configurations.
After increasing the JVM size and enabling the cursor option on the source database, the job loaded around 11 million records and then failed again.
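One thing worth verifying is whether the -Xmx setting is actually reaching the job's own JVM, since TAC, the Studio, and the deployed job each take their own JVM arguments. This generic check (plain java.lang.Runtime, not a Talend feature) prints the heap ceiling the running process was really given:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the heap ceiling (roughly the -Xmx value)
        // that this particular JVM process will attempt to use.
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap available to this JVM: " + maxMb + " MB");
    }
}
```

If this prints a number far below what was configured, the argument is being applied to the wrong process; on an 8 GB server there is also room to raise -Xmx well beyond 1024M.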
Anonymous
Not applicable

Hi,
Sorry for the delay!
Could you please indicate the build version you are using?
Best regards
Sabrina