Anonymous
Not applicable

java.lang.OutOfMemoryError: GC overhead limit exceeded

Hi,

 

I get an out-of-memory error when I run my job. My input has over 1 million rows, and I have set the JVM options to -Xms1024m -Xmx8192m, but it still runs out of memory.

Could you please advise me on how to deal with this error?

Thanks a lot!

[Screenshot: Capture.JPG]

4 Replies
Anonymous
Not applicable
Author

Hello,

As a first step, could you please let us know whether this article helps you?

https://community.talend.com/t5/Migration-Configuration-and/GC-overhead-limit-error-when-running-Job...

Best regards

Sabrina
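
In case the linked article becomes unavailable: for this particular error, the usual advice (which I believe is also what the article gives) is to tune the job's JVM arguments, e.g. under the Run view's Advanced settings in the Studio. The values below are examples, not prescriptions:

```
-Xms1024m
-Xmx8192m
-XX:-UseGCOverheadLimit
```

Note that -XX:-UseGCOverheadLimit only disables the check that raises this specific "GC overhead limit exceeded" error; if the job genuinely needs more memory than the heap provides, it will still fail with a plain OutOfMemoryError.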

Anonymous
Not applicable
Author

Hi,

 

I added a flag as the article said, but I still get the "out of memory" error.

One source, Workbook, has more than 1 million rows. The other one, ISV_US, has about 0.5 million rows.

[Screenshots: Capture.JPG (x3)]

Anonymous
Not applicable
Author

Hello,

In your tMap component, are you using temp-directory storage to improve performance?

For a large data set, try storing the lookup data on disk instead of in memory.
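
To illustrate the "disk instead of memory" idea in plain Java (this is not Talend-generated code, just a sketch of the pattern): streaming rows one at a time keeps heap usage flat no matter how many rows the source has, whereas loading everything into a collection first is what blows the heap.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Stream;

public class StreamVsLoad {

    // Count rows by streaming the file line by line. Only one line is
    // held in memory at a time, so heap use does not grow with file size.
    // (Loading all rows with Files.readAllLines() would hold every row
    // on the heap at once, which is what causes the OOM on large inputs.)
    static long countRowsStreaming(Path file) throws IOException {
        try (Stream<String> lines = Files.lines(file)) {
            return lines.count();
        }
    }

    public static void main(String[] args) throws IOException {
        // Small demo file; a real source could have millions of rows.
        Path tmp = Files.createTempFile("rows", ".csv");
        Files.write(tmp, List.of("a;1", "b;2", "c;3"));
        System.out.println(countRowsStreaming(tmp)); // prints 3
        Files.delete(tmp);
    }
}
```

This is effectively what tMap's temp-data-on-disk option trades for you: slower I/O in exchange for bounded memory.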

Best regards

Sabrina

 

 

David_Beaty
Specialist

Hi,

 

1/ Use the temp data store, as xdshi suggests.

2/ In the lookup configuration of the tMap (the spanner icon on the lookup flow in tMap), set "Store temp data" to true.

3/ Consider swapping the main and lookup flows around, so that the smaller dataset is the lookup and the larger one is the main.
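
Point 3 matters because in a tMap-style join it is the lookup side that gets buffered in memory, while the main side is streamed. A plain-Java sketch (illustrative names, not Talend-generated code) of why the smaller dataset should be the lookup:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class HashJoin {

    // Build the in-memory map from the SMALLER side (the lookup), then
    // stream the larger side against it row by row. With the sides from
    // this thread, the map holds ~0.5M entries instead of ~1M.
    static List<String> join(List<String[]> mainFlow, List<String[]> lookup) {
        Map<String, String> byKey = new HashMap<>();
        for (String[] row : lookup) {
            byKey.put(row[0], row[1]); // key -> lookup value
        }
        List<String> out = new ArrayList<>();
        for (String[] row : mainFlow) {       // streamed, not buffered
            String match = byKey.get(row[0]);
            if (match != null) {
                out.add(row[0] + "," + row[1] + "," + match);
            }
        }
        return out;
    }

    public static void main(String[] args) {
        List<String[]> mainFlow = List.of(
                new String[]{"k1", "a"},
                new String[]{"k2", "b"});
        List<String[]> lookup = List.of(new String[]{"k1", "X"});
        System.out.println(join(mainFlow, lookup)); // prints [k1,a,X]
    }
}
```

Only the map's size is bounded by the lookup dataset; the main flow never has to fit in memory at once, which is exactly why putting the 1M-row source on the main side helps.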

 

Thanks

 

David