Anonymous
Not applicable

GC Overhead limit exceeded on server

Hi,

I am loading around 7.5 million records from a database, and after transformation the record count doubles to 15 million. During execution my job fails with the error "GC overhead limit exceeded on server". I have set Xmx to 10240 MB and Xms to 1024 MB. Please find my job design below.

 

tMSSQLInput --> tJavaRow --> tExtractJSONFields --> tMap --> tDenormalize --> tJavaRow --> tMSSQLOutput

 

At tExtractJSONFields the data doubles, and at tDenormalize I merge two records into one. I am getting the error at tDenormalize. Is there a better approach to this flow? tDenormalize holds the complete dataset in memory and only then passes records on one by one.
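For illustration, the memory issue comes from buffering every row before emitting any output. A minimal sketch of the alternative, assuming the rows arrive sorted by the merge key (class and method names here are hypothetical, not Talend components): merge each adjacent pair as soon as both halves arrive, so only one pending row is held at a time.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Hypothetical sketch of a streaming pairwise merge: instead of
// buffering all 15 million rows (as tDenormalize does), keep at most
// one pending row and emit a merged row as soon as its partner arrives.
// Assumes input is sorted by key; rec[0] = key, rec[1] = value.
public class StreamingMerge {
    public static List<String> mergePairs(Iterator<String[]> sorted) {
        List<String> out = new ArrayList<>();
        String pendingKey = null;
        String pendingValue = null;
        while (sorted.hasNext()) {
            String[] rec = sorted.next();
            if (rec[0].equals(pendingKey)) {
                // Partner found: emit the merged row and clear the buffer.
                out.add(pendingKey + ":" + pendingValue + "|" + rec[1]);
                pendingKey = null;
                pendingValue = null;
            } else {
                // Flush any unmatched pending row, then buffer this one.
                if (pendingKey != null) {
                    out.add(pendingKey + ":" + pendingValue);
                }
                pendingKey = rec[0];
                pendingValue = rec[1];
            }
        }
        if (pendingKey != null) {
            out.add(pendingKey + ":" + pendingValue);
        }
        return out;
    }
}
```

In a real job this would mean sorting at the database (ORDER BY on the merge key) and replacing the buffering component with custom streaming logic, e.g. in a tJavaFlex.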

Best Regards,

 

Abhishek

 

2 Replies
Anonymous
Not applicable
Author

Hello,

Could you please try with 4096 MB or more, if you have that memory free on the Talend machine?

Best regards

Sabrina

 

Anonymous
Not applicable
Author

Hi xdishi,
Xmx is already 10240 MB, more than twice 4096.
Should I increase the Xms value instead? I don't think increasing memory is a permanent solution, because in our case the record count can grow to 40-50 million, and then we would need to increase the memory again.
What I am looking for is a permanent solution. Can we process the records in batches, i.e. read them in chunks instead of all at once?
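Reading in batches is usually done by paging the source query so that only one page of rows is in memory at a time. A minimal sketch, assuming the SQL Server source table has a stable ordering column (here "id"; the table and column names are placeholders, and in Talend this query would go into the tMSSQLInput component rather than Java code):

```java
// Hypothetical sketch: build a paged query using SQL Server's
// OFFSET ... FETCH NEXT syntax, which requires an ORDER BY clause.
// The job would loop, advancing the offset by batchSize each
// iteration until a page comes back empty.
public class BatchPager {
    public static String pageQuery(String table, String orderCol,
                                   long offset, int batchSize) {
        return "SELECT * FROM " + table
             + " ORDER BY " + orderCol
             + " OFFSET " + offset + " ROWS FETCH NEXT " + batchSize + " ROWS ONLY";
    }
}
```

In Talend this pattern is typically wired up with a loop component driving the input, with the offset passed in via context variables; combined with a cursor/fetch size on the JDBC connection, it keeps the heap bounded regardless of total row count.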

Best Regards,

Abhishek