Anonymous
Not applicable

java.lang.OutOfMemoryError: GC overhead limit exceeded

Hi, I am trying to load data with a tSybaseInput component whose SELECT query does a join and fetches around 6 million records, then pass the flow through a tMap and write it to a Sybase table with tSybaseOutput. The job cannot load the data: it gets stuck at just one row, and when I increased the batch size it ran up to about 100,000 (1 lakh) records and then failed with this error:

Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.lang.StringCoding$StringDecoder.decode(Unknown Source)
at java.lang.StringCoding.decode(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at com.sybase.jdbc3.utils.PureConverter.toUnicode(Unknown Source)

 

Please suggest a solution.

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

@rahuljan 

 

Have you selected the Die on error option in your tSybaseOutput component?

 

The batch size controls how records are committed; if you do not provide a value, the component falls back to the default commit size.
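
For illustration, here is a minimal JDBC sketch of what a commit-per-batch output conceptually does. It is a sketch under assumptions, not the component's generated code: the connection URL, table, and column names are placeholders.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

// Batched inserts with a commit per batch -- conceptually what the batch
// size controls. Connection details, table and column names are placeholders.
public class BatchInsertSketch {

    record Row(int id, String name) {}

    public static void main(String[] args) throws Exception {
        final int batchSize = 10_000;
        // Stands in for the rows arriving from the input flow.
        List<Row> rows = List.of(new Row(1, "a"), new Row(2, "b"));
        try (Connection con = DriverManager.getConnection(
                "jdbc:sybase:Tds:host:5000/mydb", "user", "secret")) {
            con.setAutoCommit(false);
            try (PreparedStatement ps = con.prepareStatement(
                    "INSERT INTO target_table (id, name) VALUES (?, ?)")) {
                int count = 0;
                for (Row r : rows) {
                    ps.setInt(1, r.id());
                    ps.setString(2, r.name());
                    ps.addBatch();
                    if (++count % batchSize == 0) {
                        ps.executeBatch(); // send the accumulated batch to the server
                        con.commit();      // release buffered rows instead of holding them
                    }
                }
                ps.executeBatch(); // flush the final partial batch
                con.commit();
            }
        }
    }
}

Committing every batch bounds how many rows sit in client memory and in the open transaction at once, which is why the batch/commit size matters for GC overhead errors.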

 

In your case, given the huge data volume, why are you not using the Bulk components to load the data?
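
For context, a bulk load skips row-by-row JDBC inserts: the flow is written to a flat file and the file is loaded through the server's bulk path, which is conceptually what Sybase's bcp utility and the Talend bulk components do. Below is a rough, illustrative equivalent; the database, file path, server name, and credentials are all placeholders.

// Illustrative only: load a delimited file into a Sybase table by invoking
// the bcp command-line utility. Every name below is a placeholder.
public class BcpLoadSketch {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                "bcp", "mydb..target_table", "in", "/tmp/target_table.csv",
                "-c",            // character (text) mode
                "-t", ",",       // field terminator
                "-U", "user",
                "-P", "secret",
                "-S", "SYBASE_SERVER");
        pb.inheritIO(); // surface bcp's own progress and error output
        int exit = pb.start().waitFor();
        if (exit != 0) {
            throw new IllegalStateException("bcp failed with exit code " + exit);
        }
    }
}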

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂


6 Replies
manodwhb
Champion II

@rahuljan, you have a JVM memory issue, so you need to increase the JVM memory properties. Check the link below:

 

https://community.talend.com/t5/Design-and-Development/resolved-OutOfMemoryError-GC-overhead-limit-e...
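
In Talend Studio the JVM memory is set per job under the Run tab > Advanced settings > "Use specific JVM arguments". The values below are only illustrative; a one-line tJava snippet can confirm the heap the job actually received.

// Illustrative JVM arguments -- size them to the machine running the job:
//   -Xms1024m
//   -Xmx4096m

// Paste into a tJava component to verify the configured heap:
long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
System.out.println("Max heap available to this job: " + maxMb + " MB");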

Anonymous
Not applicable
Author

Hi,

 

My 2 cents would be to use disk space to store the temporary data in tMap (the store temp data option).

 

Please refer to the link below for details.

 

https://help.talend.com/reader/EJfmjmfWqXUp5sadUwoGBA/J4xg5kxhK1afr7i7rFA65w

 

Warm Regards,
Nikhil Thampi

Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

Anonymous
Not applicable
Author

@nthampi @manodwhb Even after using the store-on-disk property, the job processes 10,000 rows and then stops. 100,000 is the default batch size set in tSybaseOutput. Please suggest.

manodwhb
Champion II

@rahuljan, could you please let me know what JVM parameters you are using for -Xms and -Xmx?

Anonymous
Not applicable
Author

@manodwhb @nthampi I am using -Xms256m and -Xmx1024m. The real problem is that when I enable a batch size of 10,000 in the output component, the job stops after reading 10,000 records, and if I disable it, it works up to 100,000 (1 lakh) records. Why is the batch size restricting records after 10K? Any idea?
