
java.lang.OutOfMemoryError: GC overhead limit exceeded
Hi, I am trying to load data from a Sybase input whose SELECT query performs a join and fetches around 6 million records. Using tMap, I am trying to load them into a Sybase table via tSybaseOutput. The job gets stuck after just 1 row; when I increased the batch size it ran up to about 100,000 records and then failed with this error:
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.lang.StringCoding$StringDecoder.decode(Unknown Source)
at java.lang.StringCoding.decode(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at com.sybase.jdbc3.utils.PureConverter.toUnicode(Unknown Source)
Please suggest a solution.
Accepted Solutions

Have you selected the Die on error option in your tSybaseOutput component?
The batch size is used for committing the records; if you do not provide a value, the default commit size is used.
Given your huge data volume, why are you not using the Bulk components to load the data?
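To illustrate what the batch size controls, here is a minimal, self-contained sketch of the batch-commit pattern that tSybaseOutput applies through JDBC. The `flush` method and counters are hypothetical stand-ins for `Statement.executeBatch()` plus a commit against the real table; the point is that rows are accumulated and flushed every `batchSize` records rather than held in memory for the whole run.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchCommitSketch {
    static int commits = 0; // hypothetical counter standing in for real commits

    // Stand-in for Statement.executeBatch() followed by Connection.commit().
    static void flush(List<String> batch) {
        commits++;
        batch.clear(); // releasing the batch is what keeps heap usage flat
    }

    public static void main(String[] args) {
        int batchSize = 10_000;
        int totalRows = 100_000;
        List<String> batch = new ArrayList<>();
        for (int i = 0; i < totalRows; i++) {
            batch.add("row-" + i);
            if (batch.size() == batchSize) {
                flush(batch); // commit every batchSize rows
            }
        }
        if (!batch.isEmpty()) {
            flush(batch); // final partial batch
        }
        System.out.println("commits=" + commits);
    }
}
```

With 100,000 rows and a batch size of 10,000 this performs 10 commits; a bulk component sidesteps this row-by-row path entirely, which is why it scales better for millions of records.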
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂

@rahuljan, you have a JVM memory issue, so you need to increase the JVM heap properties. Check the link below.
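For reference, Talend jobs accept the standard JVM heap flags in the Run view under Advanced settings > JVM arguments. The values below are only example sizes, not recommendations; size them to the memory available on your job server:

```text
-Xms1024m
-Xmx4096m
```

`-Xms` sets the initial heap and `-Xmx` the maximum heap; the "GC overhead limit exceeded" error means the collector is spending almost all its time reclaiming a nearly full heap, so raising `-Xmx` is usually the first thing to try.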

Hi,
My 2 cents would be to use disk space to store the temporary data in tMap.
Please refer to the link below for details.
https://help.talend.com/reader/EJfmjmfWqXUp5sadUwoGBA/J4xg5kxhK1afr7i7rFA65w
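To show the idea behind tMap's "Store temp data" option, here is a minimal sketch (the file layout and row format are my own illustration, not Talend's internal format): instead of keeping every lookup row in an in-memory list, which is what exhausts the heap, rows are spilled to a temporary file and streamed back on demand.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SpillToDiskSketch {
    public static void main(String[] args) throws IOException {
        // Write lookup rows to disk instead of holding them on the heap.
        Path tmp = Files.createTempFile("lookup", ".tmp");
        try (BufferedWriter w = Files.newBufferedWriter(tmp)) {
            for (int i = 0; i < 100_000; i++) {
                w.write("key" + i + "\tvalue" + i);
                w.newLine();
            }
        }
        // Stream the rows back; only one line is in memory at a time.
        long rows;
        try (BufferedReader r = Files.newBufferedReader(tmp)) {
            rows = r.lines().count();
        }
        Files.delete(tmp);
        System.out.println("rows=" + rows);
    }
}
```

The trade-off is obvious but worth stating: disk-backed lookups are slower than in-memory ones, but they let a join over millions of rows complete within a modest heap.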
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂


@rahuljan, could you please let me know what values you are using for the JVM parameters -Xms and -Xmx?

@manodwhb @nthampi I am using -Xms256m and -Xmx1024m. The real problem is that when I enable a batch size of 10000 in the output component, the job stops working after reading 10,000 records, and if I disable it, it works until about 100,000 records. Why is the batch size restricting records after 10K, any idea?

