Anonymous

[resolved] OutOfMemoryError: GC overhead limit exceeded in simple tMSSqlOutputBulkExec job

I'm doing a simple, three-component job: tFileInputExcel > tMap > tMSSqlOutputBulkExec.
The input file has just 11,923 rows (it was just written by another TOSDI job), and the tMap does nothing except some row mapping.
The tMSSqlOutputBulkExec uses a just-retrieved repository definition for the database table and is set to append to the SQL table.
Execution starts out slowly and slows to a crawl until it stops at row 5,287 with the following (the temporary file, mssql_data.txt, finishes with 5,206 rows written):
Exception in thread "main" java.lang.Error: java.lang.OutOfMemoryError: GC overhead limit exceeded
at masterproviderdatabase.dhpl_all_insertlicenseesintompdproviders_0_1.DHPL_All_InsertLicenseesIntoMPDProviders.tFileInputExcel_1Process(DHPL_All_InsertLicenseesIntoMPDProviders.java:3830)
at masterproviderdatabase.dhpl_all_insertlicenseesintompdproviders_0_1.DHPL_All_InsertLicenseesIntoMPDProviders.runJobInTOS(DHPL_All_InsertLicenseesIntoMPDProviders.java:4012)
at masterproviderdatabase.dhpl_all_insertlicenseesintompdproviders_0_1.DHPL_All_InsertLicenseesIntoMPDProviders.main(DHPL_All_InsertLicenseesIntoMPDProviders.java:3877)
Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
at sun.reflect.GeneratedConstructorAccessor6.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:525)
at org.apache.xmlbeans.impl.schema.SchemaTypeImpl.createUnattachedNode(SchemaTypeImpl.java:1859)
at org.apache.xmlbeans.impl.schema.SchemaTypeImpl.createElementType(SchemaTypeImpl.java:1021)
at org.apache.xmlbeans.impl.values.XmlObjectBase.create_element_user(XmlObjectBase.java:893)
at org.apache.xmlbeans.impl.store.Xobj.getUser(Xobj.java:1657)
at org.apache.xmlbeans.impl.store.Xobj.find_element_user(Xobj.java:2062)
at org.openxmlformats.schemas.spreadsheetml.x2006.main.impl.CTCellImpl.getIs(Unknown Source)
at org.apache.poi.xssf.usermodel.XSSFCell.getRichStringCellValue(XSSFCell.java:269)
at org.apache.poi.xssf.usermodel.XSSFCell.getRichStringCellValue(XSSFCell.java:64)
at masterproviderdatabase.dhpl_all_insertlicenseesintompdproviders_0_1.DHPL_All_InsertLicenseesIntoMPDProviders.tFileInputExcel_1Process(DHPL_All_InsertLicenseesIntoMPDProviders.java:2457)
... 2 more
I've done much bigger SQL outputs than this. I tried restarting Windows (7) to clear any cobwebs, but no difference.
Any suggestions? Thanks!
UPDATE: I replaced the tMSSqlOutputBulkExec and then the tMap with a tLogRow, disconnecting components until only the tFileInputExcel and the tLogRow were left. It still fails the same way!
UPDATE: On the Advanced settings tab of the tFileInputExcel I found the Generation mode field and set it to "Less memory consumed . . ." All set now.
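For anyone wondering why that setting helps: the stack trace above shows POI's usermodel building an XmlBeans object for every cell, which is what eats the heap; the "less memory" generation mode reads the sheet as a stream of events instead. A rough standalone sketch of the same idea, using Apache POI's XSSF event API directly (plain POI, not Talend's generated code):

import java.io.InputStream;
import javax.xml.parsers.SAXParserFactory;
import org.apache.poi.openxml4j.opc.OPCPackage;
import org.apache.poi.xssf.eventusermodel.ReadOnlySharedStringsTable;
import org.apache.poi.xssf.eventusermodel.XSSFReader;
import org.apache.poi.xssf.eventusermodel.XSSFSheetXMLHandler;
import org.apache.poi.xssf.eventusermodel.XSSFSheetXMLHandler.SheetContentsHandler;
import org.apache.poi.xssf.usermodel.XSSFComment;
import org.xml.sax.InputSource;
import org.xml.sax.XMLReader;

public class StreamingXlsxRead {
    public static void main(String[] args) throws Exception {
        try (OPCPackage pkg = OPCPackage.open(args[0])) {
            XSSFReader reader = new XSSFReader(pkg);
            // Rows arrive one at a time through this callback instead of the
            // whole sheet being materialised as XmlBeans objects in memory.
            SheetContentsHandler rowHandler = new SheetContentsHandler() {
                public void startRow(int rowNum) {}
                public void endRow(int rowNum) {}
                public void cell(String ref, String value, XSSFComment comment) {
                    System.out.println(ref + " = " + value);
                }
                public void headerFooter(String text, boolean isHeader, String tag) {}
            };
            XMLReader sax = SAXParserFactory.newInstance().newSAXParser().getXMLReader();
            sax.setContentHandler(new XSSFSheetXMLHandler(
                    reader.getStylesTable(),
                    new ReadOnlySharedStringsTable(pkg),
                    rowHandler,
                    false)); // false = report cached formula results, not formulas
            try (InputStream firstSheet = reader.getSheetsData().next()) {
                sax.parse(new InputSource(firstSheet));
            }
        }
    }
}

Each row can be garbage-collected as soon as the callback returns, so memory use stays flat no matter how many rows the workbook has.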

Accepted Solutions
Anonymous (Author)

Hi all,
Be aware that the JVM parameter doesn't correct the real cause: you probably have a heap that is too small for your application.
It just disables the rule that stops the JVM when 98% of its time is spent on GC. That may make your problem go away, but only for a while.
http://javaeesupportpatterns.blogspot.fr/2011/08/gc-overhead-limit-exceeded-problem-and.html
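To make that concrete, here is a tiny standalone demo (not from any Talend job) that keeps every allocation reachable, so the collector runs constantly while reclaiming almost nothing:

import java.util.HashMap;
import java.util.Map;

public class GcOverheadDemo {
    public static void main(String[] args) {
        // Run with a small heap, e.g.: java -Xmx64m GcOverheadDemo
        // It typically dies with "GC overhead limit exceeded" (depending on
        // heap size and JVM version it may instead report "Java heap space").
        // With -XX:-UseGCOverheadLimit the 98% check is skipped, but the
        // program still fails shortly afterwards with "Java heap space":
        // the flag hides the symptom, it does not add memory.
        Map<Integer, String> hoard = new HashMap<>();
        int i = 0;
        while (true) {
            hoard.put(i, "value-" + i);
            i++;
        }
    }
}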
regards
laurent


25 Replies
Anonymous (Author)

Hi,
Would you mind sharing your resolution with us? We are all interested in it. Thanks a lot.
Best regards
Sabrina
soujanyam (Contributor)

Hi everyone,
I'm also facing the same problem. When the number of input files for the tFileList in my job increases, I get this error:
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
    at java.io.BufferedReader.<init>(BufferedReader.java:98)
    at java.io.BufferedReader.<init>(BufferedReader.java:109)
    at org.talend.fileprocess.TOSDelimitedReader.init(TOSDelimitedReader.java:109)
    at org.talend.fileprocess.TOSDelimitedReader.<init>(TOSDelimitedReader.java:90)
    at org.talend.fileprocess.FileInputDelimited.<init>(FileInputDelimited.java:167)
    at sample.rech_profile_0_1.Rech_profile.tFileList_4Process(Rech_profile.java:6771)
    at sample.rech_profile_0_1.Rech_profile.tHashInput_tUnite_1Process(Rech_profile.java:16932)
    at sample.rech_profile_0_1.Rech_profile.runJobInTOS(Rech_profile.java:18070)
    at sample.rech_profile_0_1.Rech_profile.main(Rech_profile.java:17953)
What should I do? Any help would be appreciated.
Thanks in advance.
Anonymous (Author)

Try the option below:
Add a new JVM argument under Window --> Preferences --> Talend --> Run/Debug: -XX:-UseGCOverheadLimit
https://community.talend.com/t5/Design-and-Development/resolved-Solution-for-quot-java-lang-OutOfMem...
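Note this preference applies to jobs launched from inside the Studio. To check that the flag actually reached the job's JVM, you can print the runtime arguments, for example from a tJava component; a standalone sketch (the class name is just illustrative):

import java.lang.management.ManagementFactory;

public class ShowJvmArgs {
    public static void main(String[] args) {
        // Lists the arguments the JVM was started with;
        // -XX:-UseGCOverheadLimit should appear if the preference was picked up.
        ManagementFactory.getRuntimeMXBean().getInputArguments()
                .forEach(System.out::println);
    }
}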
Thanks
Vaibhav
soujanyam (Contributor)

Thanks for your prompt reply, Vaibhav.
I'll check it right now. I've sent you an email asking for your help.
Anonymous (Author)

Hi all,
Be aware that the JVM parameter doesn't correct the real cause: you probably have a heap that is too small for your application.
It just disables the rule that stops the JVM when 98% of its time is spent on GC. That may make your problem go away, but only for a while.
http://javaeesupportpatterns.blogspot.fr/2011/08/gc-overhead-limit-exceeded-problem-and.html
regards
laurent
soujanyam (Contributor)

Yes, I tried what Vaibhav suggested, but then got this error:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at sample.rech_profile_0_1.Rech_profile.tFileList_4Process(Rech_profile.java:7344)
at sample.rech_profile_0_1.Rech_profile.tHashInput_tUnite_1Process(Rech_profile.java:16932)
at sample.rech_profile_0_1.Rech_profile.runJobInTOS(Rech_profile.java:18070)
at sample.rech_profile_0_1.Rech_profile.main(Rech_profile.java:17953)

soujanyam (Contributor)

Thanks for your prompt reply, Laurent.
But could you give me a concrete solution?
Anonymous (Author)

Hi sanvaibhav and holberger,
Increasing the heap space is mostly not a good idea. As kzone already mentioned, it helps only for a short while and is never a real solution. You have to redesign your job to process less data at once, and take care that you do not read too much data into memory.
Figure out how you can split the records into a number of smaller batches and iterate over those batches, as in the sketch below.
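A rough illustration of that batching idea in plain Java (not Talend-generated code; flush() is a hypothetical stand-in for the real output, e.g. the bulk insert):

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;

public class BatchedLoad {
    static final int BATCH_SIZE = 1000; // tune to your heap

    public static void main(String[] args) throws IOException {
        try (BufferedReader in = Files.newBufferedReader(Paths.get(args[0]))) {
            List<String> batch = new ArrayList<>(BATCH_SIZE);
            String line;
            while ((line = in.readLine()) != null) {
                batch.add(line);
                if (batch.size() == BATCH_SIZE) {
                    flush(batch);
                    batch.clear(); // lets the GC reclaim the processed rows
                }
            }
            if (!batch.isEmpty()) {
                flush(batch); // last partial batch
            }
        }
    }

    // Hypothetical sink; only BATCH_SIZE rows are ever held in memory at once.
    static void flush(List<String> rows) {
        System.out.println("flushing " + rows.size() + " rows");
    }
}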