Hi Team,
I need some help resolving the issue below. I am doing an incremental load, changing a few fields in tMap, and the Job runs against a large dataset.
Please find the attached job screenshot.
Exception in thread "main" java.lang.OutOfMemoryError: GC overhead limit exceeded
at com.mysql.jdbc.MysqlIO.nextRowFast(MysqlIO.java:2267)
at com.mysql.jdbc.MysqlIO.nextRow(MysqlIO.java:2044)
at com.mysql.jdbc.MysqlIO.readSingleRowSet(MysqlIO.java:3549)
at com.mysql.jdbc.MysqlIO.getResultSet(MysqlIO.java:489)
at com.mysql.jdbc.MysqlIO.readResultsForQueryOrUpdate(MysqlIO.java:3240)
at com.mysql.jdbc.MysqlIO.readAllResults(MysqlIO.java:2411)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2834)
For sample data the Job works fine. Can someone tell me why it throws this error for the large dataset?
Regards,
Jay
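For context on the stack trace above: by default MySQL Connector/J reads the entire result set into client memory, which is why a large table exhausts the heap even though sample data runs fine. Outside of Talend, the plain-JDBC way to avoid this is to stream rows with a forward-only, read-only statement and a fetch size of Integer.MIN_VALUE. A minimal sketch follows; the connection URL, credentials, and table name are placeholders, not taken from the original post.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class StreamingRead {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details.
        try (Connection con = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password");
             Statement st = con.createStatement(
                 ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
            // Integer.MIN_VALUE tells Connector/J to stream rows one at a time
            // instead of buffering the whole result set on the client.
            st.setFetchSize(Integer.MIN_VALUE);
            try (ResultSet rs = st.executeQuery("SELECT * FROM big_table")) {
                while (rs.next()) {
                    // Process one row at a time; heap usage stays flat.
                }
            }
        }
    }
}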
Hello,
Have you tried to store your data on disk instead of in memory when you are using the tMap component in your workflow?
This exception means that the Job ran out of memory.
If a Job uses a large amount of memory, please try setting JVM parameters for it.
For more information, please have a look at the related documentation: TalendHelpCenter: How to set advanced execution settings.
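For example (the exact labels may differ between Talend versions), JVM arguments can be set per Job in the Run view under Advanced settings by ticking "Use specific JVM arguments" and raising the heap limits:

-Xms1024M
-Xmx4096M

-Xmx is the maximum heap the generated Java process may use; increasing it gives the Job more room before the garbage collector hits the "GC overhead limit exceeded" threshold seen in your stack trace.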
Best regards
Sabrina
Hi Sabrina,
Thanks for the reply. I tried that and it is loading now.
I have a small question about "Have you tried to store your data on disk instead of in memory when you are using the tMap component in your workflow?"
I connected directly to the database and retrieved the schema. Can you please tell me how to store this data on disk in Talend?
Could you please elaborate on this?