SND
Contributor

"GC overhead limit exceeded" error when dealing with two versions of Mysql in the same job

Hello everyone,

I am trying to transfer data from a MySQL 5.6.48 database table (tDBInput_1) to a MySQL 8.0.22 database table (tMysqlOutput_1) using Talend Open Studio 7.3.1, but I get the error below:

Exception in component tDBInput_1 (ExtractAndLoadFromWSRecipient___Release)
java.sql.SQLException: GC overhead limit exceeded
    at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:129)
    at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
    at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
    at com.mysql.cj.jdbc.StatementImpl.executeQuery(StatementImpl.java:1200)
    ...

(This happens even though I allocated plenty of memory to the job (-Xms3072M, -Xmx4096M) and activated the "Enable stream" option...)
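For context: as far as I understand, the "Enable stream" option makes the generated code request a streaming result set from Connector/J. Here is a minimal standalone sketch of what that looks like in plain JDBC; the URL, credentials and query are placeholders, not my actual job code:

    import java.sql.*;

    public class StreamingReadSketch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/sourcedb", "user", "password");
                 // streaming requires a forward-only, read-only statement
                 Statement stmt = conn.createStatement(
                         ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                // Connector/J convention: Integer.MIN_VALUE switches the driver to
                // row-by-row streaming instead of buffering the whole result set
                stmt.setFetchSize(Integer.MIN_VALUE);
                try (ResultSet rs = stmt.executeQuery("SELECT id, name FROM some_table")) {
                    while (rs.next()) {
                        // process one row at a time; memory use stays flat
                    }
                }
            }
        }
    }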

It is as if tDBInput_1, which should use the MySQL 5 driver (com.mysql.jdbc.Driver), is actually using the MySQL 8 one (com.mysql.cj.jdbc.Driver), as the error output suggests (I don't know if that means anything...).
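One way to check which driver actually serves a given connection, if that helps anyone reproduce this, is to print the driver metadata; a small sketch with made-up connection details:

    import java.sql.*;

    public class WhichDriverSketch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/sourcedb", "user", "password")) {
                DatabaseMetaData md = conn.getMetaData();
                // prints something like "MySQL Connector/J mysql-connector-java-8.0.18 ..."
                System.out.println(md.getDriverName() + " " + md.getDriverVersion());
            }
        }
    }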

PS:

-> At the beginning of the job execution, I get this warning (see the small illustration after this list):

loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.

-> mysql-connector-java-5.1.30-bin and mysql-connector-java-8.0.18 are the main connectors in the lib directory after building the job

-> I was using the same job options with a MariaDB source database, transferring data to a MariaDB destination table, and everything worked without errors
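To illustrate what the warning above means (this is not what Talend actually generates): with Connector/J 5.x the driver had to be loaded by hand, while Connector/J 8.x registers itself through the java.sql.Driver SPI, so a plain DriverManager call is enough. Host, database and credentials below are made up:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class DriverLoadingSketch {
        public static void main(String[] args) throws Exception {
            // old, now-deprecated style (Connector/J 5.x): explicit class loading;
            // the 8.x jar still ships a com.mysql.jdbc.Driver shim, and loading it
            // is exactly what prints the deprecation warning quoted above
            Class.forName("com.mysql.jdbc.Driver");

            // Connector/J 8.x style: no Class.forName needed, the driver is
            // discovered through META-INF/services/java.sql.Driver (the SPI)
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/testdb", "user", "password")) {
                System.out.println("Connected with driver "
                        + conn.getMetaData().getDriverVersion());
            }
        }
    }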

Thank you in advance for your help.

Regards,

13 Replies
SND
Contributor
Author

Hello,

 

Thank you for your assistance.

 

So I redesigned my job after temporarily downgrading to MySQL 5 because of deadlines, but here is the case:

 

I have two databases, A (MySQL 5) and B (MySQL 8), and I want to migrate data from table A.ta (over 10 million rows) to table B.tb, so I used the tMysqlInput and tMysqlOutput components, with enough memory and the "Enable stream" option activated.

 

After executing the job, I got the warning first and then the error.

 

Thank you in advance.

 

Anonymous
Not applicable

Hello,

Sorry for our delay. I was trying to find out whether this "GC overhead limit exceeded" error is caused by using two versions of MySQL in the same job.

I made a test job with MySQL 5 and MySQL 8 in the same job and got the same warning: "Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary."

Fortunately, it doesn't block job execution, and I was able to migrate my data from MySQL 5 to MySQL 8 successfully without deleting the old (MySQL 5) driver.

From the above scenario, the warning is not the root cause; the "GC overhead limit exceeded" error is probably caused by your migrated data volume, memory settings, and so on.

It would be better if you could post your job design screenshots here; that would help us get more details about your current situation.

 

Best regards

Sabrina

SND
Contributor
Contributor
Author

Hello Sabrina, sorry for the late reply.

 

Thank you for your assistance.

 

Indeed, it might be caused by my data volume (over 11 million rows). As for memory, I allocated a max of 4G to the job (-Xms3072M, -Xmx4096M) and activated the "Enable stream" option to avoid the GC overhead limit exceeded error.

 

It's as if the "Enable stream" option does not work when two different versions of MySQL are used in the same job.

 

I have tested the same job using two close MySQL versions (5.6 for the source and 5.7 for the destination) and it works like a charm... 😞

 

Below are some screenshots:

 

(job design screenshot) The selected rows from the billing table number over 11 million, and the recipient table holds about the same number of rows (at the end of the job I am inserting the inner join rejects from the tMap into the output "recipient" table).

 

Hope this helps.

 

Best regards.

Anonymous
Not applicable

Hello,

That might be caused by the DB driver when using streaming mode.

One way to get around it and still use the stream is to set the DB Version to MySQL 5 instead of MySQL 8. Otherwise, run without streaming mode; without streaming, the driver caches the result set, so you can only try to reduce the data volume (see the sketch below).
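A rough sketch of what "reducing the data" per query could look like without streaming: copy the table in primary-key ranges so each SELECT stays small. This assumes the source table has an auto-increment id column; every table, column and connection name below is made up:

    import java.sql.*;

    public class ChunkedCopySketch {
        static final int CHUNK = 50_000; // rows per round trip; tune to your memory

        public static void main(String[] args) throws Exception {
            try (Connection src = DriverManager.getConnection(
                     "jdbc:mysql://source:3306/a", "user", "password");
                 Connection dst = DriverManager.getConnection(
                     "jdbc:mysql://target:3306/b?rewriteBatchedStatements=true",
                     "user", "password")) {
                dst.setAutoCommit(false);
                long lastId = 0;
                while (true) {
                    int rows = 0;
                    try (PreparedStatement sel = src.prepareStatement(
                             "SELECT id, col1, col2 FROM ta WHERE id > ? ORDER BY id LIMIT " + CHUNK);
                         PreparedStatement ins = dst.prepareStatement(
                             "INSERT INTO tb (id, col1, col2) VALUES (?, ?, ?)")) {
                        sel.setLong(1, lastId);
                        try (ResultSet rs = sel.executeQuery()) {
                            while (rs.next()) {
                                lastId = rs.getLong(1); // keyset pagination: remember last key
                                ins.setLong(1, lastId);
                                ins.setString(2, rs.getString(2));
                                ins.setString(3, rs.getString(3));
                                ins.addBatch();
                                rows++;
                            }
                        }
                        ins.executeBatch();
                        dst.commit(); // one commit per chunk keeps transactions small
                    }
                    if (rows < CHUNK) break; // last (partial) chunk done
                }
            }
        }
    }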

Feel free to raise a JIRA issue on the Talend bug tracker.

Best regards

Sabrina