Anonymous
Not applicable

Error: Exception in thread "main" java.lang.NoClassDefFoundError: com.mysql.jdbc.Driver (Big Data)

Hello all,

 

I have an error in one of my jobs. I usually schedule my jobs and no incidents occur, but when one of my jobs finished running, it returned the following error:

 

[FATAL]: bda_prod.bmstgics_main_initial_0_1.BMSTGICS_Main_Initial - tRunJob_2 Child job returns 1. It doesn't terminate normally.
Exception in thread "main" java.lang.NoClassDefFoundError: com.mysql.jdbc.Driver
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:264)
	at bda_prod.ansokrpf_to_hive_initial_0_1.ANSOKRPF_to_Hive_Initial.tMysqlConnection_1Process(ANSOKRPF_to_Hive_Initial.java:2005)
	at bda_prod.ansokrpf_to_hive_initial_0_1.ANSOKRPF_to_Hive_Initial.tJava_1Process(ANSOKRPF_to_Hive_Initial.java:1842)
	at bda_prod.ansokrpf_to_hive_initial_0_1.ANSOKRPF_to_Hive_Initial.runJobInTOS(ANSOKRPF_to_Hive_Initial.java:8640)
	at bda_prod.ansokrpf_to_hive_initial_0_1.ANSOKRPF_to_Hive_Initial.runJob(ANSOKRPF_to_Hive_Initial.java:7926)
	at bda_prod.ansokrpf_main_initial_0_1.ANSOKRPF_Main_Initial.tRunJob_1Process(ANSOKRPF_Main_Initial.java:2741)
	at bda_prod.ansokrpf_main_initial_0_1.ANSOKRPF_Main_Initial.tWaitForSqlData_1Process(ANSOKRPF_Main_Initial.java:2208)
	at bda_prod.ansokrpf_main_initial_0_1.ANSOKRPF_Main_Initial.tOracleConnection_1Process(ANSOKRPF_Main_Initial.java:1917)
	at bda_prod.ansokrpf_main_initial_0_1.ANSOKRPF_Main_Initial.runJobInTOS(ANSOKRPF_Main_Initial.java:7102)
	at bda_prod.ansokrpf_main_initial_0_1.ANSOKRPF_Main_Initial.main(ANSOKRPF_Main_Initial.java:6385)

[ERROR]: bda_prod.bmstgics_main_initial_0_1.BMSTGICS_Main_Initial - tParallelize_1 - null

The schema has several tables, 21 tables in this job to be exact. Only 20 tables were loaded into Hadoop, and 1 table hit this error.
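For reference, the failing step boils down to the driver lookup sketched below. This is only a minimal standalone sketch, assuming MySQL Connector/J 5.x (driver class com.mysql.jdbc.Driver); the JDBC URL and credentials are placeholders, not values from our job. It fails in the same way whenever the connector JAR is missing from the runtime classpath.

// Minimal standalone sketch of the driver lookup the generated job code performs.
// Assumes MySQL Connector/J 5.x, whose driver class is com.mysql.jdbc.Driver.
// The JDBC URL and credentials are placeholders, not values from the real job.
import java.sql.Connection;
import java.sql.DriverManager;

public class MysqlDriverCheck {
    public static void main(String[] args) throws Exception {
        // This call fails (ClassNotFoundException, or NoClassDefFoundError as in
        // the log above) when the mysql-connector-java JAR is missing or
        // incomplete on the runtime classpath.
        Class.forName("com.mysql.jdbc.Driver");

        // If the class loads, a plain connection attempt confirms the driver works.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/test", "user", "password")) {
            System.out.println("Driver loaded and connection established.");
        }
    }
}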

 

Please kindly help with this problem, and feel free to ask any questions regarding the error above.

 

Thank you.

Regards,

Sulaiman


11 Replies
Anonymous
Not applicable
Author

Hello,

Are you using the Dynamic Job functionality in your tRunJob component? Does your job work well in the Studio? More information about your job settings would be appreciated.

Best regards

Sabrina

Anonymous
Not applicable
Author

Hello Sabrina,

 

Thank you for your help. I use a normal job in my tRunJob component. Usually my jobs work well and can extract data from Oracle. Some of my jobs consist of 33 tables of data that need to pass through 3 layers: a staging layer, a persistence layer, and an archive layer (the archive layer usually holds no data because there is nothing to archive).

 

After further inspection of the MySQL metadata, the error most likely occurs in the staging layer.

 

Some of my colleagues said that I need to create a symbolic link for the Java installation. This happened once before, but we don't know why it is occurring again now.

 

Thank you.

 

Regards,

Sulaiman

 


Attached screenshots: Main Job.PNG, Persistence Layer.PNG, Staging Layer.PNG
Anonymous
Not applicable
Author

Other solutions on the internet say that I need to upgrade the MySQL Java connector (MySQL Connector/J).
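Before upgrading anything, a small diagnostic like the sketch below (standard JDK APIs only, nothing specific to our Talend job) should show whether the driver class is visible on the runtime classpath at all, and which JAR it is loaded from:

// Diagnostic sketch: report whether com.mysql.jdbc.Driver is on the classpath
// and which JAR supplied it. The class name matches MySQL Connector/J 5.x
// (Connector/J 8.x renamed it to com.mysql.cj.jdbc.Driver).
import java.security.CodeSource;

public class DriverLocation {
    public static void main(String[] args) {
        try {
            Class<?> driver = Class.forName("com.mysql.jdbc.Driver");
            CodeSource src = driver.getProtectionDomain().getCodeSource();
            System.out.println("Driver loaded from: "
                    + (src != null ? src.getLocation() : "<bootstrap/unknown>"));
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            System.out.println("Driver not found on classpath: " + e);
        }
    }
}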
Anonymous
Not applicable
Author

Hello,

Could you please clarify which Talend version/edition you are using? Talend Open Studio for Big Data, or a Talend Big Data subscription solution? A WebEx session would be preferred.

Best regards

Sabrina

Anonymous
Not applicable
Author

Hello Sabrina,

 

I am currently using Talend Real-Time Big Data Platform 6.3.1.20161216_1026 (the Talend Open Studio for Big Data product).

 

Thank you.

 

Regards,

Sulaiman

Anonymous
Not applicable
Author

FYI,

 

Some of the schemas mentioned earlier return other errors besides the NoClassDefFoundError.

 

For example, they return this error:

 

[FATAL]: bda_prod.bmstgecn_main_initial_0_1.BMSTGECN_Main_Initial - tRunJob_19 Child job returns 1. It doesn't terminate normally.
Note: /tmp/sqoop-talenduser/compile/527db3601d0693db04ba2d48540e7f4e/QueryResult.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Exception in thread "main" java.lang.OutOfMemoryError: unable to create new native thread
    at java.lang.Thread.start0(Native Method)
    at java.lang.Thread.start(Thread.java:714)
    at org.apache.hadoop.hdfs.LeaseRenewer.put(LeaseRenewer.java:326)
    at org.apache.hadoop.hdfs.DFSClient.beginFileLease(DFSClient.java:854)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1742)
    at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1663)
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:405)
    at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:401)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:401)
    at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:344)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:920)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:901)
    at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:798)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:368)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:341)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:292)
    at org.apache.hadoop.mapreduce.JobResourceUploader.copyRemoteFiles(JobResourceUploader.java:203)
    at org.apache.hadoop.mapreduce.JobResourceUploader.uploadFiles(JobResourceUploader.java:128)
    at org.apache.hadoop.mapreduce.JobSubmitter.copyAndConfigureFiles(JobSubmitter.java:99)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:194)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
    at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:196)
    at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:169)
    at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:266)
    at org.apache.sqoop.manager.SqlManager.importQuery(SqlManager.java:729)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:499)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.tSqoopImport_1Process(PAY_INST_CAT_to_Hive_Initial.java:2996)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.tHiveCreateTable_1Process(PAY_INST_CAT_to_Hive_Initial.java:2680)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.tHiveConnection_1Process(PAY_INST_CAT_to_Hive_Initial.java:2397)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.tMysqlConnection_1Process(PAY_INST_CAT_to_Hive_Initial.java:2076)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.tJava_3Process(PAY_INST_CAT_to_Hive_Initial.java:1842)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.runJobInTOS(PAY_INST_CAT_to_Hive_Initial.java:8640)
    at bda_prod.pay_inst_cat_to_hive_initial_0_1.PAY_INST_CAT_to_Hive_Initial.runJob(PAY_INST_CAT_to_Hive_Initial.java:7926)
    at bda_prod.pay_inst_cat_main_initial_0_1.PAY_INST_CAT_Main_Initial.tRunJob_1Process(PAY_INST_CAT_Main_Initial.java:2741)
    at bda_prod.pay_inst_cat_main_initial_0_1.PAY_INST_CAT_Main_Initial.tWaitForSqlData_1Process(PAY_INST_CAT_Main_Initial.java:2208)
    at bda_prod.pay_inst_cat_main_initial_0_1.PAY_INST_CAT_Main_Initial.tOracleConnection_1Process(PAY_INST_CAT_Main_Initial.java:1917)
    at bda_prod.pay_inst_cat_main_initial_0_1.PAY_INST_CAT_Main_Initial.runJobInTOS(PAY_INST_CAT_Main_Initial.java:7102)
    at bda_prod.pay_inst_cat_main_initial_0_1.PAY_INST_CAT_Main_Initial.main(PAY_INST_CAT_Main_Initial.java:6385)

[ERROR]: bda_prod.bmstgecn_main_initial_0_1.BMSTGECN_Main_Initial - tParallelize_1 - null

The NoClassDefFoundError appears intermittently and I cannot tell how or why it occurs. For further information about this incident, the jobs are now hitting the "Exception in thread "main" java.lang.OutOfMemoryError: unable to create new native thread" error more often.
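From what I have read, this second error points to an OS-level limit on native threads (for example the per-user process limit, ulimit -u, or available native memory) rather than the Java heap, and with many child jobs launched in parallel through tParallelize/tRunJob the thread count adds up quickly. The sketch below, which uses only standard JDK management APIs and is a diagnostic illustration rather than part of the actual job, prints the thread counts of the JVM it runs in and could help confirm that assumption.

// Diagnostic sketch: print the live, peak, and total started thread counts of
// the current JVM. "unable to create new native thread" usually means the OS
// refused to create another thread, so watching these numbers while the
// parallel child jobs run can help confirm whether a thread/process limit is hit.
import java.lang.management.ManagementFactory;
import java.lang.management.ThreadMXBean;

public class ThreadCountReport {
    public static void main(String[] args) {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        System.out.println("Live threads:   " + threads.getThreadCount());
        System.out.println("Peak threads:   " + threads.getPeakThreadCount());
        System.out.println("Total started:  " + threads.getTotalStartedThreadCount());
    }
}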

 

If I may ask, how can I arrange the WebEx session?

 

Thank you.

 

Regards,

Sulaiman Affandi

 

Anonymous
Not applicable
Author

Hello,

The best place to start is this website - https://login.talend.com/support-login.php.

Please refer to the Talend Support Guide and let us know if that works for you.


 

Best regards

Sabrina

Anonymous
Not applicable
Author

Hi Sabrina,

Thank you for the information. I have one account that lists me as a customer, and I have already opened a ticket with my other account that is listed as a customer, so thank you very much for the help.
Anonymous
Not applicable
Author

Hello,

We would appreciate it a lot if you could share the solution or workaround on the forum.

Best regards

Sabrina