Eddie_Liu
Contributor

Unix shell generated without classpath/reference to subjobs' Java libs

We have a job that includes 6 subjobs. Each subjob works well on its own and is fully tested, but when we try to run the main job it throws the exception below. The root cause is that the generated shell script doesn't include references to the subjobs' JARs.

Does anyone know how to add the subjobs' JARs to the shell script's classpath?

Exception in thread "main" java.lang.NoClassDefFoundError: org/talend/bigdata/dataflow/hmap/PostProcessor
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at org.apache.spark.deploy.yarn.ApplicationMaster.startUserApplication(ApplicationMaster.scala:679)
    at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:461)
    at org.apache.spark.deploy.yarn.ApplicationMaster.org$apache$spark$deploy$yarn$ApplicationMaster$$runImpl(ApplicationMaster.scala:305)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$1.apply$mcV$sp(ApplicationMaster.scala:245)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$1.apply(ApplicationMaster.scala:245)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$run$1.apply(ApplicationMaster.scala:245)
    at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$3.run(ApplicationMaster.scala:783)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.spark.deploy.yarn.ApplicationMaster.doAsUser(ApplicationMaster.scala:782)
    at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:244)
    at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:807)
    at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.ClassNotFoundException: org.talend.bigdata.dataflow.hmap.PostProcessor
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
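
For illustration, the kind of manual workaround being asked about would be to collect the subjobs' JARs and add them to the classpath built by the generated launcher. A minimal sketch, assuming the launcher builds a java -cp command from a lib directory; all paths, variable names, and JAR locations below are illustrative placeholders, not taken from the actual generated script:

    # Collect every JAR shipped with the main job and its subjobs (illustrative path)
    JOB_DIR=/opt/talend/jobs/main_job
    EXTRA_CP=$(find "$JOB_DIR" -name '*.jar' -print | tr '\n' ':')
    # Prepend the collected JARs to the classpath used by the generated java call, e.g.
    # java -cp "${EXTRA_CP}${JOB_DIR}/main_job.jar" main_job_package.MainJob "$@"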

3 Replies
Anonymous
Not applicable

Hello,

We suspect that some of the JARs used by your components cannot be found under the Java version you are running.

Which JDK version are you using?

Best regards

Sabrina

Eddie_Liu
Contributor
Author

We are using Talend Data Fabric with the default JDK. The root cause is that Talend generated the Unix shell script without references to the subjobs' JARs. Do you know how to solve it? Thanks.
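
Since the stack trace comes from a Spark-on-YARN ApplicationMaster, the missing Talend JAR also has to be shipped to the cluster, not just added to the local classpath. As a point of reference only, the generic Spark mechanism for that is the --jars option of spark-submit; the sketch below is not the Talend-generated command, and the JAR and class names are illustrative assumptions:

    # Generic spark-submit invocation showing how extra JARs are shipped to YARN
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --jars /opt/talend/jobs/main_job/lib/talend-bigdata-dataflow.jar \
      --class main_job_package.MainJob \
      /opt/talend/jobs/main_job/main_job.jar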

Anonymous
Not applicable

Hello,

Since you have a subscription solution, please create a support case on the Talend support portal so that we can provide remote assistance and, if needed, deliver a patch with priority through the support cycle.

Best regards

Sabrina