Anonymous
Not applicable

Custom Hadoop distribution support for Spark components in Talend

I am working with a cluster that runs a custom Hadoop 2.4 distribution, and I am trying to use Talend with the Spark components. For the Spark connection components, I have set the relevant SparkHost and SparkHome values.

For the distribution, the two available options are Cloudera and Custom (unsupported). When Custom (unsupported) is selected, there is a provision to choose a custom Hadoop version so that the relevant libraries are included. The options available here are Cloudera, HortonWorks, MapR, Apache, Amazon EMR, and PivotalHD. However, when I choose Cloudera it comes with Hadoop 2.3, and I am assuming that essential libraries are missing, because I get a NoClassDefFoundError and consequently cannot load a file in Spark via this Spark connection. For reference, the Spark version I have is 1.0.0.
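To make the failure concrete, this is roughly what the generated job attempts when the connection is opened (a minimal sketch of the Spark 1.0.0 Java API; the master URL, Spark home, and input path below are placeholders for my cluster, and the actual Talend-generated code differs):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkSmokeTest {
    public static void main(String[] args) {
        // Placeholder values; these correspond to the SparkHost and
        // SparkHome settings on the Spark connection component.
        SparkConf conf = new SparkConf()
                .setAppName("sparktest")
                .setMaster("spark://your-spark-host:7077") // SparkHost
                .setSparkHome("/opt/spark");               // SparkHome
        JavaSparkContext sc = new JavaSparkContext(conf);

        // This is the call that never gets a chance to run in the failing job:
        JavaRDD<String> lines = sc.textFile("hdfs:///tmp/input.txt");
        System.out.println("Line count: " + lines.count());
        sc.stop();
    }
}

Compiling and running something like this outside Talend, with the Spark assembly jar on the classpath, is a quick way to separate a cluster problem from a Talend classpath problem.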

I would like to know how to fix this, and whether there is a way to get this version of Spark running against our custom Hadoop distribution.

The error is copied and pasted below:

[statistics] connecting to socket on port 3637
[statistics] connected
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/api/java/JavaSparkContext
    at sparktest.sparktest_0_1.sparktest.tSparkConnection_2Process(sparktest.java:491)
    at sparktest.sparktest_0_1.sparktest.runJobInTOS(sparktest.java:1643)
    at sparktest.sparktest_0_1.sparktest.main(sparktest.java:1502)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.api.java.JavaSparkContext
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 3 more
[statistics] disconnected
Job sparktest ended at 13:19 21/10/2014. [exit code=1]
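Since the job dies on its very first reference to JavaSparkContext, my reading is that the Spark jars are not on the generated job's classpath at all. Here is a small check I put together (a hypothetical standalone helper, not Talend-generated code) that reports which jar the class is loaded from, or confirms it is missing:

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            Class<?> c = Class.forName("org.apache.spark.api.java.JavaSparkContext");
            // If the class resolves, print which jar it came from.
            System.out.println("Loaded from: "
                    + c.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("JavaSparkContext is not on the classpath; "
                    + "the Spark assembly jar needs to be added.");
        }
    }
}

If it reports the class missing, the fix presumably amounts to getting a Spark assembly jar built for the matching Hadoop version onto the job's classpath, which is what selecting the right custom Hadoop version in the component is supposed to do.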

Thanks

1 Reply
Anonymous
Not applicable
Author

Hello,

Could you please indicate which Talend build version you are seeing this issue on? There is an existing JIRA issue, https://jira.talendforge.org/browse/TBD-3774, about "spark job can't work with HDP2.3".

This issue has been fixed in 6.1.2 and 6.2.1.

Best regards

Sabrina