saikrishna5
Contributor

Spark Job Configuration error

I am getting an error in a Big Data batch job for Spark asking me to add the HDP version in spark-env.sh. I am using Hortonworks HDP 2.5.3.0 with Ambari 2.4.2.0 on a 4-node cluster. I have added the HDP version in spark-env.sh from Ambari and restarted Spark, but it still shows the same issue. I have attached screenshots of the error, the Spark configuration, the advanced settings of the job, and the spark-env.sh content.
0683p000009MDLG.png 0683p000009MDRD.png 0683p000009MDSx.png
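For reference, the change the HDP error message asks for in spark-env.sh typically follows the pattern below. This is only a sketch: the build suffix shown is an assumption and should be replaced with the exact build string reported by hdp-select versions on the cluster.

# assumed example: export the full HDP build string so Spark can resolve ${hdp.version}
# (2.5.3.0-37 is a placeholder; use the exact build listed by: hdp-select versions)
export HDP_VERSION=2.5.3.0-37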
4 Replies
Anonymous
Not applicable

Hi,
Could you please also indicate which build version you are seeing this issue on?
Best regards
Sabrina
Anonymous
Not applicable

You can usually find these in your YARN config in Ambari. In the example below, 2.3.2.0-2950 is the HDP version; replace it with the build string of your own cluster:
spark.driver.extraJavaOptions="-Dhdp.version=2.3.2.0-2950"
spark.yarn.am.extraJavaOptions="-Dhdp.version=2.3.2.0-2950"
spark.hadoop.mapreduce.application.framework.path="$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure"
spark.hadoop.mapreduce.application.classpath="/hdp/apps/2.3.2.0-2950/mapreduce/mapreduce.tar.gz#mr-framework"
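If the cluster-side defaults are not picked up, the same -Dhdp.version setting can also be passed when the job is submitted. A minimal sketch, assuming the job goes through spark-submit; the build string is a placeholder and the class/jar names are hypothetical:

# assumed sketch: passing hdp.version at submit time instead of through cluster config
# (com.example.MySparkJob and my_spark_job.jar are hypothetical; 2.5.3.0-37 is a placeholder build string)
spark-submit \
  --master yarn \
  --conf "spark.driver.extraJavaOptions=-Dhdp.version=2.5.3.0-37" \
  --conf "spark.yarn.am.extraJavaOptions=-Dhdp.version=2.5.3.0-37" \
  --class com.example.MySparkJob my_spark_job.jar

In a Talend Big Data batch job, the same property/value pairs can typically be entered in the job's Spark Configuration advanced properties rather than on the command line.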
saikrishna5
Contributor
Author

Thanks for your reply, Justin. I tried the way you suggested, but it still shows the "Spark context did not initialize" error.
0683p000009MDT2.png
I am using Talend version 6.3.1.
Anonymous
Not applicable

Sorry, I'm not sure then; I've only set this up in 6.2 so far.