Anonymous

tRunJob - Calling Standard Job from Spark Job Running in YARN Cluster Mode (MapR Cluster) is not working

I have a Spark job running in YARN cluster mode on a MapR cluster.

On OnSubjobOk, at the end of the job, I invoke a Standard Job.

I am getting a ClassNotFoundException. Looking at the log, I found that the jar is being referenced from the user cache. The path looks like this:

/tmp/hadoop-mapruat/nm-local-dir/usercache/useId/appcache/application_343434343434_1567/container_e8375037937593695474_1567_02_000001/subJobTest.jar

 

The exact same job works fine if I run the parent Spark job in YARN client mode:

Spark job running in YARN cluster mode, calling a Standard Job through tRunJob -> invocation of the subjob fails.

Spark job running in YARN client mode, calling a Standard Job through tRunJob -> invocation of the subjob works fine.

 

Question: Is subjob invocation from a Spark job running in YARN cluster mode supported in Talend? If not, what is the workaround to invoke a Standard Job from a Spark job running in YARN cluster mode?
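For what it's worth, since the ClassNotFoundException only appears in cluster mode (where the driver runs inside a YARN container rather than on the client machine), one thing worth trying is shipping the subjob jar explicitly with the Spark application so it lands on the driver's classpath. This is a hedged sketch, not a Talend-documented fix; the paths, jar names, and class name below are illustrative assumptions:

```shell
# Sketch of a workaround attempt, assuming the parent job is submitted
# via spark-submit. In cluster mode the driver runs inside a YARN
# container, so jars available only on the client machine are not on
# its classpath unless they are shipped with the application.
#
# --jars distributes the listed jars to the driver and executor
# containers. Jar names and the main class are hypothetical examples.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars /local/path/to/subJobTest.jar \
  --class example.ParentSparkJob \
  /local/path/to/parentSparkJob.jar
```

If the parent job is launched from Talend Studio rather than a hand-written spark-submit, the equivalent would be adding the subjob jar through the job's Spark configuration (e.g. additional jars/classpath settings), but whether tRunJob then resolves the Standard Job correctly in cluster mode is exactly the open question here.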

 
1 Reply
Anonymous (Author)

We have decided not to invest in Talend Studio for our big data pipeline, partly because it is difficult to get help with Talend issues.