Spark job running in YARN cluster mode on a MapR cluster.
On SubJobOK, at the end of the job, I invoke a Standard Job.
I am getting a ClassNotFoundException. Looking at the log, I found that the jar is being referenced from the user cache. The path looks like this:
/tmp/hadoop-mapruat/nm-local-dir/usercache/useId/appcache/application_343434343434_1567/container_e8375037937593695474_1567_02_000001/subJobTest.jar
The exact same job works fine if I run the parent Spark job in YARN client mode:
Spark job running in YARN cluster mode, calling a Standard Job through tRunJob -> invocation of the sub-job is failing.
Spark job running in YARN client mode, calling a Standard Job through tRunJob -> invocation of the sub-job works fine.
Question: is sub-job invocation from a Spark job in YARN cluster mode supported in Talend? If not, what is the workaround to invoke a Standard Job from a Spark job running in YARN cluster mode?
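For context, at the plain Spark level (outside Talend), a ClassNotFoundException like this is often worked around by shipping the missing jar explicitly via spark-submit's `--jars` option, which distributes it to the driver and executor classpaths in YARN cluster mode. This is only a sketch: the jar name is taken from the log path above, while the parent jar and main class are placeholders, and whether Talend Studio exposes an equivalent setting for tRunJob is an open question.

```shell
# Hypothetical spark-submit invocation (not the Talend-generated command):
# --jars ships the sub-job jar so it is available on the driver and
# executor classpaths when running with --deploy-mode cluster.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars /path/to/subJobTest.jar \
  --class com.example.ParentJob \
  parentJob.jar
```

In client mode the driver runs on the submitting machine, where locally built jars are already on the classpath, which would explain why the same job succeeds there.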
We have decided not to invest in Talend Studio for our big data pipeline, partly because it is difficult to get help with Talend issues.