In Talend Studio, the Extra tab of a standard Job allows you to use the Implicit Context Load feature to load context parameters dynamically at Job execution time.
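As a quick illustration, the Implicit Context Load feature can read key=value pairs from a flat file at run time. The file name and parameter names below are hypothetical examples, not part of the original article:

```
# context_params.txt — sample context file for Implicit Context Load
# (file name, keys, and values are illustrative)
db_host=prod-db.example.com
db_port=5432
output_dir=/data/output
```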
However, the Extra tab is not available for a Spark Job in Talend Studio:
So how can you pass context parameters and use the Implicit Context Load feature with a Spark Job?
Answer
The solution is to use a standard Job, configured to use the Implicit Context Load feature, that invokes the Spark Job using the tRunJob component.
To pass the context parameters to the Spark Job, configure the tRunJob component with both the Use an independent process to run subjob and the Transmit whole context options selected. Below is the tRunJob component configuration to invoke the Spark Job (my_simple_spark_test):
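For completeness, when the standard parent Job is built and exported, its launcher script accepts context overrides on the command line; with Implicit Context Load configured, parameters can also come from the context file. The script and parameter names below are hypothetical:

```shell
# Hedged sketch: run the exported parent Job (which invokes the Spark Job
# via tRunJob) with a named context and per-parameter overrides.
# Script name, context name, and parameters are illustrative.
./my_standard_parent_job_run.sh --context=Default \
  --context_param db_host=prod-db.example.com \
  --context_param db_port=5432
```

Because Transmit whole context is checked on tRunJob, these values propagate from the parent Job's context into the Spark Job's context at run time.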