Hi,
I am trying to build a Spark job in Talend and was looking for the global variable NB_LINE, but I cannot find it in the Outline view. Is there an equivalent option available in Spark jobs? Please advise.
I also need some input regarding the tSqlRow component. Are there any sample SQL queries we can use in this component? I appreciate your input.
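For example, if I understand correctly that tSqlRow runs Spark SQL against its incoming connections, with each input link registered as a table, would something as simple as the following work (assuming an input link named row1 with name and age columns)?

    SELECT name, age
    FROM row1
    WHERE age > 18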
Regards
Sudhar
Hello,
Have you tried using a DI Job to orchestrate the context of the Spark Job? Could you please have a look at this article: https://community.talend.com/t5/Architecture-Best-Practices-and/Spark-Dynamic-Context/ta-p/33038 to see whether it meets your needs?
Best regards
Sabrina
Thanks for sharing the link. My question is more about how to get the record count of a flow (link) in Spark jobs. I cannot see options similar to those available in a regular DI job. Please advise.
Thanks
Sudhar
Hello,
Could you please give us a description of your current Big Data Spark jobs? The tFlowMeter and tFlowMeterCatcher components are not available in Big Data Spark jobs so far.
Best regards
Sabrina
Hi,
Our current requirement in Spark jobs is to capture the job flow details.
We are moving data from one layer to another, and we need to capture job metadata (source file, target record count, etc.) from every job. This data needs to be stored for audit and reconciliation purposes. I see that Spark jobs offer limited options to get this information, whereas regular DI jobs have components that provide all of it. Please share your input on how to achieve this for Spark executions. Will AMC capture the information I am looking for?
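As a possible workaround, I was considering adding a tSqlRow step that computes the count and writing the result to an audit table via an output component, something like the sketch below (assuming an input link named row1; the job name would come from a context variable rather than a literal). I am not sure this is the recommended approach, though.

    SELECT
      'my_job_name'     AS job_name,
      COUNT(*)          AS target_record_count,
      CURRENT_TIMESTAMP AS load_timestamp
    FROM row1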
Thanks
Sudhar
Hello,
Sorry for the delay!
We have redirected your issue to the Talend Big Data experts and will come back to you as soon as we can.
Thanks for your time!
Best regards
Sabrina