I'm trying to convert a Standard job to a Big Data Batch job and I get this error. Can anyone explain it to me?
I also have a context load job for the Standard job that I'm converting. Any idea what I should do with that, since I cannot convert it to a Big Data Batch job?
Hi,
You cannot convert the entire logic of a Standard job to a Big Data job. For example, if you want to push a file to HDFS using tHDFSPut, you will have to keep that step in a DI (Standard) job. So when you move the flow, split the logic that uses such components into a separate DI job, and the best approach is to orchestrate the two jobs together to achieve your business goal.
Likewise, there is no tHDFSConnection in a Big Data job; you will have to replace it with tHDFSConfiguration. If you set the connection details through repository metadata, it is easy to delete the first component and drop in the second one.
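To see why in plain code: in a Spark job there is no reusable "connection" component; the HDFS details live in the Hadoop configuration attached to the Spark context, which is roughly the role tHDFSConfiguration plays. Here is a minimal Java sketch of that idea (the NameNode URI and paths are made-up placeholders, not taken from your job):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class HdfsConfigSketch {
    public static void main(String[] args) {
        // In Spark the HDFS "connection" is just configuration on the context,
        // not a separate component you open and reuse.
        SparkConf conf = new SparkConf().setAppName("hdfs-config-sketch");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Hypothetical NameNode address; in Talend this would come from the
        // cluster metadata you already defined in the repository.
        sc.hadoopConfiguration().set("fs.defaultFS", "hdfs://namenode:8020");

        // Any read/write now resolves paths against that HDFS instance.
        sc.textFile("/data/input/customers.csv")
          .saveAsTextFile("/data/output/customers_copy");

        sc.stop();
    }
}
```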
The same applies to other components. The goal of a Big Data job is to use the capabilities of the Spark framework when you are processing data inside the HDFS layer. But if you just want to delete a file with tFileDelete, Spark cannot improve performance there, so keep that step in the DI flow.
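The reason is that a file delete is a single metadata operation with no data to scan or shuffle, so there is nothing for Spark to parallelize. As an illustration only (not how tFileDelete is implemented), deleting a path on HDFS with the plain Hadoop FileSystem API looks like this, again with a made-up host and path:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileDeleteSketch {
    public static void main(String[] args) throws Exception {
        // A delete is one call to the NameNode; no executors, no distributed
        // processing, so a Spark job adds overhead rather than speed.
        Configuration conf = new Configuration();
        try (FileSystem fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf)) {
            boolean deleted = fs.delete(new Path("/data/output/customers_copy"), true); // recursive
            System.out.println("Deleted: " + deleted);
        }
    }
}
```

So the practical pattern is: keep the file-management steps in a DI job, keep the heavy data processing in the Big Data (Spark) job, and orchestrate the two.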
Warm Regards,
Nikhil Thampi