I'm trying to convert a standard job to a big data batch job and it's saying that some components are not loaded and the job is not running at all. Can anyone explain what's happening here to a beginner?
Hi,
Big Data Batch jobs do not include all of the components that Standard jobs have, for example the file-processing components.
They only contain the components that Spark can run in parallel for better throughput.
Some Standard job components (for example, renaming a file) are not applicable in a Big Data context and are simply absent. To perform such an action, you will have to combine a Standard job and a Big Data job in a properly orchestrated flow.
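To make the orchestration idea concrete, here is a minimal sketch of chaining the two job types from the command line. It assumes both jobs have been built and exported from Talend Studio, which generates a `<JobName>_run.sh` launcher per job; the job names and paths here are hypothetical, and in practice you could equally orchestrate them inside Studio with a parent job.

```shell
#!/bin/sh
# Hedged sketch: run a Standard job for file-level work, then a
# Big Data Batch job for the Spark processing. Launcher paths are
# hypothetical examples, not real artifacts from this thread.
set -e

run_job() {
  # Each built Talend job ships a launcher script; skip gracefully
  # if the job has not been built in this environment.
  launcher="$1"
  if [ -x "$launcher" ]; then
    "$launcher"
  else
    echo "skipping $launcher (not built in this environment)"
  fi
}

# 1) Standard job: rename/stage input files (components a BD batch job lacks)
run_job ./RenameInputFiles/RenameInputFiles_run.sh

# 2) Big Data Batch job: Spark-based processing of the staged files
run_job ./ProcessWithSpark/ProcessWithSpark_run.sh
```

The `set -e` ensures the Spark job never starts if the file-staging job fails, which is the main point of orchestrating the two job types rather than running them independently.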
I have added a link to the latest Talend documentation on converting a Standard job to a Big Data Batch job for your quick reference.
https://help.talend.com/reader/FnHYY1jWCvZe5NolmUNMdQ/ExViCBV308c7vgcB3vtpfw
Please refer to the help document for your Talend version; the steps will be similar to those described in the link above.
If the answer has helped you, could you please mark the topic as resolved? Kudos are also welcome 🙂
Warm Regards,
Nikhil Thampi
It is because a Big Data Batch job (mostly) has its own set of components. However, some Standard job components can still be used when creating Batch jobs. Take a look at the URL below:
https://help.talend.com/reader/3JJwnKmjiy4~HSJfqeFJfA/kca93OIcIGR0gDfvi4JylA