Anonymous
Not applicable

Trying to convert a standard job to a big data batch job, and getting this?

I'm trying to convert a standard job to a big data batch job and get this error. Can anyone explain this to me? 

[screenshot of the conversion error message]

 

I also have a context load job for the standard job that I'm converting. Any idea what I should do with that, since I cannot convert it to a big data batch job?

1 Reply
Anonymous
Not applicable
Author

Hi,

 

You cannot convert the entire logic of a Standard job to a Big Data job. For example, if you want to push a file to HDFS using tHDFSPut, you will have to keep that step in a DI job. So when you migrate the flow, move the logic that uses such components into a separate DI job, and the best approach is to orchestrate the DI and Big Data jobs together to achieve your business goal.
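
To make the split concrete: what tHDFSPut does is essentially a client-side upload into HDFS. A rough sketch in plain Java using the Hadoop FileSystem API (the NameNode URI, user and paths below are placeholders, not anything from your actual job) would be:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsPutSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder NameNode URI and user; in Talend these would come from the HDFS metadata.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");

        try (FileSystem fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf, "hdfsuser")) {
            // Copy a local file into HDFS: a plain client-side upload with no Spark involved,
            // which is why this step belongs in a DI job.
            fs.copyFromLocalFile(new Path("/tmp/input.csv"),
                                 new Path("/user/hdfsuser/input/input.csv"));
        }
    }
}

Because this is just a file transfer from the client machine, there is nothing for Spark to parallelize, so this piece stays in the DI job.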

 

For example, there is no tHDFSConnection in a Big Data job; you will have to replace it with tHDFSConfiguration. If you set the connection details through repository metadata, it is easy to delete the first component and drop the second one in its place.
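
In practice, tHDFSConfiguration just hands those same connection details to the Spark job's Hadoop configuration. A minimal sketch of what that amounts to in plain Spark (placeholder NameNode URI and path, assuming the job is submitted through spark-submit) looks like this:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class HdfsConfigSketch {
    public static void main(String[] args) {
        // Placeholder NameNode URI; in Talend this would come from the repository
        // metadata read by tHDFSConfiguration. Any "spark.hadoop.*" setting is
        // forwarded into the Hadoop Configuration used by the Spark job.
        SparkSession spark = SparkSession.builder()
                .appName("hdfs-config-sketch")
                .config("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020")
                .getOrCreate();

        // With the HDFS defaults in place, the Spark job reads and writes inside HDFS.
        Dataset<Row> df = spark.read().option("header", "true")
                .csv("hdfs://namenode:8020/user/hdfsuser/input/input.csv");
        df.show(5);

        spark.stop();
    }
}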

 

The situation is similar with other components. The point of Big Data jobs is to use the capabilities of the Spark framework when you are processing data inside the HDFS layer. But if you just want to delete a file with tFileDelete, Spark cannot speed that up, so that step should stay in the DI flow.
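
For instance, an HDFS file delete is a single NameNode metadata operation. A hedged sketch of what that call amounts to with the Hadoop FileSystem API (again with placeholder host, user and path):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsDeleteSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");   // placeholder NameNode

        try (FileSystem fs = FileSystem.get(new URI("hdfs://namenode:8020"), conf, "hdfsuser")) {
            // The delete is one NameNode metadata call; running it on Spark
            // executors would only add overhead, so keep it in the DI flow.
            boolean deleted = fs.delete(new Path("/user/hdfsuser/input/input.csv"), false);
            System.out.println("deleted: " + deleted);
        }
    }
}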

 

Warm Regards,

Nikhil Thampi