Anonymous

How to use all Standard job components in a Big Data Spark job

Hi All,

 

My requirement is to read data from Hive, look the records up against the result set of an Oracle stored procedure, and load the output into an Oracle target database. I was able to build this as a Standard job in Talend Big Data, but I want to implement it as a Spark batch job to improve performance, and not all of the required components are available in a Big Data batch job. The alternative would be to call the Standard job from within the Big Data batch job, but how would that behave under the Spark configuration: would it execute in Spark mode or in a separate JVM? Either way, are there other possibilities for implementing this kind of scenario?
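For concreteness, here is a minimal PySpark sketch of the equivalent pipeline outside Talend, just to illustrate the data flow being asked about. Everything in it is a hypothetical placeholder (the database, table, and view names, the JDBC URL, and the credentials are not from this thread), and it assumes the stored procedure's result has been exposed as a queryable view, since Spark's JDBC reader cannot call a stored procedure directly.

# Minimal PySpark sketch: Hive source, Oracle lookup over JDBC, Oracle target.
# All names (source_db.source_table, lookup_result_view, target_table, the
# JDBC URL, and the credentials) are hypothetical placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive_oracle_lookup")
    .enableHiveSupport()   # lets spark.sql() read tables from the Hive metastore
    .getOrCreate()
)

# 1. Read the source rows from Hive.
hive_df = spark.sql("SELECT id, amount FROM source_db.source_table")

# 2. Read the lookup data from Oracle over JDBC. Spark cannot invoke a stored
#    procedure here, so the procedure's result is assumed to be reachable as a
#    view (lookup_result_view) that a plain SELECT can query.
oracle_url = "jdbc:oracle:thin:@//oracle-host:1521/ORCL"
lookup_df = (
    spark.read.format("jdbc")
    .option("url", oracle_url)
    .option("dbtable", "(SELECT id AS lkp_id, label FROM lookup_result_view) lkp")
    .option("user", "app_user")
    .option("password", "secret")
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

# 3. Perform the lookup as a join, then write the result to the Oracle target.
result_df = hive_df.join(lookup_df, hive_df.id == lookup_df.lkp_id, "left")
(
    result_df.write.format("jdbc")
    .option("url", oracle_url)
    .option("dbtable", "target_table")
    .option("user", "app_user")
    .option("password", "secret")
    .option("driver", "oracle.jdbc.OracleDriver")
    .mode("append")
    .save()
)

Note that the Oracle JDBC driver jar has to be visible to the Spark executors (for example via spark-submit --jars) for the JDBC read and write above to run.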

 

Any help would be greatly appreciated.

 

Thanks in advance,

Soujanya

1 Reply
Anonymous

Hello,

Could you please let us know if this documentation helps?

https://help.talend.com/reader/3JJwnKmjiy4~HSJfqeFJfA/VJ5n2R8WmYR2abJfSKsv8A 

Best regards

Sabrina