mshafeeq
Contributor III

Connect to HDFS and Hive using the Spark framework

Hello,

I'm asking if there is any way to run a Spark Job (Big Data Batch) containing the tHDFSPut, tHiveCreateTable and tHiveLoad components?

The same question applies to the Impala components.

Thanks.

1 Solution

Accepted Solutions
Anonymous
Not applicable

Hello,

You are able to configure the tHDFSConfiguration, tHiveInput, tHiveOutput and tHiveConfiguration components to run in the Spark Batch Job framework.

tHDFSPut, tHiveCreateTable and tHiveLoad run only in the Standard Job framework.
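
If it helps to see what a Spark Batch Job does at the Spark level, here is a minimal sketch in plain Spark Java code (not Talend-generated code) of the equivalent of tHDFSConfiguration plus tHiveOutput: reading a file from HDFS and writing it into a Hive table. The HDFS path, database name and table name are placeholder assumptions for illustration only.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class HdfsToHiveBatch {
    public static void main(String[] args) {
        // Hive support is required so that saveAsTable writes to the Hive metastore
        SparkSession spark = SparkSession.builder()
                .appName("hdfs-to-hive-batch")
                .enableHiveSupport()
                .getOrCreate();

        // Roughly what tHDFSConfiguration + a file input component provide:
        // read a delimited file directly from HDFS (placeholder path)
        Dataset<Row> input = spark.read()
                .option("header", "true")
                .csv("hdfs:///user/example/input/customers.csv");

        // Roughly what tHiveOutput provides: create/overwrite a Hive table
        // with the dataset (placeholder database and table names)
        input.write()
                .mode(SaveMode.Overwrite)
                .saveAsTable("example_db.customers");

        spark.stop();
    }
}

In a Talend Spark Batch Job you do not write this code yourself; the Studio generates the equivalent Spark code from the components you drop on the canvas.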

Regarding the Impala components, please refer to this online component reference: TalendHelpCenter:ImpalaComponent

The Impala components also run in the Standard Job framework, not in Spark Batch Jobs.

 

Best regards

Sabrina
