MyrtoD
Contributor

Load Data to Hive

Hi,

 

I am trying to find out the best practice for reading a file, transforming some of its values, and then loading the transformed data into a Hive table. My problem is that the available components only allow loading an entire file into Hive, not row-by-row data loading. Any ideas on how to do this?

 

The only solution I have found so far is to store the output file in HDFS and then create a job that imports the HDFS file into Hive. However, this seems too slow. Additionally, there used to be a tHiveOutput component that would be perfect for what I am trying to do, but it is no longer available in Standard Jobs. Any ideas?
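
For reference, the HDFS-then-Hive import I describe comes down to a statement along these lines (database, table, and path names are just placeholders), which a component such as tHiveRow could execute once the transformed file is in HDFS:

-- Assumption: the transformed file has already been written to this HDFS path
LOAD DATA INPATH '/user/myrto/staging/transformed_output.csv'
OVERWRITE INTO TABLE my_database.my_target_table;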

 

Thanks,

Myrto

2 Replies
Anonymous
Not applicable

Hello,

tHiveOutput only exists in Spark Jobs; it is not a standard DI component.

For Standard Jobs there is the tHiveCreateTable component, and you can use the tHDFSPut component to load large-scale files into HDFS.
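
As an illustration only, the kind of table definition tHiveCreateTable sets up looks roughly like the following sketch (column names, delimiter, and HDFS location are placeholders); if the table is declared EXTERNAL over the directory that tHDFSPut writes to, the copied file becomes queryable without a separate load step:

-- Assumption: tHDFSPut copies the delimited file into /user/myrto/staging/
CREATE EXTERNAL TABLE IF NOT EXISTS my_database.my_target_table (
  id INT,
  name STRING,
  amount DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ';'
STORED AS TEXTFILE
LOCATION '/user/myrto/staging/';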

Best regards

Sabrina
