
Anonymous
Not applicable

[resolved] How can I export tables from MySQL to Hive/Hadoop?

How can I export tables from MySQL to Hive/Hadoop? Which components will help — tHiveConnection, tHiveRow, or tHiveClose?
Thank you for your replies.
1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

Please note that a tHiveInput component is planned for release in 5.0.


5 Replies
Anonymous
Not applicable
Author

Hi,
Absolutely, you can combine a tMySQLInput and a tHiveRow to do that!
It would look like this:
tHiveConnection
|
|
v
tMySQLInput ----> tHiveRow
|
|
v
tHiveClose
_AnonymousUser
Specialist III

Could you provide more detail on how to use the tHiveRow component here? You can't do a single-row insert in Hive/HQL.
Thanks.
---Paul
Anonymous
Not applicable
Author

Could you provide more detail on how to use the tHiveRow component here? You can't do a single-row insert in Hive/HQL.
Thanks.
---Paul

vgalopin is right, but this method is inefficient: the data is imported row by row. Instead, I recommend using a "load ... overwrite" command in tHiveRow's query, like this: "load data local inpath '"+context.hive_file_dir+"t_test.txt' overwrite into table t_test"
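A minimal sketch of the bulk-load approach above, assuming you first dump the MySQL rows to a local file in Hive's default text layout (tab-delimited, \N for NULL) and then load the whole file into the Hive table with one statement. The file path, table name, and sample rows are illustrative; in the Talend job this corresponds to tMySQLInput feeding the file and tHiveRow running the generated statement.

```python
def write_hive_text_file(rows, path):
    """Write rows as tab-separated text, Hive's default storage format."""
    with open(path, "w", encoding="utf-8") as f:
        for row in rows:
            # Hive's text serde represents NULL as \N
            f.write("\t".join("\\N" if v is None else str(v) for v in row))
            f.write("\n")

def load_statement(path, table):
    """Build the HiveQL that replaces the table contents with the file."""
    return f"load data local inpath '{path}' overwrite into table {table}"

rows = [(1, "alice"), (2, None)]          # stand-in for the MySQL extract
write_hive_text_file(rows, "t_test.txt")  # one file, one bulk load
hql = load_statement("t_test.txt", "t_test")
```

One bulk load like this moves the whole extract in a single Hive operation, instead of one statement per source row.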
Anonymous
Not applicable
Author

Using this tool (Talend), I find its Hive/Hadoop support is limited. The free 4.2.3 version only provides three components: tHiveConnection, tHiveRow, and tHiveClose.
Anonymous
Not applicable
Author

Please note that a tHiveInput component is planned for release in 5.0.