tSqoopimport in Java API mode

Hello Friends,

I have a requirement to ingest data from MSSQL into a Hortonworks Hive database, creating the HDFS file and table dynamically, without defining any schema up front. So I planned to use the tSqoopImport component. I am using Java API mode and am able to load the data into an HDFS file, but the problem is that it only works for text and sequence files, not for Avro and Parquet files.

Now my questions are:

1) Is it possible to work with the other file formats as well?

2) Is it possible to change the delimiter of the HDFS text files (by default it uses the "," comma)?

3) Is it possible to load Hive tables using the same tSqoopImport component?
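For reference, the three questions map onto standard Sqoop import options, which tSqoopImport in Java API mode passes through to Sqoop. A hedged sketch of the equivalent command line (the JDBC URL, credentials, and table names here are placeholders, not values from this thread, and availability of Parquet/Avro with a Hive import depends on the Sqoop and Hive versions on the cluster):

```shell
# Sketch of a Sqoop import from MSSQL into Hive, assuming Sqoop 1.4.x
# with the SQL Server JDBC driver on the classpath.
sqoop import \
  --connect "jdbc:sqlserver://mssql-host:1433;databaseName=SourceDb" \
  --username etl_user --password-file /user/etl/.mssql.pwd \
  --table SOURCE_TABLE \
  --as-avrodatafile \            # file format: also --as-parquetfile,
                                 # --as-textfile, --as-sequencefile
  --fields-terminated-by '\t' \  # overrides the default "," delimiter
                                 # (applies to text output)
  --hive-import \                # load the result into Hive
  --create-hive-table \          # create the table if it does not exist
  --hive-table target_db.source_table \
  --target-dir /user/etl/staging/source_table
```

In the tSqoopImport component these correspond to the "File format" setting, the output field delimiter, and the Hive-related options; in Java API mode any option the component does not expose directly can usually be supplied as an additional argument.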

2 Replies

Hello,

Please have a look at this reference: TalendHelpCenter: Which big data formats are supported.

Best regards

Sabrina


Hello,

If you want to load data into Hive in a Big Data Spark Job, please have a look at this example shared on the Talend Help Center:

TalendHelpCenter: Loading the database table data into the Hive internal table

Best regards

Sabrina