Anonymous
Not applicable

Speed Up Processing From tOracleInput to tHDFSOutput

I am running a job that is pulling data from an Oracle DB on a remote server, and I am trying to push that table into HDFS. The best I am getting is 4,100 rows per second, and there is a total of 53 million rows. I have six tables like that.
I have set the JVM options to:
-Xms16G | -Xmx32G
What can I do to increase the performance? At this rate, it will take over 12 hours to load all the data.
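For context, the Oracle JDBC driver fetches only 10 rows per network round trip by default, so on a remote link the round-trip count usually dominates throughput long before heap size does. Below is a minimal plain-JDBC sketch of the fetch-size tuning involved (the URL, credentials and table name are placeholders); as far as I know, the cursor/fetch size option in tOracleInput's advanced settings maps to the same driver setting.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class FetchSizeSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- substitute your own.
        String url = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB";
        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             Statement stmt = conn.createStatement()) {

            // The Oracle driver returns only 10 rows per round trip by default.
            // A larger fetch size means each round trip to the remote server
            // carries a much bigger batch of rows.
            stmt.setFetchSize(10_000);

            long rows = 0;
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM MY_TABLE")) {
                while (rs.next()) {
                    rows++; // row-level processing would happen here
                }
            }
            System.out.println("Read " + rows + " rows");
        }
    }
}
```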
Thanks,

2 Replies
Anonymous
Not applicable
Author

Hi,
Usually, we use the tSqoopImport component to load data into HDFS from a relational database management system (RDBMS) such as MySQL or Oracle.
Please take a look at the component reference: TalendHelpCenter:tSqoopImport.
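In case a concrete example helps, here is a rough sketch of the equivalent Sqoop 1 import driven through its Java API (the connection string, credentials, table and split column are placeholders, and the available options vary by Sqoop version). The --split-by/--num-mappers pair is what gives you parallel extraction instead of a single-threaded read:

```java
import org.apache.sqoop.Sqoop;

public class SqoopImportSketch {
    public static void main(String[] args) {
        // Placeholder values -- replace with your own connection details.
        String[] sqoopArgs = {
            "import",
            "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB",
            "--username", "user",
            "--password", "password",
            "--table", "MY_TABLE",
            "--target-dir", "/data/my_table",
            // Sqoop splits the table on this column and runs one mapper per
            // split, so the rows are pulled from Oracle in parallel.
            "--split-by", "ID",
            "--num-mappers", "8"
        };
        System.exit(Sqoop.runTool(sqoopArgs));
    }
}
```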
Best regards
Sabrina
Anonymous
Not applicable
Author

What if I want to load the data from Oracle into memory first and then do processing on it?
If I use Sqoop, I will have to get the data into HDFS first and then read it back from there, so there would be two I/O operations involved. If I use tOracleInput instead, the data comes into memory, I can process it directly, and then load it into HDFS.

Which approach do you think is better?
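For reference, the single-pass approach described above (read from Oracle, process in memory, write to HDFS with no intermediate landing) boils down to something like the sketch below. It assumes the Hadoop FileSystem API and the Oracle JDBC driver; the namenode address, connection details, column names and tab-delimited output format are all placeholders:

```java
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class OracleToHdfsSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- substitute your own.
        String jdbcUrl = "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB";

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020"); // placeholder

        try (Connection conn = DriverManager.getConnection(jdbcUrl, "user", "password");
             Statement stmt = conn.createStatement();
             FileSystem fs = FileSystem.get(conf);
             FSDataOutputStream out = fs.create(new Path("/data/my_table/part-00000"))) {

            // Large fetch size so the remote Oracle server returns big batches.
            stmt.setFetchSize(10_000);

            try (ResultSet rs = stmt.executeQuery("SELECT ID, NAME, AMOUNT FROM MY_TABLE")) {
                while (rs.next()) {
                    // In-memory processing happens here, row by row, before
                    // the record is written out as delimited text.
                    String line = rs.getLong("ID") + "\t"
                            + rs.getString("NAME") + "\t"
                            + rs.getBigDecimal("AMOUNT") + "\n";
                    out.write(line.getBytes(StandardCharsets.UTF_8));
                }
            }
        }
    }
}
```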