[resolved] Load Oracle Data from one DB to another Oracle DB in Bulk
Hi, I am trying to run a Talend job that takes data from one Oracle DB and outputs it to another Oracle DB. The job currently handles roughly 7M rows, so using tOracleInput to tOracleOutput takes a very long time.
I have seen a couple of components called tOracleOutputBulkExec and tOracleOutputBulk and tried to use them. Firstly, I don't know what the difference is between them, and secondly, they don't do exactly what I want. Both of these components write the data to a file, and then I'm assuming the file is uploaded to the Oracle DB using SQL*Loader.
Is there any way I can transfer 7M rows from one DB to another in Talend VERY quickly?
Also, is there a way I can use Talend to break up the data into chunks and commit each chunk separately?
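For reference, the difference between the components is that tOracleOutputBulk only writes the delimited file, tOracleBulkExec only runs the load, and tOracleOutputBulkExec does both in one step. The load itself is a SQL*Loader call, roughly like the sketch below; the file name, table name, column list, and connection string here are hypothetical, and on a conventional-path load the rows= option is what gives you commits in chunks rather than one huge transaction:

-- bulk_load.ctl: a hypothetical control file of the kind the bulk
-- components generate alongside the data file
LOAD DATA
INFILE 'bulk_load.csv'
APPEND
INTO TABLE TARGET_TABLE
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
(ID, NAME, CREATED_DATE DATE "YYYY-MM-DD", AMOUNT)

sqlldr user/password@mydb control=bulk_load.ctl rows=100000 log=bulk_load.log

With rows=100000, SQL*Loader commits after each bind array of 100,000 rows instead of holding all 7M rows in a single transaction.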
Hi
In fact, using tOracleBulkExec and tOracleOutputBulk is an efficient way to insert data:
tOracleInput --> tOracleOutputBulk
|
OnSubjobOk
|
tOracleBulkExec
Could you show me a screenshot of your job? This is a question of optimization.
Regards,
Pedro
Hi
No. This is just an example. In fact, when records reach 7M rows, the load can't be VERY quick. You have to find the bottleneck of your job and optimize it step by step.
Regards,
Pedro
Hello,
I need to optimize a transaction of more than 7 million records in Talend. When I attempt to perform the operation, two problems arise:
1. It consumes all of the space in the Oracle tablespace.
2. The operation takes over half an hour.
I tried solving the problem by optimizing the query and using the tMap "Max buffer size" setting, and I am now trying the tOracleOutputBulk and tOracleBulkExec components, but I have never used them and I am not sure they will solve my problem.
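If the space that fills up is the undo tablespace, that is the expected symptom of inserting 7 million rows in a single transaction: undo for the whole load accumulates until the commit. Committing in intervals (the rows= option in the sketch above, or the "Commit every" setting in tOracleOutput's advanced settings) keeps the undo bounded, and a direct-path load generates minimal undo in the first place. A hypothetical direct-path invocation, reusing the control file sketched earlier:

sqlldr user/password@mydb control=bulk_load.ctl direct=true log=bulk_load.log

Note that direct-path loads take an exclusive lock on the table and bypass triggers and some constraint checking, so check the SQL*Loader documentation before relying on this.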