Anonymous
Not applicable

tOracleOutput hangs while inserting data into Oracle DB

I am using Talend Open Studio 5.3.0. My requirement is to read data from a CSV file and load it into an Oracle DB. I used the following components:
tFileInputDelimited to read the data from the CSV file
tMap to map the CSV schema to the Oracle table schema
tOracleOutput to load the data into the Oracle table.

tFileInputDelimited --> tMap --> tOracleOutput
When I tried with a smaller number of rows (e.g. 20,000), the data loaded into the DB properly. When I ran the same job with more than 100,000 (one lakh) rows, the job hung exactly after 40,000 insertions, i.e. no error and no update on the number of rows processed.
I tried different combinations of Batch Size and Commit values, but was not able to resolve the issue.
However, when I tried the same job with the addition of tOracleConnection and tOracleCommit, it SUCCEEDED. But it took 2 hours to load one lakh (100,000) rows, at a processing speed of about 17 rows/sec.
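(For context, my understanding is that the Batch Size and Commit settings on tOracleOutput amount to roughly the following JDBC pattern; this is only a sketch with made-up table, column and connection names, not my actual job:)

    // Sketch of batched inserts with a periodic commit, roughly what tOracleOutput does via JDBC.
    // The URL, credentials, table and column names are placeholders.
    import java.sql.*;

    public class CsvToOracleSketch {
        public static void main(String[] args) throws Exception {
            String url = "jdbc:oracle:thin:@//dbhost:1521/ORCL"; // placeholder
            try (Connection con = DriverManager.getConnection(url, "user", "password")) {
                con.setAutoCommit(false); // commit manually, like tOracleCommit / the Commit setting
                String sql = "INSERT INTO TARGET_TABLE (COL1, COL2) VALUES (?, ?)"; // placeholder
                try (PreparedStatement ps = con.prepareStatement(sql)) {
                    int batchSize = 1000;    // "Batch Size": rows sent to Oracle per executeBatch()
                    int commitEvery = 10000; // commit interval; a multiple of batchSize here
                    int count = 0;
                    for (String[] row : readCsv()) { // stand-in for tFileInputDelimited + tMap
                        ps.setString(1, row[0]);
                        ps.setString(2, row[1]);
                        ps.addBatch();
                        count++;
                        if (count % batchSize == 0) {
                            ps.executeBatch(); // push the batch to the server
                        }
                        if (count % commitEvery == 0) {
                            con.commit();      // keep the open transaction small
                        }
                    }
                    ps.executeBatch(); // flush any remaining rows
                    con.commit();
                }
            }
        }

        // Placeholder for the CSV reading done by tFileInputDelimited.
        static java.util.List<String[]> readCsv() {
            return java.util.Collections.emptyList();
        }
    }

The Batch Size controls how many rows are sent per round trip, and the Commit value controls how long a transaction stays open before it is committed.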
Can anyone help me resolve the above-mentioned issue?
4 Replies
Anonymous
Not applicable
Author

Hi,
Perhaps this is a performance issue related to your large volume of data.
Please refer to the KB articles TalendHelpCenter: OutOfMemory and TalendHelpCenter: Storing the lookup flow of tMap on the disk to see if they help with your current issue.
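(For reference, the memory increase described there is set through the job's JVM arguments, in the Run tab > Advanced settings of Talend Open Studio; the values below are only an illustration and depend on your machine:)

    -Xms256M
    -Xmx1024M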
Best regards
Sabrina
Anonymous
Not applicable
Author

Thanks Sabrina.
I tried the configurations specified in those articles, but the issue still remains (after 30,000 or 40,000 records the job hangs without any exception or error message).
I also tried the job that uses tOracleConnection; the processing speed still hasn't crossed 15 rows/sec.
Please suggest.
Anonymous
Not applicable
Author

Hi,
Could you please post a screenshot of your current job on the forum so that I can understand your situation more precisely?
Best regards
Sabrina
Anonymous
Not applicable
Author

>> I tried different combinations of Batch Size and Commit values, but was not able to resolve the issue.
I did something similar, CSV --> Oracle:
in tOracleOutput I set a commit interval of 100 rows
and checked how the job was progressing with a SELECT COUNT(*) on the target table.
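(For illustration, run from a second session; TARGET_TABLE is a placeholder, and only committed rows show up in the count:)

    SELECT COUNT(*) FROM TARGET_TABLE;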
I do believe your problem is a BIG transaction on the Oracle server.
About the speed, it depends:
on a remote Talend server I also saw such poor speed (no way to resolve it),
but on a local PC the speed was OK.
The only problem with really big CSV files was a Java memory error, resolved in the job's advanced settings by increasing the memory.
Try SQL*Loader, the quickest way for Oracle: make a CTL file,
and if it works OK then use the tOracleBulkExec component for loading into a landing table,
and then, if data transformation (I/U/M) is needed, do it on the server side.
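For illustration only, a minimal SQL*Loader control file for such a landing table could look like the following (file, table and column names are placeholders; adjust them to your own schema):

    -- load_landing.ctl (hypothetical name)
    LOAD DATA
    INFILE 'input.csv'
    APPEND
    INTO TABLE LANDING_TABLE
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (COL1, COL2, COL3)

and run it with something like (connection details are placeholders; direct=true enables the fast direct path load):

    sqlldr userid=user/password@ORCL control=load_landing.ctl log=load_landing.log direct=true

The transformation from the landing table into the target table can then run as plain SQL on the database server.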