
YunusEmre
Partner - Contributor II

Talend tDBBulkExec is not transferring all data

Hi everyone,

I'm working on TOS. I have data with about 26 million rows after the tMap process, and I want to insert all of it into Oracle with tDBOutputBulk and tDBBulkExec. I run tDBBulkExec after tDBOutputBulk. tDBOutputBulk transfers all of the data to the CSV file, but the tDBBulkExec step isn't transferring all of it to the Oracle database — it loads only about 11 million of the 26 million rows.

[Screenshot attachment: YunusEmre_0-1742938045928.png]

Do you have any idea?

Thanks a lot

Labels (2)
1 Solution

Accepted Solutions
mchapman
Employee

The bulk loader may have issues with the data in the file (a value exceeding a column length, for example), so check the Oracle bulk loader's logs to see whether it is rejecting records.
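For Oracle, Talend's bulk-exec components hand the file to SQL*Loader, which reports rejected rows in its log (and writes them to a .bad file). As a sketch of what to grep for — the log below is a fabricated example mirroring this job's numbers; point the commands at your job's real log file instead:

```shell
# Fabricated SQL*Loader log, for illustration only
cat > bulk_load.log <<'EOF'
Record 12: Rejected - Error on table MYTABLE, column DESCRIPTION.
ORA-12899: value too large for column "MYTABLE"."DESCRIPTION" (actual: 310, maximum: 255)
Total logical records read:      26000000
Total logical records rejected:  15000000
EOF

# Count per-record rejection messages
grep -c "Rejected" bulk_load.log

# Show the summary totals SQL*Loader prints at the end of the log
grep "Total logical records" bulk_load.log
```

If the "rejected" total accounts for the missing rows, the individual `Record N: Rejected` lines (and the accompanying ORA- error) tell you which column and constraint to fix.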

View solution in original post

2 Replies
quentin-vigne
Partner - Creator II

Hi @YunusEmre 

If you're not doing any step between the tDBOutputBulk and the tDBBulkExec, maybe you could try using a tDBOutputBulkExec (it merges the two components into one).

 

I think your problem here is one of these:

- Your tDBBulkExec is triggered before the full file has been written,

or

- the tDBBulkExec doesn't have time to finish writing data to your database before the subjob ends.
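The first timing problem above can be reproduced outside Talend: if a loader reads the CSV before the writer has flushed and closed it, it sees only the rows already on disk. A minimal Python sketch (file name and row count are illustrative, not from the original job):

```python
import csv

path = "bulk_out.csv"

# Writer with a large buffer: rows sit in memory until flush/close,
# like a tDBOutputBulk that hasn't finished its subjob yet.
f = open(path, "w", newline="", buffering=1024 * 1024)
w = csv.writer(f)
for i in range(50_000):
    w.writerow([i, f"row-{i}"])

# A "bulk exec" that fires too early sees only the flushed portion
with open(path, newline="") as r:
    early_rows = sum(1 for _ in csv.reader(r))

f.close()  # the writing subjob completes here

with open(path, newline="") as r:
    final_rows = sum(1 for _ in csv.reader(r))

print(early_rows, final_rows)  # the early read sees fewer rows than the final file
```

In Talend the fix is to make sure the two components are connected with an OnSubjobOk trigger (or merged into one tDBOutputBulkExec), so the load only starts once the file is complete.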

 

You could also try adding a tSleep to "wait" and see if more rows arrive.

 

If it's still not working and you want more help, could you share some screenshots of your component setup? That way we can check whether the file and action-on-table parameters are correct.

 

- Quentin
