Hi everyone,
I'm working on TOS. I have a dataset with about 26 million rows after the tMap process. I want to insert all of it into Oracle with tDBOutputBulk and tDBBulkExec, running tDBBulkExec after tDBOutputBulk. tDBOutputBulk writes all the data to the CSV file, but tDBBulkExec doesn't transfer it all to the Oracle database: it loads only about 11 million of the 26 million rows.
Do you have any idea?
Thanks a lot
The bulk loader may have issues with the data in the file (field lengths, for example). Check the logs of the Oracle bulk loader (SQL*Loader) to see if it is rejecting records.
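As a concrete way to do that check, here's a sketch of grepping the SQL*Loader log for rejections. The file name `bulk_load.log` and the table/column names are placeholders; the actual log path depends on your tDBBulkExec settings, and the first block only fabricates a sample log so the commands can be run as-is:

```shell
# Simulated excerpt of a SQL*Loader log; in a real run you would grep the
# log file that tDBBulkExec / sqlldr actually produced.
cat > bulk_load.log <<'EOF'
Record 12000001: Rejected - Error on table MY_TABLE, column DESCRIPTION.
ORA-12899: value too large for column
Record 12000002: Rejected - Error on table MY_TABLE, column DESCRIPTION.
ORA-12899: value too large for column
EOF

# Count how many records were rejected:
grep -c "Rejected" bulk_load.log

# Summarise the Oracle error codes behind the rejections:
grep -o "ORA-[0-9]*" bulk_load.log | sort | uniq -c
```

The rejected rows themselves end up in the corresponding `.bad` file, so `wc -l` on that file tells you exactly how many rows SQL*Loader dropped, which you can compare against the ~15 million missing rows.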
Hi @YunusEmre
If you're not doing any step between the tDBOutputBulk and the tDBBulkExec, maybe you could try using a tDBOutputBulkExec (the two components merged together).
I think your problem here is one of these:
- Your tDBBulkExec is triggered before the full file has been written,
or
- the tDBBulkExec doesn't have time to finish writing data to your database before the next subjob is triggered.
You could also try adding a tSleep to "wait" and see if you get more rows.
If it's still not working and you want more help, can you share some screenshots of your components' setup? That way we can check whether the file and the "action on table" settings have the right parameters.
- Quentin