Hello,
I have created and deployed a job in production that loads more than 10,000 records into the target table on a daily basis. I used the tDBOutput component because my target is a Redshift table. I did not use any connection component to define the connections; I defined all the connections in context variables only.
For the past 2 days I have observed that not all of the records are getting loaded into the target table. If my source has 13,250 records, only 10,000 records are inserted, as Advanced settings ---> Commit every ---> has been set to 10,000 records. When I decrease Commit every to 100 records, it works fine.
Is this some kind of bug with Talend Studio v7.3? It was working fine before, but recently it stops at 10,000 records. What is the cause, and what is the solution?
Hello @Sushant Kapoor ,
Please note that in the "Commit every" field, you enter the number of rows to process before committing batches of rows together into the database. This option ensures transaction quality (but not rollback) and, above all, better performance at execution.
Help Documentation: https://help.talend.com/r/zi0QzwYSObsY0xZS0lJc3Q/nxfVGZYPHw7hwKN1v4m_QQ
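To illustrate the concept (this is a hedged sketch of how a "commit every N rows" setting generally works, not Talend's actual internals; it uses Python's sqlite3 as a stand-in for Redshift), note that rows beyond the last full batch only reach the table if a final partial commit is issued -- which matches the symptom of 13,250 source rows producing exactly 10,000 committed rows:

```python
# Sketch of "commit every N rows" batching, assuming a generic DB-API
# connection. sqlite3 stands in for the real target database.
import sqlite3

def load_rows(conn, rows, commit_every=10000):
    cur = conn.cursor()
    pending = 0
    for row in rows:
        cur.execute("INSERT INTO target (id) VALUES (?)", (row,))
        pending += 1
        if pending >= commit_every:
            conn.commit()  # flush one full batch of commit_every rows
            pending = 0
    if pending:
        conn.commit()      # final partial batch; if this step were skipped,
                           # only the full batches (e.g. 10,000 of 13,250
                           # rows) would ever be committed

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER)")
load_rows(conn, range(13250), commit_every=10000)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # 13250
```

If the loader ever fails or exits between the last full commit and the final one, the trailing rows are rolled back, which could explain an exact cut-off at the batch-size boundary.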
Regards,
Vaishnavi
@Vaishnavi Khandelwal: I am aware of this property. It's just not committing the records after 10,000, so I had to reduce Commit every to 100, and it works fine now. I was just wondering whether this is some kind of product bug or whether it simply behaves this way. All the rest of my jobs are working fine with Commit every set to 10,000 rows.
Hello friends,
This issue keeps happening. Is there a way to find out why the records do not commit after 10,000? This behavior is very inconsistent and erratic. Has anyone else faced the same issue? Please help.