Hi - I'm in the process of removing a paid package from a Salesforce environment. As part of that process, I've created a new custom object and need to transfer the data from the existing, paid-for object into the new object we've created ourselves.
There are around 1.3m records that need to be created.
I'm able to process all 1.3m records right up to tSalesforceOutputBulkExec. The component creates the bulk .csv file with all 1.3m rows, but the logs show that only 330,275 rows are processed.
I'm pretty new to Talend (this is my first Talend job) and I'm struggling to understand why the remaining records aren't being processed.
Any help would be very much appreciated
Thanks
Hey - Thanks so much for the quick reply.
I'm running the job in a full-copy Sandbox, so I should have no limit on the number of records I can create as long as it's within the data storage/API limits. There's about 10GB of available space in the SF env, and the records being created should come to about 2.6GB of data storage.
I've just re-run the job with about 10% (144,540) of the required records and the same thing has happened: only about 30% of the rows make it through to the logs (screenshot attached) and the other 70% go "missing". That's a similar ratio to when I ran with 1.3m records.
I've also provided a shot of the advanced settings on the tSalesforceOutputBulkExec component.
Also is there a way to track the progress of the tSalesforceOutputBulkExec job other than waiting for the logs to be created at the end? Like a real-time view of how many records are being created in the env per second?
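(In case it helps anyone reading later: besides Setup > Bulk Data Load Jobs, the Bulk API exposes a job-status resource you can poll while the job runs. A minimal sketch of parsing the status XML it returns, assuming the standard Bulk API 1.0 `jobInfo` response fields - verify the field names against your org's API version:)

```python
# Sketch: parse the job-status XML that Bulk API 1.0 returns from
# GET /services/async/<version>/job/<jobId>. Field names are assumed
# from the standard AsyncApiJob response; verify against your API version.
import xml.etree.ElementTree as ET

NS = "{http://www.force.com/2009/06/asyncapi/dataload}"

def job_progress(status_xml: str) -> dict:
    """Extract the record/batch counters from a job-status response."""
    root = ET.fromstring(status_xml)
    fields = ("numberBatchesCompleted", "numberBatchesFailed",
              "numberBatchesTotal", "numberRecordsProcessed",
              "numberRecordsFailed")
    return {f: int(root.findtext(NS + f, default="0")) for f in fields}

# Example response trimmed to the counters we care about (hypothetical values):
sample = """<?xml version="1.0" encoding="UTF-8"?>
<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">
  <numberBatchesCompleted>13</numberBatchesCompleted>
  <numberBatchesFailed>1</numberBatchesFailed>
  <numberBatchesTotal>15</numberBatchesTotal>
  <numberRecordsProcessed>144540</numberRecordsProcessed>
  <numberRecordsFailed>0</numberRecordsFailed>
</jobInfo>"""

print(job_progress(sample))
```

Polling that resource every few seconds from a small script would give a near-real-time view of numberRecordsProcessed instead of waiting for Talend's logs at the end.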
Thanks again for the help.
M
Thanks for directing me to the bulk data load section in SF.
From there I was able to see that the missing 70% of records were in batches that failed because they were unable to obtain exclusive access to a particular record.
TooManyLockFailure : Too many lock failure 200 Trying again later.;
It tried a number of times and then timed out because it was never able to get exclusive access.
Now I just need to figure out how to avoid the locking error above. I have too many records to run with the concurrency mode set to 'Serial' - I'll start a new thread for that if I can't find an answer elsewhere in the forum.
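(For anyone hitting the same error: the advice I've seen for lock failures on parallel bulk loads is to sort the load file by the parent/master reference field, so rows contending for the same parent record land in the same batch instead of locking each other across parallel batches. A minimal sketch of that pre-sort - the column name ParentId__c is just a placeholder for whatever lookup/master-detail field is causing the contention:)

```python
# Sketch: sort a bulk-load CSV by its parent reference column so that rows
# sharing a parent end up in the same Bulk API batch. "ParentId__c" is a
# placeholder column name; substitute the lookup/master-detail field on
# your object that is causing the lock contention.
import csv
import io

def sort_csv_by_parent(csv_text: str, parent_col: str = "ParentId__c") -> str:
    reader = csv.DictReader(io.StringIO(csv_text))
    rows = sorted(reader, key=lambda r: r[parent_col])
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=reader.fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Tiny illustration with made-up IDs: rows are regrouped by parent.
raw = "Name,ParentId__c\nrec1,P2\nrec2,P1\nrec3,P2\nrec4,P1\n"
print(sort_csv_by_parent(raw))
```

For a 1.3m-row file you'd sort the real CSV on disk (or sort upstream in the Talend job with a tSortRow before the output component) rather than in memory, but the idea is the same.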
Thanks again for your help