Anonymous

Problem with tSalesforceOutput main|reject flow when Commit level > 1

Hello, 

I have an issue with tSalesforceOutput in Talend 6.4.1.

While using tSalesforceOutput with a commit level set above 1, the first batch (successfully updated in the Salesforce org) isn't logged in the output files.

[Screenshot] Commit level: 200 | Records updated: 2513

 

As you can see, 199 records (the first ones to be updated) are missing from the output, but a quick check in Salesforce shows that those records were successfully updated.

 

Could you point me in the right direction to resolve this issue, or at least tell me which version of Talend can be used without this error?

Thanks.

 

15 Replies
Anonymous

Hi,

Have you tried clearing the "Extend Insert" check box to see if the rejected rows can be logged in your Excel file? Did you get NULL values?

 

Best regards

Sabrina

Anonymous

Hi Sabrina, 

 

I've tried it; all the records are successfully updated and logged in the Excel file (no NULL values), but doing so takes nearly 9 minutes, which is way too long.
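(For scale: 2513 records in roughly 9 minutes works out to about 0.2 seconds per record, which would be consistent with one API round-trip per record once "Extend Insert" is cleared.)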

 

Best regards, 

Nicolas

 


extendInsertUnchecked.PNG
Anonymous

Hi, 

I've been trying to resolve this problem for 15 days, but I am still stuck at the same point. Have you found something?

 

Thanks,

Nicolas

Anonymous

Hello,

 

Try adding a tLogCatcher component to retrieve errors.

You can write the information to a file or send it by email. I think it will help you.

You can also add a tLogRow component between the tSalesforceOutput component and the error file.

 

Best regards,

 

Marine

Anonymous

Hi,

Please make sure that you have selected the 'Cease on Error' check box in the advanced settings of tSalesforceOutput to stop the execution of the Job when an error occurs.

Best regards

Sabrina

Anonymous


Hi,

@MarineTiphon: I tried your method. It successfully logged all the records in the Excel file (which is a good thing), but it didn't separate the bad records from the good ones. It makes me wonder whether the error only affects the Reject flow.

 

@xdshi: I selected the 'Cease on error' check box. It stops the job at the first bad record, but it doesn't tell me exactly at which row the error occurs, although it does give me the error information.


This gets closer to what I expect from this component, but I would like (if possible) the import job to go through all of my records and then separate all the successes/errors into two distinct files.

 

Best regards 

 

Nicolas


ceaseOnErrorChecked.PNG
errorReturned.PNG
tLogCatcher_CeaseOnErrorUnchecked.PNG
TRF
Creator III

Hi,

@Moe, the following will not solve your case but may help.

 

It sounds like a bug, or at least strange behaviour, in the tSalesforceOutput component or the Salesforce API.

I ran the following tests with 253 records (plus 1 header line) from an input file, with 1 record expected to be rejected (due to a blank name).

Here are the results:

  1. "Extend Insert" checked and "Commit level 200" : 54 record into the success flow and 54 into the error flow. Always the same record in this error flow but not the expected record and always the same error message! 
    252 records inserted into Salesforce as expected.
    (200 - 1) + 54 = 253 ==> which corresponds to the number of input records.
  2. "Extend Insert" checked and "Commit level 50" : 204 record into the success flow and 50 into the error flow. Always the same record in this error flow but not the expected record and always the same error message
    (50 - 1) + 204 = 253 ==> which corresponds to the number of input records.
  3. "Extend Insert" checked and "Commit level 1" :  252 record to the success flow and 1 to the error flow (as expected).
    Records are inserted 1 by 1 so response time is very bad and API consumption is high.
  4. "Extend Insert" unchecked and "Retrieve Id" checked or not : 252 records to the success flow and 1 to the error flow!
    Records are inserted 1 by 1 so response time is very bad and API consumption is high.
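
The success-flow counts in tests 1 and 2 follow a single pattern: everything except the first ("Commit level" - 1) records reaches the flows. A minimal sketch of that arithmetic (plain Java, using the numbers from the tests above; why the first batch comes up one short is explained later in the thread):

// Sketch: the success-flow counts in tests 1 and 2 equal
// totalRecords - (commitLevel - 1), i.e. the records of the
// first batch never reach the main/reject flows.
public class FlowAccounting {
    public static void main(String[] args) {
        int totalRecords = 253;                    // data records in the input file
        for (int commitLevel : new int[] {200, 50}) {
            int neverLogged = commitLevel - 1;     // lost with the first batch
            int successFlow = totalRecords - neverLogged;
            System.out.printf("Commit level %d: %d records in the success flow, %d never logged%n",
                    commitLevel, successFlow, neverLogged);
        }
    }
}

This prints 54 for commit level 200 and 204 for commit level 50, matching the observations above.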

My conclusions:

When "Extend Insert" option is ticked, we know records are sended to the API by batches of n records where n depends on the value of "Commit Level" parameter (refer to Salesforce documentation).

With this option, as soon as an error occurs, the results in the success and reject flows are wrong (except when "Commit Level" = 1), but the operation on the Salesforce side is right.

My question:

Where is this f**k**g bug?

My advice:

If you care about error messages, avoid options 1 and 2.

If you care about response time (with low data volume), choose option 1.

If you care about error messages and response time, consider the Bulk option (it should be reserved for high data volumes, but sometimes...).

 

Feel free to complete or correct if necessary.

TRF
Creator III

@Moe, does this help or not?

Please let us know, and if it does, don't forget to give kudos and accept the solution if it solved your case.

Anonymous

Hello,

 

 

I also faced this problem. What I noticed is that the header of the source file is not removed before the content is split into batches, but after. Consequently, the first batch of data sent to Salesforce contains (commit level - 1) data records. Probably an event is not triggered in this case, because neither the main nor the reject flow continues. This is more obvious if you execute the job with a file that has fewer records than the batch size: you will see the status "Starting" on both flows.

 

Analyzing your first example:

"

  1. "Extend Insert" checked and "Commit level 200" : 54 record into the success flow and 54 into the error flow. Always the same record in this error flow but not the expected record and always the same error message! 
    252 records inserted into Salesforce as expected.
    (200 - 1) + 54 = 253 ==> which corresponds to the number of input records.

"

Commit level 200

Input file total number of records: 254 = 253 data records + 1 header record;

Batch 1: 200 lines - 1 header record = 199 data records (below the commit level, so the records are not transferred to the success/rejected files);

Batch 2: the 54 remaining data records. Probably there is special behaviour implemented for the last batch, since the total number of records will not always be a multiple of the batch size. However, this behaviour does not apply if the file has ONLY one incomplete batch. (See the sketch below.)
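
To make this accounting concrete, here is a minimal sketch of the suspected behaviour (an assumption inferred from the observations in this thread, not from Talend source code): lines are cut into batches of "Commit level" size before the header is dropped, a batch only feeds the main/reject flows when it holds a full commit level of data records, and the last batch of a multi-batch file gets special end-of-input handling.

// Minimal sketch (assumptions, not Talend source): the input file is cut
// into batches of commitLevel LINES before the header line is dropped.
import java.util.ArrayList;
import java.util.List;

public class BatchSplitSketch {
    public static void main(String[] args) {
        int commitLevel = 200;
        int totalLines = 254;                      // 253 data records + 1 header line

        // Split the raw lines into batches of commitLevel.
        List<Integer> batches = new ArrayList<>();
        for (int start = 0; start < totalLines; start += commitLevel) {
            batches.add(Math.min(commitLevel, totalLines - start));
        }
        // The header falls into batch 1 and is removed AFTER the split,
        // so batch 1 is left one data record short of the commit level.
        batches.set(0, batches.get(0) - 1);

        for (int i = 0; i < batches.size(); i++) {
            boolean last = (i == batches.size() - 1);
            // Suspected trigger condition: a full batch, or the special
            // end-of-input handling of the last batch in a multi-batch file.
            boolean flushed = batches.get(i) >= commitLevel || (last && batches.size() > 1);
            System.out.printf("Batch %d: %d data records -> flows %s%n",
                    i + 1, batches.get(i), flushed ? "triggered" : "NOT triggered");
        }
        // Prints: Batch 1: 199 data records -> flows NOT triggered
        //         Batch 2: 54 data records -> flows triggered
    }
}

The same condition also covers the single-incomplete-batch case: with only one batch below the commit level, nothing is ever flushed, which matches the "Starting" status on both flows.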

 

Please try to execute the job with a file that does not have a header and let me know if it works. For me it did.

 

Have fun,

Florentina