Anonymous
Not applicable

[resolved] tSalesforceBulkExec Exceeded number of records : 10002

Hi all,

I'm currently migrating my Talend jobs from 5.6.2 to 6.2.1 in order to use TLS 1.1 when connecting to Salesforce.

I'm hitting a strange issue on the newer release: this error on the tSalesforceBulkExec - Exceeded number of records : 10002

In the advanced settings of the component, the number of lines to commit is set to 10,000.



But I still get this error:



It works fine on Talend 5.6.2 with the same settings:



So what's going on?

7 Replies
Anonymous
Not applicable
Author

Well, the pictures are not working...

Here is the error message :

Exception in component tSalesforceBulkExec_2
com.sforce.async.CSVReader$CSVParseException: Exceeded number of records : 10002. Number of records should be less than or equal to 10001
    at com.sforce.async.CSVReader.checkRecordExceptions(CSVReader.java:159)
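
For anyone hitting the same message, the numbers make sense once you know the limit: the Salesforce Bulk API caps a CSV batch at 10,000 records, and the "less than or equal to 10001" presumably accounts for the header row. Whatever builds the batches has to split the input so no batch exceeds that. Here is a minimal, generic sketch of such a splitter (my own illustration in plain Java, not Talend's actual component code):

```java
import java.util.ArrayList;
import java.util.List;

public class BatchSplitter {
    // Salesforce Bulk API limit: at most 10,000 records per CSV batch.
    static final int MAX_RECORDS_PER_BATCH = 10_000;

    // Split a list of records into consecutive batches of at most
    // maxPerBatch elements each.
    static <T> List<List<T>> split(List<T> records, int maxPerBatch) {
        List<List<T>> batches = new ArrayList<>();
        for (int i = 0; i < records.size(); i += maxPerBatch) {
            batches.add(records.subList(i, Math.min(i + maxPerBatch, records.size())));
        }
        return batches;
    }

    public static void main(String[] args) {
        // 10,002 records: one record too many for a single batch,
        // exactly the count from the error message above.
        List<Integer> records = new ArrayList<>();
        for (int i = 0; i < 10_002; i++) records.add(i);

        List<List<Integer>> batches = split(records, MAX_RECORDS_PER_BATCH);
        System.out.println(batches.size());        // 2
        System.out.println(batches.get(0).size()); // 10000
        System.out.println(batches.get(1).size()); // 2
    }
}
```

If the component fails to split like this (or counts the header as a record), you end up submitting 10,002 lines in one batch and the `CSVReader` check rejects it, which fits the regression described later in this thread.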
TRF
Champion II
Champion II

Hi,
Strange... Try 6.1.1. I use it and don't have this problem, even with large data files (500,000 records with lines to commit set to 10,000).
Maybe your input file is malformed (containing " in the data or something like that).
Regards,
TRF
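
To illustrate TRF's point about stray quotes: an unescaped `"` inside a CSV field can make a parser swallow a field or record separator, which throws the record boundaries (and therefore the record count) off. RFC 4180-style escaping - wrap the field in quotes and double any embedded quote - avoids that. A quick generic sketch (my own example, not tied to Talend or the Salesforce libraries):

```java
public class CsvQuote {
    // RFC 4180-style quoting: wrap the field in double quotes and
    // double any embedded quote characters, so a downstream CSV
    // parser keeps field and record boundaries straight.
    static String quote(String field) {
        return "\"" + field.replace("\"", "\"\"") + "\"";
    }

    public static void main(String[] args) {
        System.out.println(quote("plain"));      // "plain"
        System.out.println(quote("say \"hi\"")); // "say ""hi"""
    }
}
```

If the input file already contains unescaped quotes, re-exporting it with this kind of quoting is worth trying before anything else.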
Anonymous
Not applicable
Author

Nothing looks wrong in the file, I guess: it only contains Salesforce IDs, two boolean columns and two integer columns 😕
TRF
Champion II
Champion II

Definitely, try 6.1.1
Anonymous
Not applicable
Author

I downloaded the latest version (6.3.0) and it seems to work. Now I have a problem with tFileList and tFileDelete, but it seems to be just a warning.
Thank you for your help @TRF
TRF
Champion II
Champion II

You're welcome.
I'll try 6.3 as soon as possible.
Let us know how it goes with tFileList and tFileDelete.
Regards,
TRF
Anonymous
Not applicable
Author

Yes, this is a known regression in 6.2.0/6.2.1; 6.2.2, 6.3.0 and later have it fixed.