Anonymous
Not applicable

tSalesforceBulkExecute, 65000 row limit?

Hi, I have a job that uses tSalesforceOutputBulk and tSalesforceBulkExec to write data to Salesforce through the Bulk API. I'm using TOS-DI 5.5, and my Salesforce connection uses API version 25. The job reads from a MySQL table with 200k rows, maps the rows to a tSalesforceOutputBulk component that writes them to a CSV file, then triggers a tSalesforceBulkExec component that reads the CSV file and performs the Salesforce inserts. The inserts work correctly, except that only 65,000 of the 200,000 rows are processed. Is this a limitation of Talend? CSV? Java? Or am I missing a setting that would allow all the rows to be processed?
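(Not from the thread, just a suggested diagnostic.) One quick sanity check is to count the data rows in the CSV that tSalesforceOutputBulk produced: if the file itself holds only 65k rows, the truncation happens on the Talend/file side; if it holds all 200k, the cutoff happens when submitting to Salesforce. A minimal sketch, assuming a simple CSV with one header line and no embedded newlines inside quoted fields (class name and file path are hypothetical):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class CsvRowCount {
    // Count data rows in a CSV, excluding the header line.
    // Caveat: assumes no embedded newlines inside quoted fields.
    public static long countDataRows(Path csv) throws IOException {
        try (Stream<String> lines = Files.lines(csv)) {
            return Math.max(0, lines.count() - 1);
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical path; point this at the file tSalesforceOutputBulk wrote.
        Path csv = Paths.get(args.length > 0 ? args[0] : "bulk_out.csv");
        System.out.println("Data rows: " + countDataRows(csv));
    }
}
```

If this reports ~200k rows while Salesforce's monitoring shows only 65k processed, the limit is being hit downstream of the file.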
9 Replies
Anonymous
Not applicable
Author

Hi,
Is there any truncation error printed on the console? Have you tried using the "Rejects" row from tSalesforceBulkExec to see if there are any rejected rows?
Best regards
Sabrina
Anonymous
Not applicable
Author

There is also a Salesforce console that you or your Salesforce admin can log on to, to see what has happened with bulk loads. Sometimes this gives more information than Talend does. As Sabrina said, the Rejects row is the first port of call.
Anonymous
Not applicable
Author

Hi, and thanks so much for the replies. There are no truncation errors in the Talend console. I am outputting the results of the tSalesforceBulkExec component to success and error files. The issue isn't that some rows fail while others succeed; it's that only 65k rows make it to Salesforce at all. The Salesforce bulk data load monitoring tool shows 65k rows being processed (some failing, some successful), and those results are consistent with the tSalesforceBulkExec output I see in Talend. The strange thing is that when I process a file with >200k rows through tSalesforceBulkExec, exactly 65k rows make it into the bulk load job in Salesforce. I'm beginning to think that maybe Salesforce is limiting the data on input, but I wanted to see whether others have seen the same behavior. Thanks, cheers!
Anonymous
Not applicable
Author


I've loaded several million rows to Salesforce with the bulk loader, so if there is a fault, it may be an issue with later versions of Talend.
I believe Talend Enterprise 5.1.1 was, at least, OK.
Anonymous
Not applicable
Author

Hi,
Could you please show us screenshots of your job design? Did you use tMap in your workflow?
Best regards
Sabrina
Anonymous
Not applicable
Author

I'm going to run some more tests and put screenshots together, but the short answer is yes; my job goes like this:
tSalesforceConnection -> tFileInputDelim -> tMap -> tSalesforceOutputBulk -> (OnComponentOk) -> tSalesforceBulkExec -> tFileOutputDelim (success) & tFileOutputDelim (error)
Anonymous
Not applicable
Author

Very possible that you are exceeding the limits and being throttled.
http://www.salesforce.com/us/developer/docs/api_asynch/Content/asynch_api_concepts_limits.htm
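If a limit from that page is the culprit (Bulk API v1 batches are capped per batch, historically at 10,000 records and 10 MB of CSV), one workaround is to split the extract into smaller CSV files and feed each one to tSalesforceBulkExec in turn. This is only a sketch of the splitting step, not the thread's confirmed fix; it assumes a simple CSV with one header line and no embedded newlines, and the class and file names are hypothetical:

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class CsvBatchSplitter {
    // Split a CSV (with a header line) into chunk files holding at most
    // maxRows data rows each; the header is repeated in every chunk.
    public static List<Path> split(Path csv, Path outDir, int maxRows) throws IOException {
        List<Path> chunks = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(csv)) {
            String header = reader.readLine();
            BufferedWriter writer = null;
            int rowsInChunk = 0;
            String line;
            while ((line = reader.readLine()) != null) {
                if (writer == null || rowsInChunk == maxRows) {
                    if (writer != null) writer.close();
                    Path chunk = outDir.resolve("chunk_" + chunks.size() + ".csv");
                    chunks.add(chunk);
                    writer = Files.newBufferedWriter(chunk);
                    writer.write(header);
                    writer.newLine();
                    rowsInChunk = 0;
                }
                writer.write(line);
                writer.newLine();
                rowsInChunk++;
            }
            if (writer != null) writer.close();
        }
        return chunks;
    }
}
```

In a Talend job the same idea could be expressed with a tFlowToIterate/tFileList loop over the chunk files, but checking the Rejects row and the Salesforce monitoring console, as suggested above, should come first.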
Anonymous
Not applicable
Author

Hi plsteffens,
Is there any update on your issue?
Best regards
Sabrina
_AnonymousUser
Specialist III

It would be nice to get an update on this issue.
Regards,
Derek