Anonymous
Not applicable

tSalesforceOutputBulk etc. - why not allow standard CSV files?

I have been trying to process a large number of Salesforce records (~2m) and have done so successfully using the tSalesforceOutputBulkExec component. However, I wanted to break the process down into more manageable chunks (100k records at a time), so it seems I have to prepare the files with tFileOutputDelimited, which provides the automatic file-splitting capability. What I cannot see is why I cannot then simply send those split files to Salesforce for processing. From the docs, and from trying it, it seems I need to iterate over my split CSV files and feed them back into tSalesforceOutputBulkExec, which creates a new, virtually identical CSV file that it then sends to Salesforce. Why is this extra step necessary?
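For illustration only, this plain-Java sketch shows the kind of splitting tFileOutputDelimited is doing for me: carve one big CSV into ~100k-row chunks, repeating the header in each chunk. File names and the chunk size are placeholders, not my actual job, and this is obviously not the component's own implementation.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class CsvChunkSplitter {

    private static final int ROWS_PER_CHUNK = 100_000;

    public static void main(String[] args) throws IOException {
        Path input = Paths.get("sf_updates.csv");   // hypothetical source file
        try (BufferedReader reader = Files.newBufferedReader(input)) {
            String header = reader.readLine();      // first line is the column header
            if (header == null) {
                return;                             // empty file, nothing to split
            }
            String line;
            int rowInChunk = 0;
            int chunkIndex = 0;
            BufferedWriter writer = null;
            while ((line = reader.readLine()) != null) {
                // start a new chunk file when none is open yet or the current one is full
                if (writer == null || rowInChunk == ROWS_PER_CHUNK) {
                    if (writer != null) {
                        writer.close();
                    }
                    chunkIndex++;
                    Path chunk = Paths.get("sf_updates_part" + chunkIndex + ".csv");
                    writer = Files.newBufferedWriter(chunk);
                    writer.write(header);            // repeat the header in every chunk
                    writer.newLine();
                    rowInChunk = 0;
                }
                writer.write(line);
                writer.newLine();
                rowInChunk++;
            }
            if (writer != null) {
                writer.close();
            }
        }
    }
}
```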
Also, I am sending just a record Id and a new field value into the flow, but on the Reject row out of tSalesforceOutputBulkExec I see only the new field value and an error message. It doesn't seem to be passing through the important bit, the Id of the record that failed to update. Is this a bug, or am I missing something?
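In case it helps explain what I'm after, this is the sort of lookup I'm resorting to by hand: pairing the submitted chunk with the Bulk API batch result file, which (as far as I understand Bulk API 1.0) returns rows in the same order as the request with Id, Success, Created and Error columns. File names here are made up, the Id is assumed to be the first column of the submitted file, and the CSV handling is deliberately naive, so treat it as a sketch rather than a workaround I'm recommending.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class RejectIdRecovery {

    public static void main(String[] args) throws IOException {
        try (BufferedReader submitted = Files.newBufferedReader(Paths.get("sf_updates_part1.csv"));
             BufferedReader results = Files.newBufferedReader(Paths.get("batch_result_part1.csv"))) {

            submitted.readLine();                    // skip header of the submitted chunk
            results.readLine();                      // skip header of the result file

            String sentRow;
            String resultRow = null;
            // result rows come back in the same order as the submitted rows,
            // so reading the two files in lockstep re-associates each error with its record
            while ((sentRow = submitted.readLine()) != null
                    && (resultRow = results.readLine()) != null) {
                // naive split; a real job should use a proper CSV parser,
                // since error messages can contain commas
                String[] sent = sentRow.split(",", -1);
                String[] result = resultRow.split(",", -1);
                boolean success = "true".equalsIgnoreCase(stripQuotes(result[1]));
                if (!success) {
                    String recordId = stripQuotes(sent[0]);   // assumes Id is the first submitted column
                    System.out.println("Failed Id=" + recordId + " result=" + resultRow);
                }
            }
        }
    }

    private static String stripQuotes(String value) {
        return value.replace("\"", "");
    }
}
```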
Ta, M
10 Replies
Anonymous
Not applicable
Author

It's a rolling 24 hours, so it may be a little sooner.