Batch settings for tSalesforceOutputBulkExec to avoid Salesforce error: "CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:hed.TDTM_Contact: System.LimitException: Apex CPU time limit exceeded"
Hello,
We are getting the following Salesforce error when loading large data sets (to Contact and Address) via Bulk API V1:
"CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:hed.TDTM_Contact: System.LimitException: Apex CPU time limit exceeded"
We are aware of a number of factors affecting our performance (triggers, encryption, sharing rules, etc.) and are reviewing those -- but I am hoping to mitigate this error by modifying the batch settings on the Advanced tab of the tSalesforceOutputBulkExec component. Note: We are NOT using "Bulk API V2".
I see the batch sizes change in Salesforce when I modify "Rows to Commit". But how does "Bytes to Commit" work together with "Rows to Commit"? It seems I can't leave "Bytes to Commit" blank. Also, can you explain "Timeout in ms when checking Job or Batch state" further, and how that setting might affect CPU usage in Salesforce?
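For context, my current working assumption (which I'd like confirmed) is that the two settings act as dual caps: a batch is flushed when *either* the row limit *or* the byte limit is hit first, so "Bytes to Commit" protects against wide records blowing past the Bulk API V1 per-batch ceiling (10,000 records / 10 MB). The sketch below is purely illustrative -- the function name and logic are my guess at the behavior, not Talend's actual code:

```python
# Illustrative sketch (my assumption, not Talend internals): records are
# accumulated until adding the next one would exceed EITHER the row cap
# ("Rows to Commit") or the byte cap ("Bytes to Commit"), then the batch
# is flushed. Bulk API V1 itself caps a batch at 10,000 rows / 10 MB.
def split_into_batches(records, rows_to_commit=10000,
                       bytes_to_commit=10 * 1024 * 1024):
    batches = []
    current, current_bytes = [], 0
    for rec in records:
        rec_bytes = len(rec.encode("utf-8"))
        # Flush the current batch if either cap would be exceeded.
        if current and (len(current) >= rows_to_commit
                        or current_bytes + rec_bytes > bytes_to_commit):
            batches.append(current)
            current, current_bytes = [], 0
        current.append(rec)
        current_bytes += rec_bytes
    if current:
        batches.append(current)
    return batches

# 25,000 narrow rows with a 10,000-row cap -> batches of 10000/10000/5000
rows = ["id,name\n"] * 25000
print([len(b) for b in split_into_batches(rows)])  # [10000, 10000, 5000]
```

If that mental model is right, lowering "Rows to Commit" would be the lever for reducing per-batch trigger work (and hence Apex CPU time), while "Bytes to Commit" mostly matters for records with large field payloads -- but I'd appreciate confirmation.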
Are there recommended settings I could try or strategies to use? Thanks!