<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Batch settings for tSalesforceOutputBulkExec to avoid Salesforce error: "CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:hed.TDTM_Contact: System.LimitException: Apex CPU time limit exceeded" in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Batch-settings-for-tSalesforceOutputBulkExec-to-avoid-Salesforce/m-p/2290093#M63407</link>
    <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;We are getting the following Salesforce error when loading large data sets (to Contact and Address) via Bulk API V1:&lt;/P&gt;&lt;P&gt;"CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:hed.TDTM_Contact: System.LimitException: Apex CPU time limit exceeded"&lt;/P&gt;&lt;P&gt;We are aware of a number of factors affecting our performance (triggers, encryption, sharing rules, ...) and are reviewing those, but I am hoping to mitigate this error by modifying the batch settings on the Advanced tab of the tSalesforceOutputBulkExec component. Note: we are NOT using "Bulk API V2".&lt;/P&gt;&lt;P&gt;I see the batch sizes change in Salesforce when I modify "Rows to Commit", but how does "Bytes to Commit" work together with Rows to Commit? It seems I can't leave Bytes to Commit blank. Also, can you explain "Timeout in ms when checking Job or Batch state" further, and how this setting might affect CPU usage in Salesforce?&lt;/P&gt;&lt;P&gt;Are there recommended settings I could try, or strategies to use? Thanks!&lt;/P&gt;</description>
    <pubDate>Sat, 16 Nov 2024 01:32:58 GMT</pubDate>
    <dc:creator>larasc</dc:creator>
    <dc:date>2024-11-16T01:32:58Z</dc:date>
    <item>
      <title>Batch settings for tSalesforceOutputBulkExec to avoid Salesforce error: "CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:hed.TDTM_Contact: System.LimitException: Apex CPU time limit exceeded"</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Batch-settings-for-tSalesforceOutputBulkExec-to-avoid-Salesforce/m-p/2290093#M63407</link>
      <description>&lt;P&gt;Hello,&lt;/P&gt;&lt;P&gt;We are getting the following Salesforce error when loading large data sets (to Contact and Address) via Bulk API V1:&lt;/P&gt;&lt;P&gt;"CANNOT_INSERT_UPDATE_ACTIVATE_ENTITY:hed.TDTM_Contact: System.LimitException: Apex CPU time limit exceeded"&lt;/P&gt;&lt;P&gt;We are aware of a number of factors affecting our performance (triggers, encryption, sharing rules, ...) and are reviewing those, but I am hoping to mitigate this error by modifying the batch settings on the Advanced tab of the tSalesforceOutputBulkExec component. Note: we are NOT using "Bulk API V2".&lt;/P&gt;&lt;P&gt;I see the batch sizes change in Salesforce when I modify "Rows to Commit", but how does "Bytes to Commit" work together with Rows to Commit? It seems I can't leave Bytes to Commit blank. Also, can you explain "Timeout in ms when checking Job or Batch state" further, and how this setting might affect CPU usage in Salesforce?&lt;/P&gt;&lt;P&gt;Are there recommended settings I could try, or strategies to use? Thanks!&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 01:32:58 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Batch-settings-for-tSalesforceOutputBulkExec-to-avoid-Salesforce/m-p/2290093#M63407</guid>
      <dc:creator>larasc</dc:creator>
      <dc:date>2024-11-16T01:32:58Z</dc:date>
    </item>
  </channel>
</rss>

