<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic GC Overhead limit exceeded on server in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299504#M71844</link>
<description>&lt;P&gt;Hi,&lt;/P&gt; 
&lt;P&gt;I am loading around 7.5 million records from a database, and after transformation the record count doubles to 15 million. During execution the job fails with the error "GC overhead limit exceeded on server". I have set Xmx to 10240&amp;nbsp;MB and Xms to 1024 MB. Please find my job design below.&lt;/P&gt; 
&lt;P&gt;tMSSqlInput --&amp;gt; tJavaRow --&amp;gt; tExtractJSONFields --&amp;gt; tMap --&amp;gt; tDenormalize --&amp;gt; tJavaRow --&amp;gt; tMSSqlOutput&lt;/P&gt; 
&lt;P&gt;The data doubles at tExtractJSONFields, and at tDenormalize I merge two records into one. The error occurs at tDenormalize. Is there a better design for this flow, given that tDenormalize holds the complete data set in memory before passing rows on one by one?&lt;/P&gt; 
&lt;P&gt;Best Regards,&lt;/P&gt; 
&lt;P&gt;Abhishek&lt;/P&gt;</description>
    <pubDate>Fri, 27 Apr 2018 07:33:29 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2018-04-27T07:33:29Z</dc:date>
    <item>
      <title>GC Overhead limit exceeded on server</title>
      <link>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299504#M71844</link>
<description>&lt;P&gt;Hi,&lt;/P&gt; 
&lt;P&gt;I am loading around 7.5 million records from a database, and after transformation the record count doubles to 15 million. During execution the job fails with the error "GC overhead limit exceeded on server". I have set Xmx to 10240&amp;nbsp;MB and Xms to 1024 MB. Please find my job design below.&lt;/P&gt; 
&lt;P&gt;tMSSqlInput --&amp;gt; tJavaRow --&amp;gt; tExtractJSONFields --&amp;gt; tMap --&amp;gt; tDenormalize --&amp;gt; tJavaRow --&amp;gt; tMSSqlOutput&lt;/P&gt; 
&lt;P&gt;The data doubles at tExtractJSONFields, and at tDenormalize I merge two records into one. The error occurs at tDenormalize. Is there a better design for this flow, given that tDenormalize holds the complete data set in memory before passing rows on one by one?&lt;/P&gt; 
&lt;P&gt;Best Regards,&lt;/P&gt; 
&lt;P&gt;Abhishek&lt;/P&gt;</description>
      <pubDate>Fri, 27 Apr 2018 07:33:29 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299504#M71844</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2018-04-27T07:33:29Z</dc:date>
    </item>
    <item>
      <title>Re: GC Overhead limit exceeded on server</title>
      <link>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299505#M71845</link>
<description>&lt;P&gt;Hello,&lt;/P&gt; 
&lt;P&gt;Could you please try 4096 MB or more, if that much memory is free on the Talend machine?&lt;/P&gt; 
&lt;P&gt;Best regards,&lt;/P&gt; 
&lt;P&gt;Sabrina&lt;/P&gt;</description>
      <pubDate>Fri, 27 Apr 2018 08:56:04 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299505#M71845</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2018-04-27T08:56:04Z</dc:date>
    </item>
    <item>
      <title>Re: GC Overhead limit exceeded on server</title>
      <link>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299506#M71846</link>
<description>Hi xdishi, 
&lt;BR /&gt;Xmx is already 10240 MB, more than twice 4096 MB. 
&lt;BR /&gt;Or should I increase the Xms value? I think increasing the memory is not a permanent solution, because in our case the record count can grow to 40-50 million, and then we would need to increase the memory again. 
&lt;BR /&gt;What I am looking for is a permanent solution. Can we process the records in batches? That is, instead of reading all the records in one go, can we read and process them batch by batch? 
&lt;BR /&gt; 
&lt;BR /&gt;Best Regards, 
&lt;BR /&gt; 
&lt;BR /&gt;Abhishek</description>
      <pubDate>Fri, 27 Apr 2018 09:47:11 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/GC-Overhead-limit-exceeded-on-server/m-p/2299506#M71846</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2018-04-27T09:47:11Z</dc:date>
    </item>
  </channel>
</rss>