<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Big data query and insert to Vertica in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Big-data-query-and-insert-to-Vertica/m-p/2226175#M18186</link>
    <description>Are you using COPY under "Action on data"? 
&lt;BR /&gt;How large are your 4 million rows on disk? 
&lt;BR /&gt;Take a look at 
&lt;A href="http://my.vertica.com/docs/Ecosystem/TalendHPVerticaTipsandTechniques.pdf" target="_blank" rel="nofollow noopener noreferrer"&gt;Talend Vertica Tips &amp;amp; Techniques&lt;/A&gt;.</description>
    <pubDate>Thu, 04 Feb 2016 03:17:27 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2016-02-04T03:17:27Z</dc:date>
    <item>
      <title>Big data query and insert to Vertica</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Big-data-query-and-insert-to-Vertica/m-p/2226174#M18185</link>
      <description>Hello,&amp;nbsp; 
&lt;BR /&gt;I have a series of files that need to be loaded into a Vertica table from a Vertica query. Each file contains between 100 million and 300 million rows (possibly more), roughly 10GB to 30GB on disk. I have devised the attached mapping to do this. 
&lt;BR /&gt;The insert is a query with some joins to other tables. The tJava components you can see are just for logging and monitoring; they merely record how long each stage of the job has run. The tMap is there to commit 4 million rows at a time. 
&lt;BR /&gt; 
&lt;B&gt;The Question&lt;/B&gt; 
&lt;BR /&gt;There isn't much else to it really. It takes about 45 minutes to load the data to the 
&lt;B&gt;tVerticaOutputBulkExec&lt;/B&gt;, which I am happy with. I will also be happy if it takes less than 5 hours to load the data into the table, but I was wondering if I could improve on the 
&lt;B&gt;tVerticaBulkExec&lt;/B&gt; and insert the rows faster? 
&lt;BR /&gt; 
&lt;A href="https://community.talend.com/legacyfs/online/membersTempo/298226/blob.png" target="_blank"&gt;&lt;IMG src="https://community.talend.com/legacyfs/online/membersTempo/298226/blob.png" /&gt; &lt;/A&gt;</description>
      <pubDate>Wed, 03 Feb 2016 10:32:50 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Big-data-query-and-insert-to-Vertica/m-p/2226174#M18185</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-02-03T10:32:50Z</dc:date>
    </item>
    <item>
      <title>Re: Big data query and insert to Vertica</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Big-data-query-and-insert-to-Vertica/m-p/2226175#M18186</link>
      <description>Are you using COPY under "Action on data"? 
&lt;BR /&gt;How large are your 4 million rows on disk? 
&lt;BR /&gt;Take a look at 
&lt;A href="http://my.vertica.com/docs/Ecosystem/TalendHPVerticaTipsandTechniques.pdf" target="_blank" rel="nofollow noopener noreferrer"&gt;Talend Vertica Tips &amp;amp; Techniques&lt;/A&gt;.</description>
      <pubDate>Thu, 04 Feb 2016 03:17:27 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Big-data-query-and-insert-to-Vertica/m-p/2226175#M18186</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-02-04T03:17:27Z</dc:date>
    </item>
  </channel>
</rss>