<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Spooling n-Records before Executing a Post Using tRest in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Spooling-n-Records-before-Executing-a-Post-Using-tRest/m-p/2287545#M61118</link>
    <description>&lt;P&gt;&lt;STRONG&gt;TLDR&lt;/STRONG&gt;: I need to spool 100 records from a larger file and then execute a post against my API using tRest.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;I have been trying to figure out how I would execute this scenario:&lt;/P&gt; 
&lt;P&gt;&amp;nbsp; &amp;nbsp; &lt;EM&gt;&amp;nbsp;Currently I post one record at a time to my tRest component via an Iterate connector; the service returns the response and I continue processing.&lt;/EM&gt;&lt;/P&gt; 
&lt;P&gt;What I need to do is post up to &lt;STRONG&gt;100 records to the API in a chunked fashion&lt;/STRONG&gt; from the source file (250K lines): spool up to 100 records, then post them as a single request.&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Current Flow&lt;/P&gt; 
&lt;P&gt;File Input (500K lines) --&amp;gt; tMap --&amp;gt; Iterate --&amp;gt; tRest (post single) --&amp;gt; tHashOutput (to store response)&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Future Flow&lt;/P&gt; 
&lt;P&gt;&lt;SPAN&gt;File Input (500K lines) --&amp;gt; tMap --&amp;gt; &lt;STRONG&gt;Spool-100-Lines&lt;/STRONG&gt; --&amp;gt; tRest (post array of 100) --&amp;gt; tHashOutput (to store response)&lt;/SPAN&gt;&lt;/P&gt; 
</description>
    <pubDate>Wed, 01 Aug 2018 16:59:07 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2018-08-01T16:59:07Z</dc:date>
    <item>
      <title>Spooling n-Records before Executing a Post Using tRest</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Spooling-n-Records-before-Executing-a-Post-Using-tRest/m-p/2287545#M61118</link>
      <description>&lt;P&gt;&lt;STRONG&gt;TLDR&lt;/STRONG&gt;: I need to spool 100 records from a larger file and then execute a post against my API using tRest.&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;I have been trying to figure out how I would execute this scenario:&lt;/P&gt; 
&lt;P&gt;&amp;nbsp; &amp;nbsp; &lt;EM&gt;&amp;nbsp;Currently I post one record at a time to my tRest component via an Iterate connector; the service returns the response and I continue processing.&lt;/EM&gt;&lt;/P&gt; 
&lt;P&gt;What I need to do is post up to &lt;STRONG&gt;100 records to the API in a chunked fashion&lt;/STRONG&gt; from the source file (250K lines): spool up to 100 records, then post them as a single request.&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Current Flow&lt;/P&gt; 
&lt;P&gt;File Input (500K lines) --&amp;gt; tMap --&amp;gt; Iterate --&amp;gt; tRest (post single) --&amp;gt; tHashOutput (to store response)&lt;/P&gt; 
&lt;P&gt;&amp;nbsp;&lt;/P&gt; 
&lt;P&gt;Future Flow&lt;/P&gt; 
&lt;P&gt;&lt;SPAN&gt;File Input (500K lines) --&amp;gt; tMap --&amp;gt; &lt;STRONG&gt;Spool-100-Lines&lt;/STRONG&gt; --&amp;gt; tRest (post array of 100) --&amp;gt; tHashOutput (to store response)&lt;/SPAN&gt;&lt;/P&gt; 
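&lt;P&gt;The Spool-100-Lines step boils down to accumulating rows into a buffer and flushing it every 100 rows (plus a final flush for the remainder). A minimal, language-agnostic sketch of that logic follows, in Python for brevity; in Talend Studio itself this would typically be built with something like a tJavaFlex holding the buffer in its main/end sections, and the function and names below are illustrative, not a specific Talend recipe:&lt;/P&gt; 

```python
# Sketch: spool rows into batches of up to 100 before each POST.
# Hypothetical helper, not a Talend component; in the job above the
# equivalent buffering would sit between tMap and tRest.

def spool(rows, size=100):
    """Yield lists of up to `size` rows; the final batch may be smaller."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch   # a full batch is ready to POST as one array
            batch = []
    if batch:             # flush whatever is left over at end of file
        yield batch

# Example: 250 source rows become batches of 100, 100, and 50,
# so tRest would fire 3 times instead of 250.
sizes = [len(b) for b in spool(range(250))]
print(sizes)  # [100, 100, 50]
```

&lt;P&gt;Each yielded batch would then be serialized (for example as a JSON array) and sent as one POST, so a 500K-line file produces roughly 5,000 requests instead of 500,000.&lt;/P&gt; 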
</description>
      <pubDate>Wed, 01 Aug 2018 16:59:07 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Spooling-n-Records-before-Executing-a-Post-Using-tRest/m-p/2287545#M61118</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2018-08-01T16:59:07Z</dc:date>
    </item>
  </channel>
</rss>

