<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Jobs running slow in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257047#M39257</link>
    <description>Hi, 
&lt;BR /&gt;Get rid of tLogRow; it is a debugging component for tracing rows to the console. 
&lt;BR /&gt;You should only use it on a small amount of data for testing (&amp;lt;10,000 rows). 
&lt;BR /&gt;tLogRow will slow the whole data flow down to around 100 rows/sec... 
&lt;BR /&gt;benjamin</description>
    <pubDate>Wed, 22 Sep 2010 22:03:57 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2010-09-22T22:03:57Z</dc:date>
    <item>
      <title>Jobs running slow</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257045#M39255</link>
      <description>Hi, 
&lt;BR /&gt;Could someone help me please? I have a job with a delimited input file of about 500,000 rows (it could be more in future). I designed a simple job with many lookup tables and processing components, using tLogRow and tMap, with tMap being the main one. The job is taking a long time to run: more than a day or two. 
&lt;BR /&gt;Could someone tell me the best way/components to use so that a job consisting of delimited CSV input files and MySQL tables runs faster?
&lt;BR /&gt;I can't seem to attach the screenshot right now; if possible I can email it to you, if you think you can help.
&lt;BR /&gt;Many Thanks</description>
      <pubDate>Sat, 16 Nov 2024 13:16:47 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257045#M39255</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2024-11-16T13:16:47Z</dc:date>
    </item>
    <item>
      <title>Re: Jobs running slow</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257046#M39256</link>
      <description>There are various approaches you might take, but first you need to check where the job is actually bottlenecking. Do the lookup tables have many different values or just a few? Are they cached? Indexed, etc.? If your MySQL backend is on a more powerful system, you might stage the input data there and then use a join (outer if necessary) to do the lookups and produce a new data stream to feed the rest of the job.
&lt;BR /&gt;T</description>
      <pubDate>Tue, 21 Sep 2010 15:22:41 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257046#M39256</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2010-09-21T15:22:41Z</dc:date>
    </item>
    <item>
      <title>Re: Jobs running slow</title>
      <link>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257047#M39257</link>
      <description>Hi, 
&lt;BR /&gt;Get rid of tLogRow; it is a debugging component for tracing rows to the console. 
&lt;BR /&gt;You should only use it on a small amount of data for testing (&amp;lt;10,000 rows). 
&lt;BR /&gt;tLogRow will slow the whole data flow down to around 100 rows/sec... 
&lt;BR /&gt;benjamin</description>
      <pubDate>Wed, 22 Sep 2010 22:03:57 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/Jobs-running-slow/m-p/2257047#M39257</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2010-09-22T22:03:57Z</dc:date>
    </item>
  </channel>
</rss>