<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: talend real time loading issues in Talend Studio</title>
    <link>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225607#M17831</link>
    <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In my opinion you have two possibilities:&lt;/P&gt;
&lt;P&gt;1/ Use a lookup on your target table in a tMap component so that only the rejected (not yet inserted) records are inserted; don't forget to enable disk storage on the lookup for big files.&lt;/P&gt;
&lt;P&gt;2/ Store in a file the number of rows inserted by the previous execution, using the ((Integer)globalMap.get("tDBOutput_X_NB_LINE_INSERTED")) global variable, and skip that many rows on the next run.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Tue, 03 Mar 2020 21:49:58 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2020-03-03T21:49:58Z</dc:date>
    <item>
      <title>talend real time loading issues</title>
      <link>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225606#M17830</link>
      <description>&lt;P&gt;1) Suppose you have a source text file named "A.txt" that contains billions of records. The task is to extract all the data from the source file and load it into a target table named "Table D". After millions of records have been inserted into the target table, the job fails mid-run on a particular record. After resolving the error and re-executing the job, I want the job to behave as follows:&lt;BR /&gt;The already inserted records should not be inserted again into "Table D", keeping the job's performance in mind.&lt;BR /&gt;How can we do this?&lt;BR /&gt;Consider two cases: a) the source file contains only unique records, and b) it may contain duplicate records.&lt;/P&gt;</description>
      <pubDate>Sat, 16 Nov 2024 03:07:02 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225606#M17830</guid>
      <dc:creator>psr</dc:creator>
      <dc:date>2024-11-16T03:07:02Z</dc:date>
    </item>
    <item>
      <title>Re: talend real time loading issues</title>
      <link>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225607#M17831</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In my opinion you have two possibilities:&lt;/P&gt;
&lt;P&gt;1/ Use a lookup on your target table in a tMap component so that only the rejected (not yet inserted) records are inserted; don't forget to enable disk storage on the lookup for big files.&lt;/P&gt;
&lt;P&gt;2/ Store in a file the number of rows inserted by the previous execution, using the ((Integer)globalMap.get("tDBOutput_X_NB_LINE_INSERTED")) global variable, and skip that many rows on the next run.&lt;/P&gt;
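&lt;P&gt;For option 2/, here is a minimal sketch of the idea in plain Java (e.g. for a tJava component). The component name tDBOutput_1 and the counter file path are assumptions for illustration; globalMap is stubbed with a plain HashMap so the snippet runs standalone outside a Talend job.&lt;/P&gt;

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.HashMap;
import java.util.Map;

public class ResumeCounter {
    public static void main(String[] args) throws IOException {
        // Stand-in for Talend's globalMap; inside a real job it is provided
        // and populated by the components themselves.
        Map globalMap = new HashMap();
        globalMap.put("tDBOutput_1_NB_LINE_INSERTED", Integer.valueOf(1500000));

        // Hypothetical counter file location; pick a path your job can reach.
        Path counterFile = Paths.get("rows_inserted.txt");

        // End of a run: persist how many rows were inserted so far.
        Integer inserted = (Integer) globalMap.get("tDBOutput_1_NB_LINE_INSERTED");
        Files.write(counterFile, String.valueOf(inserted).getBytes(StandardCharsets.UTF_8));

        // Start of the next run: read the counter back and skip that many
        // source rows (for example via the header/skip setting of your input component).
        int alreadyLoaded = Integer.parseInt(
                new String(Files.readAllBytes(counterFile), StandardCharsets.UTF_8).trim());
        System.out.println(alreadyLoaded);
    }
}
```

&lt;P&gt;Note this only resumes correctly when the source file's row order is identical between runs and the failed run stopped cleanly after committing its inserts.&lt;/P&gt;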
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 03 Mar 2020 21:49:58 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225607#M17831</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2020-03-03T21:49:58Z</dc:date>
    </item>
    <item>
      <title>Re: talend real time loading issues</title>
      <link>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225608#M17832</link>
      <description>You may split the job into two steps: first load the file into a staging database, then load from staging into the destination table and handle the errors there.</description>
      <pubDate>Wed, 04 Mar 2020 10:44:32 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Talend-Studio/talend-real-time-loading-issues/m-p/2225608#M17832</guid>
      <dc:creator>fdenis</dc:creator>
      <dc:date>2020-03-04T10:44:32Z</dc:date>
    </item>
  </channel>
</rss>

