<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Loading reference data with huge volume in model in App Development</title>
    <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110360#M90693</link>
<description>&lt;P&gt;Thanks for the reply. There are no transformations; it is just a direct load of all columns. Let me try the Exists clause to limit rows. But I also have a situation that calls for a left outer join, as shown below.&lt;/P&gt;
&lt;P&gt;Table A: transaction data.&lt;/P&gt;
&lt;P&gt;Table B: reference data linked to Table A. I need only the matched rows.&lt;/P&gt;
&lt;P&gt;Table C: reference data linked to Table B. I need the rows matching Table B, to pick up its attributes.&lt;/P&gt;
&lt;P&gt;This is like a left outer join situation. But is a left outer join a good solution, given Qlik's performance?&lt;/P&gt;
&lt;P&gt;Point to note: Table C is 60 million rows, Table A is 10 million, and Table B is 3 million. The final output is expected to be 10 million rows.&lt;/P&gt;</description>
    <pubDate>Mon, 28 Aug 2023 12:06:04 GMT</pubDate>
    <dc:creator>tknagaraj</dc:creator>
    <dc:date>2023-08-28T12:06:04Z</dc:date>
    <item>
      <title>Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110346#M90689</link>
<description>&lt;P&gt;I have a performance issue: a big reference data table needs to be loaded in the app. The data is stored in a QVD, but loading around 60 million rows takes about 15 minutes.&lt;/P&gt;
&lt;P&gt;Are there any good design principles to apply here?&lt;/P&gt;
&lt;P&gt;My model is based on a star schema.&lt;/P&gt;
&lt;P&gt;Any pointers will help.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Aug 2023 11:41:37 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110346#M90689</guid>
      <dc:creator>tknagaraj</dc:creator>
      <dc:date>2023-08-28T11:41:37Z</dc:date>
    </item>
    <item>
      <title>Re: Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110355#M90691</link>
      <description>&lt;P&gt;Make sure that the QVD data are loaded optimized. This means no transformation is applied to the data; the only exception is a single Where Exists(Key) clause.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Aug 2023 11:56:20 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110355#M90691</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2023-08-28T11:56:20Z</dc:date>
    </item>
    <item>
      <title>Re: Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110356#M90692</link>
      <description>&lt;P&gt;If loading a QVD takes an excessive amount of time:&lt;/P&gt;
&lt;P&gt;* Reduce the size of the QVD by eliminating unnecessary columns, storing data efficiently, etc.&lt;/P&gt;
&lt;P&gt;and/or&lt;/P&gt;
&lt;P&gt;* Change the load to ensure it is optimized&lt;/P&gt;
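&lt;P&gt;For example (file and field names here are only placeholders), an optimized load keeps an explicit field list and at most a single Where Exists() filter:&lt;/P&gt;
&lt;PRE&gt;// Load only the needed fields from the QVD. An optimized load
// allows no transformations on the fields - a single
// Where Exists() on one field is the one exception.
Reference:
Load KeyField,
     Attribute1,
     Attribute2
From Reference.qvd (qvd)
Where Exists(KeyField);&lt;/PRE&gt;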
</description>
      <pubDate>Mon, 28 Aug 2023 11:59:13 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110356#M90692</guid>
      <dc:creator>Or</dc:creator>
      <dc:date>2023-08-28T11:59:13Z</dc:date>
    </item>
    <item>
      <title>Re: Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110360#M90693</link>
      <description>&lt;P&gt;Thanks for the reply. There are no transformations; it is just a direct load of all columns. Let me try the Exists clause to limit rows. But I also have a situation that calls for a left outer join, as shown below.&lt;/P&gt;
&lt;P&gt;Table A: transaction data.&lt;/P&gt;
&lt;P&gt;Table B: reference data linked to Table A. I need only the matched rows.&lt;/P&gt;
&lt;P&gt;Table C: reference data linked to Table B. I need the rows matching Table B, to pick up its attributes.&lt;/P&gt;
&lt;P&gt;This is like a left outer join situation. But is a left outer join a good solution, given Qlik's performance?&lt;/P&gt;
&lt;P&gt;Point to note: Table C is 60 million rows, Table A is 10 million, and Table B is 3 million. The final output is expected to be 10 million rows.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Aug 2023 12:06:04 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110360#M90693</guid>
      <dc:creator>tknagaraj</dc:creator>
      <dc:date>2023-08-28T12:06:04Z</dc:date>
    </item>
    <item>
      <title>Re: Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110367#M90697</link>
      <description>&lt;P&gt;Left join seems to be doing the trick. It's loading the data very fast. Not sure whether it will have any impact in the future, but I will stick with it for now.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Aug 2023 12:13:41 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110367#M90697</guid>
      <dc:creator>tknagaraj</dc:creator>
      <dc:date>2023-08-28T12:13:41Z</dc:date>
    </item>
    <item>
      <title>Re: Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110388#M90700</link>
      <description>&lt;P&gt;Joins are (quite heavy) transformations. In general they can be replaced with mappings, which often perform faster while being more powerful and flexible.&lt;/P&gt;
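&lt;P&gt;A minimal sketch of the mapping approach (table, file and field names are only placeholders):&lt;/P&gt;
&lt;PRE&gt;// Build a mapping table from the reference data (a Mapping Load
// takes exactly two fields: key and value) and apply it while
// loading the facts - no join pass over the fact table needed.
MapB:
Mapping Load KeyB,
     AttributeB
From B.qvd (qvd);

Facts:
Load *,
     ApplyMap('MapB', KeyB, Null()) as AttributeB
From A.qvd (qvd);
// Note: the ApplyMap() expression makes this load non-optimized,
// but it avoids a separate join step afterwards.&lt;/PRE&gt;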
&lt;P&gt;Further, using Exists() to filter the data directly while loading a source, before applying any join/mapping approach, will also save resources and therefore run time.&lt;/P&gt;</description>
      <pubDate>Mon, 28 Aug 2023 12:46:22 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110388#M90700</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2023-08-28T12:46:22Z</dc:date>
    </item>
    <item>
      <title>Re: Loading reference data with huge volume in model</title>
      <link>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110723#M90729</link>
      <description>&lt;P&gt;Thanks. Will try Exists.&lt;/P&gt;</description>
      <pubDate>Tue, 29 Aug 2023 08:33:48 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Loading-reference-data-with-huge-volume-in-model/m-p/2110723#M90729</guid>
      <dc:creator>tknagaraj</dc:creator>
      <dc:date>2023-08-29T08:33:48Z</dc:date>
    </item>
  </channel>
</rss>

