<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Application performance in Data Movement &amp; Streaming</title>
    <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501907#M3003</link>
    <description>&lt;P&gt;Hi &lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/313025"&gt;@RyugaHideki&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;A few things.&lt;/P&gt;
&lt;P&gt;1. Do you need to load the whole table every time? Could you instead do an incremental load from the database and store the result in a QVD? Loading from a QVD is much faster than loading from the database.&lt;/P&gt;
&lt;P&gt;2. How many times a day does this need to be updated? I know four hours is long, but if it is only updated once a day, running it at 3 a.m. should be fine.&lt;/P&gt;
&lt;P&gt;3. Depending on how your data looks, a Count() versus RowNo()/AutoNumber() might give the same results (just an example).&lt;/P&gt;
&lt;P&gt;4. How big is your data set and how long is "forever"?&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Quite keen to help you with this one!&lt;/P&gt;
&lt;P&gt;Regards - Jandre&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Sat, 18 Jan 2025 15:45:20 GMT</pubDate>
    <dc:creator>JandreKillianRIC</dc:creator>
    <dc:date>2025-01-18T15:45:20Z</dc:date>
    <item>
      <title>Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501900#M3002</link>
      <description>&lt;P&gt;hello guys.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;I'm wondering how you all deal with large datasets that require multiple aggregations and transformations. For me the load takes forever, sometimes fails to complete, and the application becomes quite laggy. I tried creating database views for the aggregations and importing those instead, but the more views I add, the slower the app becomes. For example, this script:&lt;BR /&gt;Marque_Tom:&lt;BR /&gt;LOAD&lt;BR /&gt;[Marques],&lt;BR /&gt;Q35_DES,&lt;BR /&gt;Marques_TOM;&lt;BR /&gt;SELECT&lt;BR /&gt;Q35 as Marques,&lt;BR /&gt;Q35_DES,&lt;BR /&gt;Count(Q35) as Marques_TOM&lt;BR /&gt;FROM DataShare.dbo.FAIT_Q35&lt;BR /&gt;Group By Q35_DES,Q35;&lt;BR /&gt;became this:&lt;BR /&gt;[Marque_TOM]:&lt;BR /&gt;SELECT Marque,&lt;BR /&gt;Vague,&lt;BR /&gt;Code,&lt;BR /&gt;"Marques_TOM"&lt;BR /&gt;FROM "Market_Study".dbo."Marque_TOM";&lt;/P&gt;
&lt;P&gt;This is just one example. I removed the aggregations from the load script, and on its own that would not be a problem, but I have many more like it, and the app became very slow because it keeps creating synthetic (correspondence) keys - to the point of being basically unusable. I hope you can help me.&lt;/P&gt;</description>
      <pubDate>Wed, 19 Mar 2025 21:46:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501900#M3002</guid>
      <dc:creator>RyugaHideki</dc:creator>
      <dc:date>2025-03-19T21:46:28Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501907#M3003</link>
      <description>&lt;P&gt;Hi &lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/313025"&gt;@RyugaHideki&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;A few things.&lt;/P&gt;
&lt;P&gt;1. Do you need to load the whole table every time? Could you instead do an incremental load from the database and store the result in a QVD? Loading from a QVD is much faster than loading from the database.&lt;/P&gt;
&lt;P&gt;2. How many times a day does this need to be updated? I know four hours is long, but if it is only updated once a day, running it at 3 a.m. should be fine.&lt;/P&gt;
&lt;P&gt;3. Depending on how your data looks, a Count() versus RowNo()/AutoNumber() might give the same results (just an example).&lt;/P&gt;
&lt;P&gt;4. How big is your data set and how long is "forever"?&amp;nbsp;&lt;/P&gt;
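&lt;P&gt;For point 1, here is a minimal incremental-load sketch. Treat it as an assumption-laden example: the ID and ModifiedDate columns, the vLastReload variable, and the lib:// path are all placeholders you would adapt to your own table.&lt;/P&gt;
&lt;PRE&gt;// Later runs: pull only rows changed since the last reload,
// then merge with the history already stored in the QVD.
// (The very first run is just a full load followed by the STORE.)
Delta:
LOAD ID as %Key, Q35, Q35_DES, ModifiedDate;
SQL SELECT ID, Q35, Q35_DES, ModifiedDate
FROM DataShare.dbo.FAIT_Q35
WHERE ModifiedDate &gt; '$(vLastReload)';

// Append old rows from the QVD, skipping keys already loaded above
Concatenate (Delta)
LOAD * FROM [lib://DataFiles/FAIT_Q35.qvd] (qvd)
WHERE Not Exists(%Key);

// Write the merged result back for the next run
STORE Delta INTO [lib://DataFiles/FAIT_Q35.qvd] (qvd);&lt;/PRE&gt;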
&lt;P&gt;Quite keen to help you with this one!&lt;/P&gt;
&lt;P&gt;Regards - Jandre&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Sat, 18 Jan 2025 15:45:20 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501907#M3003</guid>
      <dc:creator>JandreKillianRIC</dc:creator>
      <dc:date>2025-01-18T15:45:20Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501924#M3004</link>
      <description>&lt;P&gt;the update part not very often actually i thought the problem is aggregations i thought i'd solve the problem by using views but still&amp;nbsp;&lt;/P&gt;
&lt;P&gt;"Forever" is about 10 minutes, and even after it's loaded the app is slow to display the data. My dataset is quite big (mainly because of the aggregations) - I'd say around 100k rows. As for the first part, I don't think I quite know what you're talking about, unfortunately.&lt;/P&gt;</description>
      <pubDate>Sun, 19 Jan 2025 22:30:46 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501924#M3004</guid>
      <dc:creator>RyugaHideki</dc:creator>
      <dc:date>2025-01-19T22:30:46Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501934#M3005</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/313025"&gt;@RyugaHideki&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;Your query is not running on the Qlik side but on your SQL Server - in both cases - so your problem is probably on the database side, not in Qlik.&lt;/P&gt;
&lt;P&gt;There are tons of things you can do to improve your database performance. The common problems are missing indexes on your columns, bad queries, concurrent processes, network issues, and storage issues. Another possibility is the connection between your database and Qlik. How are you connecting to your database? Are you using Qlik Data Gateway? Which database are you using?&lt;/P&gt;
&lt;P&gt;For a quick test, you could try materialized views with proper indexes on the relevant columns and see if your results improve. But I highly recommend checking with your DBA for help on this.&lt;/P&gt;
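&lt;P&gt;As a rough illustration, an indexed (materialized) view in SQL Server over the FAIT_Q35 table from the original post could look like the sketch below - the view and index names are invented, so adapt everything with your DBA:&lt;/P&gt;
&lt;PRE&gt;-- Indexed views require SCHEMABINDING and COUNT_BIG(*)
CREATE VIEW dbo.vMarques_TOM
WITH SCHEMABINDING
AS
SELECT Q35 AS Marques,
       Q35_DES,
       COUNT_BIG(*) AS Marques_TOM
FROM dbo.FAIT_Q35
GROUP BY Q35, Q35_DES;
GO
-- The unique clustered index is what actually materializes the view
CREATE UNIQUE CLUSTERED INDEX IX_vMarques_TOM
ON dbo.vMarques_TOM (Marques, Q35_DES);&lt;/PRE&gt;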
&lt;P&gt;Regards,&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Mark Costa&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 20 Jan 2025 03:22:44 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2501934#M3005</guid>
      <dc:creator>marksouzacosta</dc:creator>
      <dc:date>2025-01-20T03:22:44Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502203#M3016</link>
      <description>&lt;P&gt;&lt;SPAN&gt;Hi&amp;nbsp;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/22035"&gt;@marksouzacosta&lt;/a&gt;&amp;nbsp;,&lt;/SPAN&gt;&lt;SPAN&gt;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/174995"&gt;@JandreKillianRIC&lt;/a&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;What I did was change the indexing and move to a star schema, with everything related by one primary key (loaded views and a fact table). Loading no longer takes long, but the problem of working with the data persists - there are millions of rows of calculations. I even saved them as QVD files, which improved load time further, but exploring the data is still slow. Qlik Sense is hosted on a server.&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jan 2025 11:31:36 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502203#M3016</guid>
      <dc:creator>RyugaHideki</dc:creator>
      <dc:date>2025-01-21T11:31:36Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502210#M3017</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/313025"&gt;@RyugaHideki&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;This can be due to a few things, like the number of distinct values, the number of columns in your tables, etc.&lt;/P&gt;
&lt;P&gt;What are the CPU spec and RAM size of the server?&lt;/P&gt;
&lt;P&gt;How big is the application (in QMC - Apps - File Size (MB))?&lt;/P&gt;
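&lt;P&gt;Two quick script-side checks that often shrink an app - the field names below are only examples, not from your model:&lt;/P&gt;
&lt;PRE&gt;// 1. Drop wide text/timestamp columns the front end never uses -
//    high-cardinality fields inflate RAM, not row count alone.
DROP FIELDS CommentText, RawTimestamp;

// 2. Replace long composite text keys with compact sequential integers.
//    AutoNumber keeps the association but removes the distinct strings.
Facts:
LOAD AutoNumber(Marque &amp; '|' &amp; Vague) as %MarqueKey,
     Marques_TOM
FROM [lib://DataFiles/Marque_TOM.qvd] (qvd);&lt;/PRE&gt;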
&lt;P&gt;Regards Jandre&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jan 2025 12:18:08 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502210#M3017</guid>
      <dc:creator>JandreKillianRIC</dc:creator>
      <dc:date>2025-01-21T12:18:08Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502343#M3019</link>
      <description>&lt;P&gt;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/174995"&gt;@JandreKillianRIC&lt;/a&gt;&amp;nbsp;Server has 48gb of ram with a xeon 4216 cpu so it should be fairly enough i do believe the problem is the aggregations and the way i modeled the data cause one table (image) had over 5m rows and since it has 2 mores dimensions that don't figure in the fact table it keeps adding correspondance keys&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="RyugaHideki_0-1737530937775.png" style="width: 400px;"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/176717iD7AD36190BA62DE3/image-size/medium?v=v2&amp;amp;px=400" role="button" title="RyugaHideki_0-1737530937775.png" alt="RyugaHideki_0-1737530937775.png" /&gt;&lt;/span&gt;&lt;/P&gt;
&lt;P&gt;The size is 18.39 MB. Again, I think the problem is the modeling, but I can't figure out a way to reduce it. The image table alone has 104k rows, which does help explain why the aggregations grew to that number.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jan 2025 07:33:32 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502343#M3019</guid>
      <dc:creator>RyugaHideki</dc:creator>
      <dc:date>2025-01-22T07:33:32Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502349#M3020</link>
      <description>&lt;P&gt;Hi&amp;nbsp;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/313025"&gt;@RyugaHideki&lt;/a&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Agreed - This shouldn't be so "slow"...&amp;nbsp;&lt;/P&gt;
&lt;H3 class="LC20lb MBeuO DKV0Md"&gt;&lt;span class="lia-unicode-emoji" title=":thinking_face:"&gt;🤔&lt;/span&gt;&lt;/H3&gt;
&lt;P&gt;&lt;LI-WRAPPER&gt;&lt;/LI-WRAPPER&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jan 2025 08:05:40 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502349#M3020</guid>
      <dc:creator>JandreKillianRIC</dc:creator>
      <dc:date>2025-01-22T08:05:40Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502420#M3021</link>
      <description>&lt;P&gt;There exists different opinions about synthetic keys respectively tables - personally I regard such data-models as invalid. As far as there are "strange" results and/or a poor performance they should be resolved - ideally in the direction of a star-scheme which means having a single fact-table and n dimension-tables.&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jan 2025 13:13:37 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502420#M3021</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2025-01-22T13:13:37Z</dc:date>
    </item>
    <item>
      <title>Re: Application performance</title>
      <link>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502453#M3022</link>
      <description>&lt;P&gt;I am using a star scheme model but since there are a lot of dimensions introduced it did create more synthetic keys not much but still enough to slow it down since it's complicated the way I sort of solve it is by partitioning the application i put image on another app completely&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jan 2025 14:58:07 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Data-Movement-Streaming/Application-performance/m-p/2502453#M3022</guid>
      <dc:creator>RyugaHideki</dc:creator>
      <dc:date>2025-01-22T14:58:07Z</dc:date>
    </item>
  </channel>
</rss>

