<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Chunk QVD based on Size limit in App Development</title>
    <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088718#M89229</link>
    <description>&lt;P&gt;Still shouldn't take too long to check them all with, say, a million rows.&lt;/P&gt;
&lt;P&gt;I'm kind of confused, in concept, by a scenario where you might have 20-30 QVDs that exceed 1GB in an environment that doesn't allow files larger than 1GB. This seems like an excessive amount of data relative to the restriction.&lt;/P&gt;</description>
    <pubDate>Wed, 28 Jun 2023 12:57:29 GMT</pubDate>
    <dc:creator>Or</dc:creator>
    <dc:date>2023-06-28T12:57:29Z</dc:date>
    <item>
      <title>Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088472#M89211</link>
      <description>&lt;P&gt;Hi All,&lt;/P&gt;
&lt;P&gt;Is it possible to chunk a QVD by size within the STORE command in Qlik Sense?&lt;/P&gt;
&lt;P&gt;Ex: If a particular table would create a QVD larger than 1 GB, is it possible to have the STORE command dynamically start writing to a new QVD once that size is exceeded?&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jun 2023 02:33:51 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088472#M89211</guid>
      <dc:creator>sri94aa</dc:creator>
      <dc:date>2023-06-28T02:33:51Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088549#M89215</link>
      <description>&lt;P&gt;AFAIK - no. You could build your own sub-routine for such a task, but honestly I think you would create more problems with such an approach than you would solve.&lt;/P&gt;
&lt;P&gt;IMO it would be better to slice the data according to its content rather than by file size or number of records. Quite common is slicing into YYYYMM chunks and/or by countries/companies/products or similar, combined with appropriate incremental approaches in a multi-tier data architecture.&lt;/P&gt;
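&lt;P&gt;A minimal sketch of such YYYYMM slicing, assuming a loaded table named Facts with a date field OrderDate and a target folder lib://Data/ - all names are hypothetical:&lt;/P&gt;
&lt;PRE&gt;// Build the list of distinct YYYYMM periods present in the data.
Periods:
LOAD Distinct Date(MonthStart(OrderDate), 'YYYYMM') as Period
RESIDENT Facts;

FOR i = 0 TO NoOfRows('Periods') - 1
    LET vPeriod = Peek('Period', $(i), 'Periods');

    // Extract one month of data and store it as its own QVD.
    Slice:
    NoConcatenate
    LOAD * RESIDENT Facts
    WHERE Date(MonthStart(OrderDate), 'YYYYMM') = '$(vPeriod)';

    STORE Slice INTO [lib://Data/Facts_$(vPeriod).qvd] (qvd);
    DROP TABLE Slice;
NEXT i

DROP TABLE Periods;&lt;/PRE&gt;</description>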
      <pubDate>Wed, 28 Jun 2023 08:42:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088549#M89215</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2023-06-28T08:42:28Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088693#M89223</link>
      <description>&lt;P&gt;Cool, thanks for the reply. We have a restriction to keep the size under 1 GB, but I'm not sure how that can be guaranteed with YYYYMM slicing, as I won't know in advance how large the chunks produced by YYYYMM, country, or company will be.&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jun 2023 12:33:49 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088693#M89223</guid>
      <dc:creator>sri94aa</dc:creator>
      <dc:date>2023-06-28T12:33:49Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088699#M89224</link>
      <description>&lt;P&gt;You could add a row number field to your data and then chunk based on that - file sizes should be relatively consistent for a given number of rows. This will increase the file size a bit (one more field), but might be worth doing if you don't have another field you can chunk on.&lt;/P&gt;
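&lt;P&gt;A minimal sketch of this row-number chunking, assuming a loaded table named Data, a target folder lib://Data/, and a hypothetical chunk size of 10 million rows - tune it to stay under the 1 GB limit:&lt;/P&gt;
&lt;PRE&gt;LET vChunkSize = 10000000;  // hypothetical rows per QVD

// Add a row number field to chunk on.
Numbered:
NoConcatenate
LOAD *, RecNo() as RowNum RESIDENT Data;
DROP TABLE Data;

LET vChunks = Ceil(NoOfRows('Numbered') / $(vChunkSize));

FOR c = 1 TO $(vChunks)
    // Rows 1..vChunkSize go to chunk 1, the next vChunkSize rows to chunk 2, etc.
    Chunk:
    NoConcatenate
    LOAD * RESIDENT Numbered
    WHERE Ceil(RowNum / $(vChunkSize)) = $(c);

    STORE Chunk INTO [lib://Data/Data_$(c).qvd] (qvd);
    DROP TABLE Chunk;
NEXT c

DROP TABLE Numbered;&lt;/PRE&gt;</description>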
      <pubDate>Wed, 28 Jun 2023 12:39:19 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088699#M89224</guid>
      <dc:creator>Or</dc:creator>
      <dc:date>2023-06-28T12:39:19Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088706#M89226</link>
      <description>&lt;P&gt;Thanks for the response. Is there a function to get the memory size of a row?&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jun 2023 12:46:34 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088706#M89226</guid>
      <dc:creator>sri94aa</dc:creator>
      <dc:date>2023-06-28T12:46:34Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088713#M89227</link>
      <description>&lt;P&gt;Not that I'm aware of, but it should be easy enough to work out by dumping a set number of rows into a file and checking its size.&lt;/P&gt;
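&lt;P&gt;A minimal sketch of that calibration, assuming a loaded table named Data - the sample size and file path are hypothetical:&lt;/P&gt;
&lt;PRE&gt;LET vSample = 1000000;

// Store a fixed number of rows and measure the resulting file.
Probe:
NoConcatenate
First $(vSample)
LOAD * RESIDENT Data;

STORE Probe INTO [lib://Data/_probe.qvd] (qvd);
DROP TABLE Probe;

// Approximate bytes per row for this table; FileSize() returns bytes.
LET vBytesPerRow = FileSize('lib://Data/_probe.qvd') / $(vSample);
LET vRowsPerGB = Floor(1000000000 / $(vBytesPerRow));&lt;/PRE&gt;
&lt;P&gt;You could then feed vRowsPerGB, with some safety margin, into the chunking loop from my previous reply.&lt;/P&gt;</description>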
      <pubDate>Wed, 28 Jun 2023 12:53:29 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088713#M89227</guid>
      <dc:creator>Or</dc:creator>
      <dc:date>2023-06-28T12:53:29Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088715#M89228</link>
      <description>&lt;P&gt;OK, but the problem here is that it's not just a single QVD - we have around 20-30 QVDs with different types &amp;amp; numbers of columns.&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jun 2023 12:54:56 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088715#M89228</guid>
      <dc:creator>sri94aa</dc:creator>
      <dc:date>2023-06-28T12:54:56Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088718#M89229</link>
      <description>&lt;P&gt;Still shouldn't take too long to check them all with, say, a million rows.&lt;/P&gt;
&lt;P&gt;I'm kind of confused, in concept, by a scenario where you might have 20-30 QVDs that exceed 1GB in an environment that doesn't allow files larger than 1GB. This seems like an excessive amount of data relative to the restriction.&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jun 2023 12:57:29 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088718#M89229</guid>
      <dc:creator>Or</dc:creator>
      <dc:date>2023-06-28T12:57:29Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088727#M89231</link>
      <description>&lt;P&gt;How much storage is consumed depends mainly on the kind of data. With a lot of distinct field values the storage size behaves much like a SQL table, increasing/decreasing roughly linearly. But with mainly redundant field values it's different, and ten times the records might only need twice the space.&lt;/P&gt;
&lt;P&gt;Beside the challenge of sizing the QVDs near to 1 GB, it could become quite tricky to handle them afterwards. The QVD filenames would probably just get a sequential number at generation time - but how many will exist, and which one contains which data?&lt;/P&gt;
&lt;P&gt;As hinted above, I don't think it's really expedient; slicing by content would be more suitable. This doesn't necessarily have to be a horizontal chunking by records - it could also be a vertical one, sliced by fields.&lt;/P&gt;
&lt;P&gt;Only with content slicing will you be able to use incremental logic, divide the tasks to run in parallel and/or in different time frames, and use the slices within the appropriate downstream data models and report layers - without accessing all the data and picking out the needed records there.&lt;/P&gt;
&lt;P&gt;Further, I suggest reviewing the data within the QVDs: keep only the needed fields, drop any row-specific formatting and any record IDs, and optimize field cardinality, for example by splitting timestamps into dates and times, as in the sketch below.&lt;/P&gt;
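&lt;P&gt;A minimal sketch of that cardinality optimization, assuming a table Facts with a field EventTimestamp - the names are hypothetical:&lt;/P&gt;
&lt;PRE&gt;// Splitting a timestamp into a date part and a time part shrinks the
// symbol tables: a year of data has only ~365 distinct dates and at most
// 86400 distinct seconds, instead of millions of distinct timestamps.
Optimized:
LOAD
    Date(Floor(EventTimestamp)) as EventDate,  // integer day part
    Time(Frac(EventTimestamp)) as EventTime,   // fractional time part
    Amount                                     // keep only needed fields
RESIDENT Facts;
DROP TABLE Facts;&lt;/PRE&gt;</description>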
      <pubDate>Wed, 28 Jun 2023 13:21:12 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088727#M89231</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2023-06-28T13:21:12Z</dc:date>
    </item>
    <item>
      <title>Re: Chunk QVD based on Size limit</title>
      <link>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088812#M89244</link>
      <description>&lt;P&gt;Nope, it's a restriction recently applied by the third-party team handling our infra, so we have to implement the changes.&lt;/P&gt;</description>
      <pubDate>Wed, 28 Jun 2023 15:20:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/App-Development/Chunk-QVD-based-on-Size-limit/m-p/2088812#M89244</guid>
      <dc:creator>sri94aa</dc:creator>
      <dc:date>2023-06-28T15:20:28Z</dc:date>
    </item>
  </channel>
</rss>

