<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: Loading huge data in qlikview in QlikView</title>
    <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497316#M686583</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Veena,&lt;BR /&gt;You're probably doing this already, just in case - make sure the QVD load is optimized.&lt;BR /&gt;And, on the requirements level...&amp;nbsp; In a similar situation, a client was OK with aggregating data by month, except for the last three months where they needed daily granularity.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;BR /&gt;Michael&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Tue, 17 Dec 2013 20:01:47 GMT</pubDate>
    <dc:creator>Anonymous</dc:creator>
    <dc:date>2013-12-17T20:01:47Z</dc:date>
    <item>
      <title>Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497310#M686574</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi All,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I have a huge amount of data in my database: 500 million records in the fact table for 2 years. The business needs at least 2 years of data in the report, down to the detailed sub-product level.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I tried different approaches such as document chaining. The QVD size is coming to 15 GB and taking more than 5 hours to load, and the QVW occupies 4 GB. I even tried an incremental load, but the final QVD and QVW sizes are a concern. Can you please suggest any other approach that can handle big data?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks and Regards&lt;/P&gt;&lt;P&gt;Veena&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 15:49:28 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497310#M686574</guid>
      <dc:creator />
      <dc:date>2013-12-17T15:49:28Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497311#M686575</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I think document chaining is the only option.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Create a landing QVW - a home QVW where users can select a month/quarter/half-year/year. Then, based on that selection, take them to the detail QVW they should land on.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;This way you can also run parallel reloads for the detail QVWs.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 15:56:51 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497311#M686575</guid>
      <dc:creator />
      <dc:date>2013-12-17T15:56:51Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497312#M686576</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Perhaps you could reduce the count of distinct field values - besides dropping fields that aren't really required - to save space in the QVD and in RAM. For example, a timestamp field could be split into a date and a time; rowno() or combined keys also need a lot of space.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;- Marcus&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 16:07:35 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497312#M686576</guid>
      <dc:creator>marcus_sommer</dc:creator>
      <dc:date>2013-12-17T16:07:35Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497313#M686578</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I tried document chaining at the monthly level, and also by subject area such as Product.&lt;/P&gt;&lt;P&gt;I am taking only the necessary columns: there are 5 measures and 10 dimensions, including country, vendor, product, buyer, sale date, ship date, delivery date and return date.&lt;/P&gt;&lt;P&gt;There are no columns with timestamps.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 16:24:05 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497313#M686578</guid>
      <dc:creator />
      <dc:date>2013-12-17T16:24:05Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497314#M686580</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi Veena,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Did you try an incremental load? I think all transformations should happen before storing into the QVD; after that, you can load the data "faster" and show it to your users...&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;-JFlorian&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 19:33:03 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497314#M686580</guid>
      <dc:creator>javier_florian</dc:creator>
      <dc:date>2013-12-17T19:33:03Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497315#M686581</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;With large data - in an organisation that has masses of transactional data going back 2 years - you don't want to have to load all 2 years, including the current year's data, every night.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;One solution is to create a QVD file for each year you wish to analyse. These can be created dynamically based on how many years you wish to look at. Assuming you are looking at two years, the script below shows how to create a simple QVD file based on a moving date clause:&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;[Year-Two]:&lt;BR /&gt;LOAD *&lt;BR /&gt;FROM C:\Data.xls (biff, embedded labels, table is [Core Data$])&lt;BR /&gt;WHERE Year(Date) = Year(Today()) - 2;&lt;BR /&gt;STORE [Year-Two] INTO C:\Year-Two.qvd;&lt;BR /&gt;&lt;STRONG&gt;DROP TABLE [Year-Two];&lt;/STRONG&gt;&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 19:53:40 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497315#M686581</guid>
      <dc:creator />
      <dc:date>2013-12-17T19:53:40Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497316#M686583</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Veena,&lt;BR /&gt;You're probably doing this already, just in case - make sure the QVD load is optimized.&lt;BR /&gt;And, on the requirements level...&amp;nbsp; In a similar situation, a client was OK with aggregating data by month, except for the last three months where they needed daily granularity.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Regards,&lt;BR /&gt;Michael&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 17 Dec 2013 20:01:47 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497316#M686583</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2013-12-17T20:01:47Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497317#M686584</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;&lt;BR /&gt;Hi Michael,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I am already loading aggregated data. But as users need weekly or monthly aggregated data, we were not able to reduce the volume much (we still have ~200M records after aggregation because of the dimension-level detail). Is there any other approach you can suggest for handling this huge data?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;~Veena&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 18 Dec 2013 04:37:10 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497317#M686584</guid>
      <dc:creator />
      <dc:date>2013-12-18T04:37:10Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497318#M686586</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;We are already doing incremental loads into the QVDs, but the QVW reload is also taking time, and the QVW size is also a concern.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 18 Dec 2013 04:38:31 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497318#M686586</guid>
      <dc:creator />
      <dc:date>2013-12-18T04:38:31Z</dc:date>
    </item>
    <item>
      <title>Re: Loading huge data in qlikview</title>
      <link>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497319#M686587</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;I have ~200M records for each year and am already splitting the QVDs, but loading them into the QVW is still a serial load. Even the split-QVD approach didn't help to reduce the reload time.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Wed, 18 Dec 2013 04:42:30 GMT</pubDate>
      <guid>https://community.qlik.com/t5/QlikView/Loading-huge-data-in-qlikview/m-p/497319#M686587</guid>
      <dc:creator />
      <dc:date>2013-12-18T04:42:30Z</dc:date>
    </item>
  </channel>
</rss>

