<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: Incremental load with Clarity tables in Archived Groups</title>
    <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120803#M279</link>
    <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;That's not quite true. Most of the important tables have an "Update Date" column that you can use to determine when the row was most recently extracted. Most of the important tables also have the "row update tracking" feature turned on. (This is the feature Epic was referring to when they said it impacts performance badly.)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The Epic Cogito DW uses this "row update tracking" table (CR_STAT_ALTER) as its mechanism for determining what to extract. So you can rest assured that the most important info is probably already tracked.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Now, having said that, I do not believe you need to go this route. Here are some other things to think about:&lt;BR /&gt;1) Do you really need to do incremental? You might be surprised how quickly a full refresh goes.&lt;/P&gt;&lt;P&gt;2) Do you really need the QVD layer? If you have the warehouse and some good views, can you go against those directly?&lt;BR /&gt;3) If you do need incremental, can you do a "reasonable lookback" combined with a weekly or monthly full refresh?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;You're headed down a long and expensive path if you start building out an incremental ETL from Clarity. Consider alternatives before proceeding.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Dave&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
    <pubDate>Thu, 21 Jul 2016 18:58:30 GMT</pubDate>
    <dc:creator />
    <dc:date>2016-07-21T18:58:30Z</dc:date>
    <item>
      <title>Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120802#M278</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hi Champs,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;Our organization is in the process of moving from Meditech to Epic.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;Currently, most of our QVD generators use incremental load based on the last 'Modified Date'.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;But I heard from our Epic team that there is no row-modified datetime in Clarity tables. They say they can turn this feature on for some tables, but it impacts performance badly.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;Does anyone have a solution for this?&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;Appreciate your help.&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;Thanks&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;SPAN style="font-family: 'Arial','sans-serif'; font-size: 10pt;"&gt;&lt;A href="https://community.qlik.com/group/1286" target="_blank"&gt;Qlik Epic Developers Group&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Tue, 22 Jul 2025 14:29:37 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120802#M278</guid>
      <dc:creator>neelamsaroha157</dc:creator>
      <dc:date>2025-07-22T14:29:37Z</dc:date>
    </item>
    <item>
      <title>Re: Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120803#M279</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;That's not quite true. Most of the important tables have an "Update Date" column that you can use to determine when the row was most recently extracted. Most of the important tables also have the "row update tracking" feature turned on. (This is the feature Epic was referring to when they said it impacts performance badly.)&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The Epic Cogito DW uses this "row update tracking" table (CR_STAT_ALTER) as its mechanism for determining what to extract. So you can rest assured that the most important info is probably already tracked.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Now, having said that, I do not believe you need to go this route. Here are some other things to think about:&lt;BR /&gt;1) Do you really need to do incremental? You might be surprised how quickly a full refresh goes.&lt;/P&gt;&lt;P&gt;2) Do you really need the QVD layer? If you have the warehouse and some good views, can you go against those directly?&lt;BR /&gt;3) If you do need incremental, can you do a "reasonable lookback" combined with a weekly or monthly full refresh?&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;You're headed down a long and expensive path if you start building out an incremental ETL from Clarity. Consider alternatives before proceeding.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Dave&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 21 Jul 2016 18:58:30 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120803#M279</guid>
      <dc:creator />
      <dc:date>2016-07-21T18:58:30Z</dc:date>
    </item>
    <item>
      <title>Re: Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120804#M280</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hey David,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for your time and for sharing your valuable knowledge.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;1. As I said, we are in the process of implementing Epic, so practically I have no idea how long a full reload would take. So this could be one possibility.&lt;/P&gt;&lt;P&gt;2. Yes, we would need the QVD layer because the current system has multiple layers of QVDs for extraction &amp;amp; transformation, and we have to map these existing applications to Epic data once live.&lt;/P&gt;&lt;P&gt;3. A weekly or monthly refresh is a possibility, but I have to go back to my team and discuss it.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;I will get back once a decision is made.&lt;/P&gt;&lt;P&gt;Thanks a ton...&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 21 Jul 2016 19:45:21 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120804#M280</guid>
      <dc:creator>neelamsaroha157</dc:creator>
      <dc:date>2016-07-21T19:45:21Z</dc:date>
    </item>
    <item>
      <title>Re: Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120805#M281</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;David brings up some good points, but I thought I would share our experience as well.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We've had Epic since 2006, which was before some of the newer features that make it easier to track changes in Clarity for the Cogito Data Warehouse. We had a homegrown EDW that needed to incrementally load from the larger tables.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;The path they took: our Clarity admin tacked a custom column onto any tables without an update_date field that simply records the date of the extract from Chronicles.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Those fields have been very helpful for driving incrementals in QlikView. Once we go live with the Cogito Data Warehouse, we plan to investigate using CR_STAT_ALTER, which to our understanding will help with deletes and avoid a costly primary-key inner join on tables where rows can be deleted.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;If you do go the incremental route, I would suggest using parameterized external functions to handle your incremental loads rather than building an incremental script for each individual table. It helped make our process much smoother and easier to maintain.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 22 Jul 2016 12:21:54 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120805#M281</guid>
      <dc:creator>dclark0699</dc:creator>
      <dc:date>2016-07-22T12:21:54Z</dc:date>
    </item>
    <item>
      <title>Re: Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120806#M282</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;Hey Donnie,&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks for sharing your experience.&lt;/P&gt;&lt;P&gt;I will surely consider your point about using parameterized external functions.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;Thanks&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Fri, 22 Jul 2016 12:39:54 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120806#M282</guid>
      <dc:creator>neelamsaroha157</dc:creator>
      <dc:date>2016-07-22T12:39:54Z</dc:date>
    </item>
    <item>
      <title>Re: Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120807#M283</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;+1 for David's "other things to think about" comments.&amp;nbsp; And agreed: we have had a good experience leveraging CR_STAT_ALTER to keep our objects in sync (even CLARITY_TDL_TRAN).&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 28 Jul 2016 16:33:12 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120807#M283</guid>
      <dc:creator>jstemig1</dc:creator>
      <dc:date>2016-07-28T16:33:12Z</dc:date>
    </item>
    <item>
      <title>Re: Incremental load with Clarity tables</title>
      <link>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120808#M284</link>
      <description>&lt;HTML&gt;&lt;HEAD&gt;&lt;/HEAD&gt;&lt;BODY&gt;&lt;P&gt;We use QVD files as our data layer (Clarity to QlikView QVDs, not another data tool).&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We extract tables in their raw form from Clarity and then process them into our staging and final-stage environment using QlikView (and more QVD files).&amp;nbsp; We found this to be much faster and lower maintenance than views in Clarity (which we tried first).&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;For the dimensional-type data, we do complete extracts daily.&amp;nbsp; For many of the fact-based tables, we do not do a FULL reload; instead we reload any of the transactions from the last 6 months each night.&amp;nbsp; Financials are the exception, as those extracts are incremental based on Posted Dates.&amp;nbsp; Each day of the week, we do a full reload of a past year for all transactional data sets (including financials) to catch any of those changes.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We had issues with incremental loads when we encountered scenarios where the flags were not set appropriately in Clarity (in fact, we have seen examples where Epic did not flag correctly to push changes to Clarity).&amp;nbsp; From the replies above, it sounds like these issues may be less prevalent today.&lt;/P&gt;&lt;P&gt;&lt;/P&gt;&lt;P&gt;We do not use Cogito Kaboodle as our data warehouse, but as a supplement to it.&amp;nbsp; If we were going live today, we would likely give Kaboodle a serious look; however, it is not (yet) worth rebuilding our entire data warehouse.&lt;/P&gt;&lt;/BODY&gt;&lt;/HTML&gt;</description>
      <pubDate>Thu, 04 Aug 2016 19:50:36 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Archived-Groups/Incremental-load-with-Clarity-tables/m-p/1120808#M284</guid>
      <dc:creator>Anonymous</dc:creator>
      <dc:date>2016-08-04T19:50:36Z</dc:date>
    </item>
  </channel>
</rss>