neelamsaroha157
Specialist II

Incremental Load

Hi Experts,

It's been a year since we moved to Epic, and we now plan to implement incremental logic for our Clarity table extracts. The issue is that we do not have the CR_UPD_DATE (row update time) functionality enabled.

My question is: what is the best way to implement incremental load on the Clarity tables?

1. Do we have to turn on the row update time?

     If yes, the row update column is not directly populated in the base tables; instead, a separate table is created with the primary key and the update time. Is this the only method, or is there a workaround?

2. Is there any other way to implement it?

Thanks in advance for your feedback and time!


7 Replies
neelamsaroha157
Specialist II
Author

Has anyone worked on incremental loads from Epic Clarity tables?

joe_warbington
Partner - Creator III

Hi Neelam - I just posed the question on the Epic Qlik Group on LinkedIn. I know this is common, so I'm hoping someone will join in. https://www.linkedin.com/groups/8269572/

Dalton (drr) did a session on incremental loads that you may want to check out as well:

A couple of related posts:

Incremental Update of Data with Update Field

Check for existing .qvd file in load script.

neelamsaroha157
Specialist II
Author

Thank you jaw for sharing these valuable resources. I'll follow the LinkedIn group for more discussion.

m2r4miller
Contributor III

We've been on Epic since 2010 and using QlikView since around 2012.

Every day we pull multiple data sets from Clarity into QVDs for Lab, Appointments and Scheduling, financial apps, etc. Some of our QVDs contain several million rows, and we have not seen a need to implement any incremental load logic.

We do, however, limit our QVD builds by date parameters, for example only going back two years to limit the number of records loaded.
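
For illustration, a date-limited pull looks something like this (the table, field, and path names here are placeholders, not our actual objects):

// Pull only the last two years of encounter rows from Clarity.
Encounters:
LOAD
    "PAT_ENC_CSN_ID",
    "PAT_ID",
    "CONTACT_DATE";
SQL SELECT
    "PAT_ENC_CSN_ID",
    "PAT_ID",
    "CONTACT_DATE"
FROM Clarity.dbo."PAT_ENC"
WHERE "CONTACT_DATE" >= DATEADD(YY, -2, GETDATE());

STORE Encounters INTO [lib://QVD Path (service_account)/Encounters.qvd] (qvd);
DROP Table Encounters;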

I would think that, due to the nature of Clarity, incremental loads are going to be difficult to implement, because I don't think you can trust Clarity to always properly flag records as added vs. updated, etc.

I'm sure there are others out there that have tackled the problem of incremental loads from Clarity, and I also would be interested in learning how the idiosyncrasies of Clarity were addressed.

neelamsaroha157
Specialist II
Author

Thank you Mark for your feedback.

I agree with your point that it might be difficult to perform incremental loads. The stored procedures that are turned on provide you with the primary key of the table and the row update time for that key; the row update time is not added as a column to the original tables (from what I have learned from our Clarity team), and that is where the challenge is.
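
If we did turn the tracking on, I imagine the extract could join to that key table to find changed rows, something like this (all table, field, and path names below are made up):

// vLastRun would be persisted between runs; hard-coded here for the sketch.
LET vLastRun = '2018-01-01 00:00:00';

// Pull only base-table rows whose key appears in the hypothetical
// UPDATE_TRACKING table with a row update time since the last run.
FactTable:
LOAD "IDKEY", "NAME", "AMOUNT";
SQL SELECT T."IDKEY", T."NAME", T."AMOUNT"
FROM Clarity.dbo."ClarityTableName" T
INNER JOIN Clarity.dbo."UPDATE_TRACKING" U ON U."IDKEY" = T."IDKEY"
WHERE U."ROW_UPD_TIME" >= '$(vLastRun)';

// Keep the unchanged history from the existing QVD.
Concatenate (FactTable)
LOAD "IDKEY", "NAME", "AMOUNT"
FROM [lib://Extract QVD Path (service_account)/FactTable.qvd] (qvd)
Where not Exists(IDKEY);

STORE FactTable INTO [lib://Extract QVD Path (service_account)/FactTable.qvd] (qvd);
DROP Table FactTable;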

Waiting to hear from other experts...

tbrancazio
Luminary Affiliate

We follow a similar process to the one described in the video Basic Incremental Load - Qlik Sense - YouTube on our professional billing and several hospital billing fact tables, which are extracted daily from Clarity into QVDs containing hundreds of millions of records.

We hit the database connection each morning to pull the past two days of activity and concatenate that data to the existing QVD for each table. We go back two days in case there were any Clarity load failures, although our environment is very stable with regard to the nightly ETL. Under that load, we concatenate the existing fact table QVD with a where clause of Where not Exists(ID_FIELD); example below.

We run a full data load once a week on Sunday to repopulate the QVDs, somewhat as a precaution.

We set this up over a year ago and there may be better ways of going about it now, but it has been working for us.

Hope this helps.

Thanks

FactTable:
// Pull the last two days of activity from Clarity.
LOAD
    "IDKEY",
    "NAME",
    "LOCATION",
    "AMOUNT";
SQL SELECT
    "IDKEY",
    "NAME",
    "LOCATION",
    "AMOUNT"
FROM Clarity.dbo."ClarityTableName"
WHERE "DATE" >= DATEADD(DD, -2, DATEDIFF(DD, 0, GETDATE()));

// Append the history from the existing QVD, keeping only keys
// we did not just reload from the database.
Concatenate (FactTable)
LOAD
    "IDKEY",
    "NAME",
    "LOCATION",
    "AMOUNT"
FROM [lib://Extract QVD Path (service_account)/FactTable.qvd] (qvd)
Where not Exists(IDKEY);

// Overwrite the QVD with the refreshed table.
STORE FactTable INTO [lib://Extract QVD Path (service_account)/FactTable.qvd] (qvd);
DROP Table FactTable;
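
To combine the Sunday full load and the daily incremental pass in one script, one option is to branch on the day of the week, or on whether the QVD exists yet. A rough sketch, with the same placeholder names as above:

// Full reload on Sundays or when the QVD is missing; otherwise incremental.
IF WeekDay(Today()) = 'Sun' or IsNull(FileTime('lib://Extract QVD Path (service_account)/FactTable.qvd')) THEN

    // Full load: no date filter.
    FactTable:
    LOAD "IDKEY", "NAME", "LOCATION", "AMOUNT";
    SQL SELECT "IDKEY", "NAME", "LOCATION", "AMOUNT"
    FROM Clarity.dbo."ClarityTableName";

ELSE

    // Incremental: the last two days from the database...
    FactTable:
    LOAD "IDKEY", "NAME", "LOCATION", "AMOUNT";
    SQL SELECT "IDKEY", "NAME", "LOCATION", "AMOUNT"
    FROM Clarity.dbo."ClarityTableName"
    WHERE "DATE" >= DATEADD(DD, -2, DATEDIFF(DD, 0, GETDATE()));

    // ...plus the history we did not just reload.
    Concatenate (FactTable)
    LOAD "IDKEY", "NAME", "LOCATION", "AMOUNT"
    FROM [lib://Extract QVD Path (service_account)/FactTable.qvd] (qvd)
    Where not Exists(IDKEY);

END IF

STORE FactTable INTO [lib://Extract QVD Path (service_account)/FactTable.qvd] (qvd);
DROP Table FactTable;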

neelamsaroha157
Specialist II
Author

Thank you, Tom, for your input; it is very helpful and a great idea.

While reading your reply, my question was that there are some non-ID fields that do change over time, like coding or patient class, which would not be captured using the primary key fields alone. Changes to rows older than the two-day window would be missed by the daily pass, so the full load on the weekend can address that, although we might have to wait up to a week to get the most updated data in that case. I was wondering if you have come across similar scenarios.

Thanks a ton again for your time!