Hello everyone,
Video Demonstration: https://youtu.be/jA1dnwi--UQ
Problem Overview
I am encountering an issue where accumulated SCD Type 2 history in the Data Warehouse (DWH) is completely wiped out after performing a Full Load in Qlik Compose. This happens in a specific scenario where the source system drops and recreates tables during software updates, changing the Object ID and requiring a re-sync.
The Scenario
Source: MS SQL Server. The table has no PK but an index is available, so MS-CDC is used.
CDC/Ingestion: Qlik Replicate with Global Transformation rules to capture the commit timestamp and operation type, adding two fields:
Operation
CASE
WHEN $AR_H_STREAM_POSITION = ''
THEN NULL
ELSE operation_indicator("d", "u", "i")
END
OperationDateTime
CASE WHEN
operation_indicator("Y","Y","Y") = 'Y'
THEN
$AR_H_COMMIT_TIMESTAMP
ELSE NULL END
DWH Automation: Qlik Compose with a field (e.g., data) set to History Type 2 to track changes over time.
Steps to Reproduce
Initial Setup: Perform an initial full load to the DWH. Then make several updates to the record (e.g., update2, update3, update4) and run the CDC task in Qlik Compose to build history in the DWH.
Verify History: At this stage, the DWH correctly shows multiple historical rows with valid From Date and To Date intervals for the value in data.
Source Change: Simulate a source software update where the table is renamed/recreated, resulting in a new Object ID.
Full Load: Re-run the Full Load task in Qlik Compose to synchronize with the "new" source table.
The Issue (Actual Result)
After the Full Load finishes, all intermediate historical records (the update versions) are deleted from the DWH table. Only the very first record (initial load) and the latest record from the recent Full Load remain. All historical context in between is lost.
Questions for the Community
Is this "history wiping" expected behavior during a Full Load for SCD Type 2 tables in Qlik Compose?
How can I perform a Full Load from a recreated source table while preserving the existing historical rows already stored in the DWH?
Are there specific History Settings or Task Settings in Compose that allow "appending" or "merging" new Full Load data into existing Type 2 history without deleting intermediate states?
OK, so my suggestion was backwards. Sorry, doing this over text and not in person. All the header timestamps are using local time, so you should set Compose to use local server time (not UTC).
When you run a full load in Compose, it doesn't truncate and reload data. It performs a comparison between the source and the target data.
You state the 'ObjectID' changes. I assume you mean the internal database object_id? (Like in SQL Server, which wouldn't matter to Compose.)
When this occurs as you stated, you need to reload Replicate due to the objectid changing in the source system.
But Compose is just performing a SELECT * FROM MYTABLE. It has no knowledge of the object ids in the source, nor in the 'landing' area it processes data from. If you run a FULL LOAD task in Compose, it does not truncate or typically wipe history. Now, where there could be a difference is if you mapped the FROM DATE field from your source.
Where Compose does 'wipe' history: if there is back-dated data in there, then Compose will re-correct / re-apply the history for that specific KEY (natural key).
If that is what you are doing with the operation date, I'm curious why you would not leave the FROM-DATE alone. It would be mapped automatically to the from date that Replicate provides during CDC processing, and to the runtime of the FULL LOAD task during full load processing.
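The back-dated re-apply behavior described above can be sketched as a toy Python model (purely illustrative, not Compose's actual code; the version structure and values are made up): once a record for a natural key arrives with a FROM date earlier than versions already stored, every version from that point forward is discarded and re-derived, which is exactly the "lost history" symptom.

```python
from datetime import datetime

def apply_version(history, key, value, from_date):
    """Toy sketch of SCD Type 2 re-apply with back-dating.

    history: list of dicts {key, value, from_date, to_date}, sorted by from_date.
    A new record whose FROM date predates existing versions of the same
    natural key causes all later versions of that key to be discarded.
    """
    # Keep other keys, and versions of this key strictly older than from_date
    kept = [v for v in history if v["key"] != key or v["from_date"] < from_date]
    dropped = [v for v in history if v["key"] == key and v["from_date"] >= from_date]
    # Close the most recent surviving version of this key at the new FROM date
    prior = [v for v in kept if v["key"] == key]
    if prior:
        prior[-1]["to_date"] = from_date
    kept.append({"key": key, "value": value, "from_date": from_date, "to_date": None})
    return kept, dropped

history = [
    {"key": 1, "value": "initial", "from_date": datetime(2025, 12, 30, 10, 0),
     "to_date": datetime(2025, 12, 30, 14, 36)},
    {"key": 1, "value": "update2", "from_date": datetime(2025, 12, 30, 14, 36),
     "to_date": None},
]
# A full load whose FROM date (12:40) predates the 14:36 CDC version
# wipes the 14:36 version even though the full load ran later:
new_hist, dropped = apply_version(history, 1, "full-load", datetime(2025, 12, 30, 12, 40))
print([v["value"] for v in dropped])  # → ['update2']
```

This is why an unmapped FD (which defaults to the task runtime, always moving forward) never triggers the wipe, while a mapped or timezone-shifted FROM date can.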
Hello Tim,
Thank you for your attention to my issue.
Yes, I mean the object_id of the new tables in the source MS SQL system.
Can you explain a little bit more about FROM-DATE? Maybe with an example.
In your Compose mappings, do you have the FD mapped? FD = From Date, which is the "from" for the Type 2 processing. Typically this is NOT mapped in Compose, and a full load uses the current date, and thus does not remove history. However, if you have it mapped and it's getting a NULL or a back-date, that could explain why you 'lose' history.
How can I track deletions in this case? Should I just use a soft delete approach like in this document?
https://community.qlik.com/cyjdu72974/attachments/cyjdu72974/qlik-support-knowledge-base/5312/3/Qlik...
Why do I use operationdatetime? Because in some cases the source system software does not update a record; it deletes it and inserts a new one with the same id.
Here is my mapping
What does the model look like? Did you set this to be a Type 2 structure in the model?
The good news: you haven't mapped the FD date, so you are getting the default behavior. Compose should not be removing your Type 2 history here.
I really don't want to waste your precious time, but here is the full video showing what I'm setting up and how it turns out in the end. Video Demonstration: https://youtu.be/jA1dnwi--UQ
model in DWH is
OK, so my guess is that the issue is a date/datetime mismatch between your source and your Compose server.
Note in your video: the Type 2 timestamps are 2025-12-30 14:36.
Then, after you processed the data, the FD for the new record is 12:40.
This is the back-dating behavior in Compose. For a full load, we don't know the time of the change, so we use a project setting to determine whether to use UTC or local server time. I suspect you have a setting that does not match how Replicate operates. (Replicate typically uses UTC time for the header dates.)
What do you have set in the Compose project settings for Current Time Convention? Replicate FROM dates are typically in UTC, so you should make sure you have UTC selected in your project. Otherwise, Compose is seeing an earlier FROM DATE, thus thinking you are reprocessing data, and it wipes out 'future' history.
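The failure mode can be shown with plain datetimes (a minimal sketch; the UTC+2 offset is an assumption matching the poster's timezone, and the times are illustrative). If CDC FROM dates land in the warehouse as local wall-clock values while the full load stamps UTC, the full load's FROM date compares as earlier even though it ran later:

```python
from datetime import datetime, timedelta

TZ_OFFSET = timedelta(hours=2)  # assumed server offset from UTC (UTC+2)

# Naive timestamps as stored in the warehouse (no timezone info attached):
cdc_from = datetime(2026, 1, 6, 21, 55)  # last CDC FROM date, local wall clock

# A full load run 5 minutes later, but stamped with the UTC wall clock:
full_load_from = cdc_from + timedelta(minutes=5) - TZ_OFFSET

# The naive comparison says the full load is OLDER than the CDC version,
# so Compose treats it as back-dated and re-derives the newer history:
print(full_load_from < cdc_from)  # → True
```

The same logic explains the 14:36 vs 12:40 pair seen in the video: the two-hour shift turns a later run into an apparent back-date.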
Qlik Replicate and Qlik Compose are installed on the same machine.
All databases are located on the same DB server.
In the Compose project, Current time convention is set to Current time in UTC (the default).
I tried adding a few updates, and here is what I see in the t1_ct table:
header__timestamp = 2026-01-06 19:55:36.623000 -- this is my local time
but operationdatetime has 2026-01-06 21:55:27.947000 -- my local time is UTC+2
The operationdatetime expression is:
CASE WHEN
operation_indicator("Y","Y","Y") = 'Y'
THEN
$AR_H_COMMIT_TIMESTAMP
ELSE NULL END
I don't know why I see local time in header__timestamp instead of UTC. I'll try to fix it.
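As a quick sanity check, the gap between the two timestamps above is almost exactly two hours, which points to a timezone offset between the columns rather than real elapsed processing time (values copied from the post; the interpretation is an assumption):

```python
from datetime import datetime

# Timestamps as seen in the t1_ct table
header_ts = datetime.fromisoformat("2026-01-06 19:55:36.623000")
op_ts = datetime.fromisoformat("2026-01-06 21:55:27.947000")

# The difference is ~2 hours (matching UTC+2), not a few seconds
# of real delay within a single change:
delta_hours = (op_ts - header_ts).total_seconds() / 3600
print(round(delta_hours, 2))  # → 2.0
```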
OK, let's run the CDC task in Compose.