jtompkins
Creator

Soft delete logic in a transactional data mart

We have implemented soft delete logic as proposed in the white paper. It works fine with our state-oriented fact tables because they get rebuilt, but we are wondering how best to implement it for a transactional data mart. We added a last active date column to our model that defaults to sysdate, and a post-load ETL step then updates the deleted flag in the warehouse tables wherever the last active date is not the max date value. The problem is that these post-load updates are not reflected in the data mart for transactional fact tables; the fact table simply retains its stale value because it thinks there is nothing to update or insert. Is there a suggested way to work around this, so that records marked as deleted in the warehouse are reflected as such in the data mart, without essentially doing the same work twice (once at the warehouse level and again at the data mart level)?
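For reference, the post-load step we run looks roughly like this (table and column names are simplified placeholders for our actual model):

-- Rough sketch of the post-load soft-delete update described above;
-- HUB_CUSTOMER, deleted_flag and last_active_date are placeholder names.
UPDATE dwh.HUB_CUSTOMER
SET deleted_flag = 'Y'
WHERE last_active_date <> (SELECT MAX(last_active_date) FROM dwh.HUB_CUSTOMER);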

Thanks. 

5 Replies
TimGarrod
Employee

If you update a HUB in a Post-Load step, make sure you also update the RUNNO_UPDATE column. That is what Compose uses to detect changes for the transactional fact.

You can use a built-in "parameter"...

UPDATE HUB
SET custom_column = 'X', RUNNO_UPDATE = &&1

jtompkins
Creator
Author

Interesting. That is something we actually considered, but we initially leaned away from it because it seemed unwise to meddle with Attunity's processing logic, for fear it would introduce negative downstream effects.

TimGarrod
Employee

The data mart refresh process for dimensions and transactional facts is triggered by the RUNNO_UPDATE value. (Compose keeps track of the last DWH runno processed for each data mart.)

I think the only thing to be aware of is that the number of updates in the logging tables would no longer be in sync with what you've updated. You could increment the values in those tables in the DWH as well, but that gets a little trickier. Also be aware that updating that value can trigger data mart processing downstream for any and all data marts that leverage that entity.
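Putting the two together, a post-load update along these lines (placeholder table and column names; &&1 is the run-number parameter mentioned above) would flag the deletes and bump RUNNO_UPDATE so the transactional fact picks them up:

-- Sketch only: combine the soft-delete flag with the RUNNO_UPDATE bump
-- so downstream data marts detect the change. Names are placeholders.
UPDATE dwh.HUB_CUSTOMER
SET deleted_flag = 'Y',
    RUNNO_UPDATE = &&1
WHERE last_active_date <> (SELECT MAX(last_active_date) FROM dwh.HUB_CUSTOMER)
  AND deleted_flag = 'N';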

jtompkins
Creator
Author

That &&1 parameter cannot be used in a stored procedure, correct? 

TimGarrod
Employee

To be honest, I haven't tried, but you may be able to have Compose pass it in (I think it just does a text replace)...

E.g.:

EXEC myProc '&&1'

or

CALL myProc '&&1'
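If the substitution works that way, the procedure itself could just take the run number as an argument. A rough, untested SQL Server-flavored sketch (placeholder names):

-- Untested sketch: a procedure that flags soft deletes and stamps the
-- run number Compose passes in. Table and column names are placeholders.
CREATE PROCEDURE myProc @runno INT
AS
BEGIN
    UPDATE dwh.HUB_CUSTOMER
    SET deleted_flag = 'Y',
        RUNNO_UPDATE = @runno
    WHERE last_active_date <> (SELECT MAX(last_active_date) FROM dwh.HUB_CUSTOMER);
END;

-- Called from the Compose post-load step:
-- EXEC myProc '&&1'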