---------
Context
---------
We are working on a Qlik application for a justice-related client with a complex, historized data model.
The core entities are:
- Affair
- Party
- Event tables (e.g. Judgement, Ad Acta, etc.)
Each Event:
- is linked to one Affair
- is linked to one Party
In addition:
- an Affair can have multiple Parties
- different Events of the same Affair can concern different Parties
Example:
- Affair A1
* Parties P1 and P2
* Ad Acta → Party P1
* Judgement → Party P2
---------
Navigation requirement
---------
- If the user filters on an Affair, all Events related to that Affair must be visible.
- If the user filters on a specific Event:
-> the concerned Party must be visible
-> but also all other Events of the same Affair, even if they concern a different Party.
To support this, we considered a link table approach, containing:
- a “real” link (Event ↔ Party)
- and a “technical/fictitious” link (Event ↔ Affair, without Party) to allow navigation across Events of the same Affair.
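As a minimal load-script sketch of that link table (table and field names such as Events, EventID, AffairID and PartyID are illustrative assumptions, not our actual model):

LinkTable:
// "real" links: one row per Event <-> Party relationship
LOAD EventID,
     AffairID,
     PartyID
Resident Events;

// "technical" links: one row per Event <-> Affair with no Party,
// so selecting one Event still associates to all Events of its Affair
Concatenate (LinkTable)
LOAD EventID,
     AffairID,
     Null() as PartyID
Resident Events;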
---------
Temporal / historization requirement
---------
All tables in the model are historized:
- Each table contains ValidFrom and ValidTo fields
- Multiple records exist for the same business key to track changes over time
Business requirement:
- The user must be able to select a “situation date” (e.g. 26/01/2025)
- Qlik must then consider only records that are valid at that date, across all tables and relationships: ValidFrom <= SituationDate <= ValidTo (or ValidTo open-ended)
-> This temporal logic applies to:
- Affairs
- Parties
- Events
- Relationship tables
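For concreteness, the as-of condition in a single chart expression would look like this set-analysis sketch (vSituationDate is an assumed variable holding the selected date, and Amount a hypothetical measure; open-ended records would need their null ValidTo replaced by a far-future date at load time, since a null will not match the search):

Sum({<ValidFrom = {"<=$(vSituationDate)"}, ValidTo = {">=$(vSituationDate)"}>} Amount)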
---------
What we tried / considered
---------
- A disconnected “Situation Date” table combined with Set Analysis in all expressions
→ rejected due to maintainability and performance concerns, and the risk of inconsistency.
- Linking a central Situation Date table to all historized tables using IntervalMatch
→ leads to loops and ambiguous paths in the associative model.
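For reference, the IntervalMatch pattern we mean is roughly the following (names are illustrative); one such bridge table is needed per historized table, and with several of them the associative model develops loops:

SituationDates:
LOAD Date(MakeDate(2024,1,1) + RecNo() - 1) as SituationDate
AutoGenerate 731;

// bridge table associating each SituationDate with the valid intervals
AffairValidity:
IntervalMatch (SituationDate)
LOAD ValidFrom, ValidTo
Resident Affairs;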
- Centralizing validity dates in the link table
→ would require duplicating relationships for each validity period, leading to data explosion and unclear semantics between entity validity and relationship validity.
---------
Current challenge
---------
We are unable to find a modeling pattern that:
- avoids loops and synthetic keys,
- keeps the temporal filtering centralized,
- preserves correct navigation between Affair / Party / Events,
- and does not rely on heavy Set Analysis in every chart.
---------
Provided Material
---------
To illustrate the issue, we are providing a sample data model including:
- Affair, Party and Event tables
- historized records with ValidFrom / ValidTo
- link tables (real and technical relationships)
This sample model reproduces the navigation and temporal challenges described above.
---------
Questions
---------
1/ Is there an officially recommended modeling pattern in Qlik for:
- multiple historized fact tables,
- historized relationships,
- and a global “as-of date” filtering requirement?
2/ Is this type of model fully supported by the Qlik associative engine, or are there known limitations that require functional or UX compromises?
3/ Would Qlik recommend:
- reducing the number of historized tables,
- isolating temporal logic in a specific layer,
- other solutions?
Thank you in advance.
The recommended approach is to design the data model in the direction of a star schema, which means merging all facts into a single fact table by harmonizing the field names and data structures as much as possible, and storing the appropriate source and type information in a few extra fields.
This is mainly done with Concatenate statements, after the raw sources have been enriched, filled, cleaned and prepared via joins/mappings from the other facts/dimensions. That the facts are not completely synchronous and/or have different granularities is neither a technical nor a logical showstopper. In most scenarios this works very well.
The benefits are the simplicity of the model and the script, avoiding all the association trouble with synthetic keys and/or circular loops, and good performance (usually much better than with link-table approaches).
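As a hedged sketch of that concatenation (table names like Judgements and AdActa, and the field list, are assumptions based on the example above):

Facts:
LOAD AffairID,
     PartyID,
     EventDate,
     ValidFrom,
     ValidTo,
     'Judgement' as EventType,    // harmonized type field
     'Judgements' as Source       // harmonized source field
Resident Judgements;

Concatenate (Facts)
LOAD AffairID,
     PartyID,
     EventDate,
     ValidFrom,
     ValidTo,
     'Ad Acta' as EventType,
     'AdActa' as Source
Resident AdActa;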
Beside this, I'm not sure you really need to resolve the n from–to date ranges at all, because each source has an event date which should relate to them. If so, the from–to information might be moved out into dimension tables. If the ranges do need to be resolved into dedicated dates, I would apply an internal While loop instead of IntervalMatch loads. It means something like (using the ValidFrom/ValidTo field names from above):

LOAD *,
     Date(ValidFrom + IterNo() - 1) as Date
From X
While ValidFrom + IterNo() - 1 <= ValidTo;
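A self-contained illustration of that expansion with an inline table (assuming a DD/MM/YYYY date format):

SET DateFormat='DD/MM/YYYY';

Ranges:
LOAD * Inline [
Key, ValidFrom, ValidTo
A1, 24/01/2025, 26/01/2025
];

// one output row per day of each validity range:
// A1 gets rows for 24/01, 25/01 and 26/01/2025
Expanded:
LOAD Key,
     Date(ValidFrom + IterNo() - 1) as SituationDate
Resident Ranges
While ValidFrom + IterNo() - 1 <= ValidTo;

Selecting a single value in the resulting SituationDate field then filters all associated tables down to the records valid on that day.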