Hi Qlik Community,
I'm facing an issue when trying to join two tables. One table contains data grouped by product, while the other holds similar data that is broken down further by EAN code.
The problem occurs when I select a catalog name: my sales show correctly, but my purchases disappear. I believe this is caused by a granularity mismatch introduced by the EAN-level breakdown.
Here's a simplified example of my data model. In the real scenario there are more fields and dimensions, but the issue is already visible here.
I'd appreciate any guidance on how to connect these two tables properly. I need the EAN field on another sheet, so I can't delete it.
My test data:
PURCHASES:
LOAD * INLINE [
"SHOP ID", YEAR, MONTH, "GROUPING LVL 4", PURCHASES
01, 2025, Jan, 1001, 3200
02, 2025, Jan, 2001, 4100
01, 2025, Feb, 1001, 2800
02, 2025, Feb, 2001, 5300
01, 2025, Mar, 1001, 4700
02, 2025, Mar, 2001, 3500
];
SALES:
LOAD * INLINE [
"SHOP ID", YEAR, MONTH, "START DATE", "END DATE", "GROUPING LVL 4", "GROSS SALES", "CATALOG NO", "CATALOG NAME", EAN
01, 2025, Jan, 2025-01-05, 2025-01-15, 1002, 1500, GZ001, WINTER PROMO, 5901234567890
02, 2025, Jan, 2025-01-05, 2025-01-15, 2002, 1700, GZ001, WINTER PROMO, 5901234567891
03, 2025, Jan, 2025-01-05, 2025-01-15, 2002, 1800, GZ001, WINTER PROMO, 5901234567892
01, 2025, Feb, 2025-02-01, 2025-02-10, 1002, 2400, GZ002, VALENTINE DEALS, 5901234567890
02, 2025, Feb, 2025-02-01, 2025-02-10, 2002, 2200, GZ002, VALENTINE DEALS, 5901234567891
03, 2025, Feb, 2025-02-01, 2025-02-10, 2002, 2800, GZ002, VALENTINE DEALS, 5901234567892
01, 2025, Mar, 2025-03-10, 2025-03-20, 1002, 3000, GZ003, SPRING OFFER, 5901234567890
02, 2025, Mar, 2025-03-10, 2025-03-20, 2002, 2800, GZ003, SPRING OFFER, 5901234567891
03, 2025, Mar, 2025-03-10, 2025-03-20, 2002, 2400, GZ003, SPRING OFFER, 5901234567892
];
GROUPING_DICTIONARY:
LOAD * INLINE [
"GROUPING LVL 4", "GROUPING LVL 1"
1001, 0001
1002, 0001
2001, 0002
2002, 0002
];
SHOP_DICTIONARY:
LOAD * INLINE [
"SHOP ID", "SHOP NAME"
01, SHOP A
02, SHOP B
03, SHOP C
];
What I need:
The data aggregated at grouping level 1, per year and per month, so that Qlik sums it up nicely, which it currently doesn't.
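For example, from the test data above I would expect a table like this (just a sketch of the intended result, assuming the measures are plain sums):

Dimensions: [GROUPING LVL 1], YEAR, MONTH
Measures:   Sum(PURCHASES), Sum([GROSS SALES])

Expected rows for January 2025:
0001, 2025, Jan  ->  purchases 3200, gross sales 1500
0002, 2025, Jan  ->  purchases 4100, gross sales 3500 (1700 + 1800)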
The problem here isn't your YEAR and MONTH, but your CATALOG NAME coming from your SALES table.
In your current data model there is no valid link from PURCHASES to SALES for certain grouping combinations.
You want to do an ApplyMap() based on grouping level 1 and drop level 4 from one of your sales/purchases tables.
MAP_GROUPING_DICTIONARY:
// Mapping table: GROUPING LVL 4 -> GROUPING LVL 1 (mapping tables are dropped automatically at the end of the reload)
MAPPING LOAD * INLINE [
"GROUPING LVL 4", "GROUPING LVL 1"
1001, 0001
1002, 0001
2001, 0002
2002, 0002
];
PURCHASES:
// Preceding load: map level 4 to level 1 and drop GROUPING LVL 4 from this table
LOAD
    "SHOP ID", YEAR, MONTH, PURCHASES,
    ApplyMap('MAP_GROUPING_DICTIONARY', "GROUPING LVL 4") as "GROUPING LVL 1";
LOAD * INLINE [
"SHOP ID", YEAR, MONTH, "GROUPING LVL 4", PURCHASES
01, 2025, Jan, 1001, 3200
02, 2025, Jan, 2001, 4100
01, 2025, Feb, 1001, 2800
02, 2025, Feb, 2001, 5300
01, 2025, Mar, 1001, 4700
02, 2025, Mar, 2001, 3500
];
SALES:
// Keep all fields (including EAN) and add the level 1 grouping alongside level 4
LOAD *,
    ApplyMap('MAP_GROUPING_DICTIONARY', "GROUPING LVL 4") as "GROUPING LVL 1";
LOAD * INLINE [
"SHOP ID", YEAR, MONTH, "START DATE", "END DATE", "GROUPING LVL 4", "GROSS SALES", "CATALOG NO", "CATALOG NAME", EAN
01, 2025, Jan, 2025-01-05, 2025-01-15, 1002, 1500, GZ001, WINTER PROMO, 5901234567890
02, 2025, Jan, 2025-01-05, 2025-01-15, 2002, 1700, GZ001, WINTER PROMO, 5901234567891
03, 2025, Jan, 2025-01-05, 2025-01-15, 2002, 1800, GZ001, WINTER PROMO, 5901234567892
01, 2025, Feb, 2025-02-01, 2025-02-10, 1002, 2400, GZ002, VALENTINE DEALS, 5901234567890
02, 2025, Feb, 2025-02-01, 2025-02-10, 2002, 2200, GZ002, VALENTINE DEALS, 5901234567891
03, 2025, Feb, 2025-02-01, 2025-02-10, 2002, 2800, GZ002, VALENTINE DEALS, 5901234567892
01, 2025, Mar, 2025-03-10, 2025-03-20, 1002, 3000, GZ003, SPRING OFFER, 5901234567890
02, 2025, Mar, 2025-03-10, 2025-03-20, 2002, 2800, GZ003, SPRING OFFER, 5901234567891
03, 2025, Mar, 2025-03-10, 2025-03-20, 2002, 2400, GZ003, SPRING OFFER, 5901234567892
];
SHOP_DICTIONARY:
LOAD * INLINE [
"SHOP ID", "SHOP NAME"
01, SHOP A
02, SHOP B
03, SHOP C
];
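If you want to double-check the numbers in the script rather than in a chart, a quick sanity check (a sketch only; the temporary table and field names are just examples) is to aggregate the mapped purchases by level 1 per year and month and compare the totals:

// Temporary sanity check: purchases rolled up to GROUPING LVL 1 per year and month.
// Inspect it in a table box or the data model viewer, then drop it again.
PURCHASES_CHECK:
LOAD
    "GROUPING LVL 1",
    YEAR,
    MONTH,
    Sum(PURCHASES) as "PURCHASES LVL1"
RESIDENT PURCHASES
GROUP BY "GROUPING LVL 1", YEAR, MONTH;

// DROP TABLE PURCHASES_CHECK;   // uncomment once the totals look right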
@-SW- thank you very much for your help 🙂