I am new to QlikView and looking for help with an issue I have encountered when linking QlikView to our internal support system.
There is one main FACT table (Support_Hd) containing a number of fields populated with internal reference values. These values are generated when a log is raised or edited and are based on entries in a lookup table.
// Load FACT Table
Support_Hd:
LOAD
`log_ref` AS STHD_LogReference,     // Primary key field
//Linking field - used to link to the Support_Dt table
`log_ref` AS KEY_LogReference,      // internal reference value
`log_stat` AS KEY_LogStatus,        // internal reference value
`Cont_Meth` AS KEY_ContactMethod,   // internal reference value
`Prod_lkp` AS KEY_ProductLookup,    // internal reference value
`Dept_lkp` AS KEY_DepartmentLookup  // internal reference value
;
SQL SELECT *
FROM `Support_Hd` where log_dt > {^2012-01-01};
I currently have to load the lookup table four times, once for each lookup field, which doesn't seem like the best or most optimised solution.
// Load LOOKUP Table
Lookup:
LOAD
`decode_ref` AS KEY_LogStatus, // internal reference value
code,
description,
tag
;
SQL SELECT *
FROM lookup;
I have attempted to use a MAPPING LOAD to apply the description to the main FACT table (Support_Hd), but the script fails because the two comparison fields do not match, even though they contain the same reference values.
Mapping Code below:
MAP_Lookup:
LOAD
//`decode_ref`,
applymap('MapName',`decode_ref`) AS log_stat,
`description`
;
SQL SELECT *
FROM chsysdec;
//...which is then used in the Support_Hd LOAD as:
ApplyMap('MAP_Lookup', `log_stat`) AS STHD_LogStatus,
Any help or solution would be much appreciated.
You have two choices here, and it depends on which server you want to put the load on.
1. Perform the joins in SQL. SQL is pretty efficient at basic joining, so you could join the lookup table 4 times in SQL (see the sketch just after this list). This would put more load on the SQL server, but unless you are dealing with huge volumes I would guess it's not significant.
2. Create 4 mapping tables in QlikView and use ApplyMap() as per your thinking. This would put more load on the QlikView server and extend the script execution time - although again it may not be significant.
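For option 1, the SQL could look roughly like the sketch below. This is only an outline, not your actual script: it assumes each reference field joins to the lookup table on decode_ref, and the alias names are mine, not from your system. If the same decode_ref values are reused for different lookup types you would also need to filter each join (for example on the tag column).
// Option 1 (sketch): join the lookup table once per reference field on the SQL side
Support_Hd:
LOAD *;
SQL SELECT
    hd.log_ref,
    hd.log_stat,
    ls.description AS LogStatus,
    cm.description AS ContactMethod,
    pr.description AS Product,
    dp.description AS Department
FROM Support_Hd hd
LEFT JOIN lookup ls ON ls.decode_ref = hd.log_stat
LEFT JOIN lookup cm ON cm.decode_ref = hd.Cont_Meth
LEFT JOIN lookup pr ON pr.decode_ref = hd.Prod_lkp
LEFT JOIN lookup dp ON dp.decode_ref = hd.Dept_lkp
WHERE hd.log_dt > {^2012-01-01};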
I would probably go for option 1 unless I wanted to use the mapping tables elsewhere in the script or had some more logic to apply to their creation. If you go for option 2, the script should look like:
//Create mapping tables
//First, load entire lookup table
Lookups:
LOAD * ; SQL SELECT....lookup...;
//NB: if the same decode_ref values are reused for different lookup types, add a WHERE
//clause to each map (e.g. on the tag column) so each one only picks up its own rows
Map_ContactMethod:
MAPPING LOAD DISTINCT
decode_ref,
description
RESIDENT Lookups;
Map_Product:
MAPPING LOAD DISTINCT
decode_ref,
description
RESIDENT Lookups;
Map_Department:
MAPPING LOAD DISTINCT
decode_ref,
description
RESIDENT Lookups;
DROP TABLE Lookups;
//Now load the main table, using the mapping functions
Support_Hd:
LOAD
log_ref,
log_stat,
ApplyMap('Map_ContactMethod',Cont_Meth,'<Unknown>' & Cont_Meth) AS ContactMethod, //this makes it clear to users where there is bad or missing data in the lookup table
ApplyMap('Map_Product',Prod_lkp,'<Unknown>' & Prod_lkp) AS Product,
ApplyMap('Map_Department',Dept_lkp,'<Unknown>' & Dept_lkp) AS Department
; SQL SELECT....Support_Hd....;
Hope this helps,
Jason
Where is your mapping table?
Have you created a mapping table like this:
MapTable:
Mapping LOAD * INLINE [
decode_ref, Description
1, 'Desc'
2, 'Desc'
];
Once you have created the mapping table, you apply it with ApplyMap() on the table you want, as you did before.
MAP_Lookup:
LOAD
//`decode_ref`,
applymap('MapTable',`decode_ref`) AS log_stat,
description
From TABLEXXX
;
This writes the value of the Description field (or whatever field you need) to each row where `decode_ref` matches an entry in MapTable.
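As a self-contained illustration with made-up values (the field names follow the post above; the data and the "Unknown" fallback are only for the example):
// Hypothetical decode table: reference value -> description
StatusMap:
MAPPING LOAD * INLINE [
decode_ref, Description
1, Open
2, Closed
3, On Hold
];
// Preceding load applies the map to some hypothetical log rows:
// 1 -> Open, 3 -> On Hold, 9 is not in the map so the third parameter is used instead
Logs:
LOAD
    log_ref,
    ApplyMap('StatusMap', log_stat, 'Unknown: ' & log_stat) AS LogStatusDesc
;
LOAD * INLINE [
log_ref, log_stat
1001, 1
1002, 3
1003, 9
];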
Hope this helps, sorry about my English
José
Thanks very much for the help
Regards
Frank
Thanks very much for your help.
Regards
Frank
If you have solved your problem please mark a correct answer to close the thread.
Cheers,
Jason