
Nested ApplyMap!?
Hi,
I have a table with over 30 million records.
It is used to create multiple ApplyMap mapping tables, where the lookup field (the first field) is always the same for all of those ApplyMaps.
I cannot use the associative concept here.
Is there any way to optimize this so that the table is loaded only once while still keeping the ApplyMaps?
Appreciate any help.
Thanks !
Tags: qlikview_scripting
Accepted Solutions

Basically you have 30M rows in a dimension table. I believe you have resource concerns about joining the dimension table into the fact table, which is why you are using ApplyMap. But with ApplyMap you can add only one field per mapping table: if you want 5 fields, you need 5 mapping tables, assuming all the mapping tables share one common key field.
Instead of creating 5 mapping tables, create one mapping table whose first field is your key field and whose second field is the concatenation of all 5 fields that need to be added to the fact table.
Map_Dim:
Mapping Load
    KeyID,
    Dim1 & '|' & Dim2 & '|' & Dim3 & '|' & Dim4 & '|' & Dim5 as Value
From Dimension;
Then use the mapping table above to add the concatenated dimension values to the fact table, and separate the values with the SubField() function.
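A minimal sketch of that second step (the fact table name, source name, and `FactSource` are placeholders here, not from the original post):

```qlikview
Fact:
// Preceding load: the upper LOAD splits the concatenated value
// produced by the ApplyMap() call in the lower LOAD.
LOAD
    KeyID,
    SubField(Value, '|', 1) as Dim1,
    SubField(Value, '|', 2) as Dim2,
    SubField(Value, '|', 3) as Dim3,
    SubField(Value, '|', 4) as Dim4,
    SubField(Value, '|', 5) as Dim5;
LOAD
    KeyID,
    ApplyMap('Map_Dim', KeyID, Null()) as Value  // Null() when the key is not found
From FactSource;  // hypothetical fact source
```

Note that the pipe character must not occur inside the dimension values themselves, or SubField() will split in the wrong place; pick a delimiter that cannot appear in the data.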


So is your 30-million-record table a mapping table, or some other fact/dimension table?

I don't think so.
You can try storing the fields you need (the first field + ...) in a QVD and then loading from that QVD many times (for the mapping tables) instead of doing many resident loads.

Would you mind sharing your script?

Hi,
It's not a complex script, actually; it's just the data that is huge!
For your assistance I have mocked it up:
TabA:
LOAD
    Key_ID,
    Sys_Key,
    Sys_Name,
    Sys_Description
From Source; // placeholder for the actual source
GetSysKey:
Mapping Load
Key_ID,
Sys_Key
resident TabA;
GetSysName:
Mapping Load
Key_ID,
Sys_Name
resident TabA;
...and so on for the remaining fields.
These mapping tables will be used to fetch the respective fields. For all the mapping tables, 'Key_ID' is the matching key.
Does that help?

Hi,
It can be taken as a Dimension Table.


It's not possible, I guess.
Your mapping table should always have exactly 2 fields and no more than that.
So if you want different mapping fields for different ApplyMaps from the same 30-million-record table, you have to load it multiple times as multiple mapping tables.
As massimop suggested, you can store the 30M table in a QVD and load the required field into each mapping table.
Thanks

As maxgro suggested, you could retrieve them from your QVD, which could be more efficient than a resident read:
GetSysKey:
Mapping Load
Key_ID,
Sys_Key
from yourqvd(qvd);
GetSysName:
Mapping Load
Key_ID,
Sys_Name
from yourqvd(qvd);
and so on

You can try with one mapping table like the one below and extract the values with the SubField() function.
MAP_FIELDSTRING:
Mapping LOAD
    KEY_ID,
    Sys_Key & '|' & Sys_Name & '|' & Sys_Description AS ConcatedString
From Source;

// Preceding load: the upper LOAD splits the concatenated string
// that the lower LOAD fetches via ApplyMap().
LOAD
    KEY_ID,
    SubField(ConcatedString, '|', 1) as Sys_Key,
    SubField(ConcatedString, '|', 2) as Sys_Name,
    SubField(ConcatedString, '|', 3) as Sys_Description;
LOAD
    KEY_ID,
    ApplyMap('MAP_FIELDSTRING', KEY_ID, Null()) AS ConcatedString
From AnotherSource;

I suggested loading from a QVD because it is usually faster than a resident load but, after some tests, it seems that creating the mapping tables with a resident load is faster than creating the same tables from a QVD.
The reason seems to be that a Mapping Load from a QVD isn't optimized (the same load without Mapping takes 00:00:35).
This is the log with the timings, for 10M (not 30M) records:
12/08/2015 21:25:16
TabA << AUTOGENERATE(10000000) 10.000.000 lines fetched
12/08/2015 21:26:24
time create TabA=00:01:08
12/08/2015 21:26:24
12/08/2015 21:26:54
time create qvd=00:00:30
12/08/2015 21:26:54
GetSysKey << TabA 10.000.000 lines fetched
GetSysName << TabA 10.000.000 lines fetched
GetSysDescription << TabA 10.000.000 lines fetched
12/08/2015 21:28:41
time create mapping table resident=00:01:47
12/08/2015 23:36:08
GetSysKey << Taba 10.000.000 lines fetched
GetSysName << Taba 10.000.000 lines fetched
GetSysDescription << Taba 10.000.000 lines fetched
12/08/2015 23:38:18
time create mapping table qvd=00:02:10
And this is the script, if someone else wants to check it on their PC (you have to comment/uncomment the script to test the two different ways):
SET ThousandSep=',';
SET DecimalSep='.';
SET MoneyThousandSep=',';
SET MoneyDecimalSep='.';
SET MoneyFormat='€ #.##0,00;-€ #.##0,00';
SET TimeFormat='hh:mm:ss';
SET DateFormat='DD/MM/YYYY';
SET TimestampFormat='DD/MM/YYYY hh:mm:ss[.fff]';
SET MonthNames='gen;feb;mar;apr;mag;giu;lug;ago;set;ott;nov;dic';
SET DayNames='lun;mar;mer;gio;ven;sab;dom';
//////////// make 10M records qvd
//let start=now(1); trace $(start);
//
//TabA:
//load
//rowno() as Key_ID,
//'key ' & rowno() as Sys_Key,
//Hash128(rand()*100000) as Sys_Name,
//Hash128(rand()*10000) as Sys_Description
//AutoGenerate 10000000;
//
//let end=now(1); trace $(end);
//let diff= interval(rangemax('$(end)')-rangemax('$(start)'));
//TRACE time create TabA=$(diff);
//
//////////// store qvd
//let start=now(1); trace $(start);
//
//store TabA into TabA.qvd (qvd);
//
//let end=now(1); trace $(end);
//let diff= interval(rangemax('$(end)')-rangemax('$(start)'));
//TRACE time create qvd=$(diff);
//
//////////// mapping from resident
//let start=now(1); trace $(start);
//
//GetSysKey:
//Mapping Load
//Key_ID,
//Sys_Key
//resident TabA;
//
//GetSysName:
//Mapping Load
//Key_ID,
//Sys_Name
//resident TabA;
//
//GetSysDescription:
//Mapping Load
//Key_ID,
//Sys_Description
//resident TabA;
//
//let end=now(1); trace $(end);
//let diff= interval(rangemax('$(end)')-rangemax('$(start)'));
//TRACE time create mapping table resident=$(diff);
//
//drop table TabA;
//////////// mapping from qvd
let start=now(1); trace $(start);
GetSysKey:
Mapping Load
Key_ID,
Sys_Key
from TabA.qvd (qvd);
GetSysName:
Mapping Load
Key_ID,
Sys_Name
from TabA.qvd (qvd);
GetSysDescription:
Mapping Load
Key_ID,
Sys_Description
from TabA.qvd (qvd);
let end=now(1); trace $(end);
let diff= interval(rangemax('$(end)')-rangemax('$(start)'));
TRACE time create mapping table qvd=$(diff);
