DavidJG
Contributor III

round value to custom thresholds

Hi everyone, 

I have a problem where I want to round a value up (weight in my case, which can be any value) to specified thresholds. This is because I want to map a numeric value to a "group" so I can later map the weight group to a cost.

Map:
Mapping LOAD * Inline [
weight_threshold, cost
50, 111
100, 222
250, 333
500, 444
];

So I want a value of "5" to be rounded up to "50", and likewise "51" to be rounded up to "100".
So I need to do this rounding on the "weight" column before using it in ApplyMap().

Result:
LOAD *,
ApplyMap('Map', weight, null()) as cost
FROM something;

Normally, outside of Qlik, I would loop through the list of thresholds to find the first threshold bigger than my current value and set the current value to that threshold; and if the value is bigger than the biggest threshold, set it to the highest threshold.

I would greatly appreciate any suggestions or hints on how I can solve this problem.

Thanks
/David


5 Replies
DavidJG
Contributor III
Author

This works, but it looks awful and is probably problematic. And I have more weight groups than this as well.

Result:
LOAD *,
    ApplyMap('Map',
        // round up to the first threshold >= weight; anything above 250 gets 500
        if(weight <= 50, 50,
        if(weight <= 100, 100,
        if(weight <= 250, 250, 500))), null()) as cost
FROM something;

marcus_sommer (Accepted Solution)

I think a classical rounding won't be possible. An approach like your nested if-loop could be a bit more readable with a pick(match()) replacement, whereby it wouldn't change the logic.
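
For illustration, a minimal sketch of that pick(match()) idiom, using the table and field names from the original post (the alt() fallback for weights above the largest threshold is an added assumption):

Result:
LOAD *,
    // match(-1, ...) returns the position of the first condition that is
    // true (-1 in Qlik script); pick() selects the matching threshold and
    // alt() catches weights above 500
    ApplyMap('Map',
        alt(pick(match(-1, weight <= 50, weight <= 100, weight <= 250, weight <= 500),
                 50, 100, 250, 500), 500), null()) as cost
FROM something;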

Easier than this may be to cluster the weight with num(class(weight, 50)), but for different cluster sizes you would again need if-loops (though probably not as many as before), or you could extend your mapping source appropriately.
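
As a small sketch of that class() clustering (my addition, not from the original reply): class() returns a dual whose numeric value is the lower bound of a half-open bucket, so adding the bucket size rounds up. Note the boundary behavior, though: a weight of exactly 50 falls into the next bucket, unlike the inclusive thresholds above.

LOAD *,
    // class(weight, 50) buckets into 50-wide intervals '0 <= x < 50' etc.;
    // num() takes the lower bound of the bucket, +50 turns it into the upper bound
    num(class(weight, 50)) + 50 as weight_rounded
FROM something;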

Personally I would tend to prepare/extend the mapping table so that all needed values are included. That's rather an easy job with a while loop approach like the following:

Map:
Mapping LOAD value, cost;  // project down to the two fields a mapping needs
LOAD *, ([weight_start] + iterno()) as value
    While [weight_start] + iterno() <= [weight_threshold];
// previous() reads the prior input row, so each row knows where its range starts
LOAD *, alt(num#(previous([weight_threshold])), 0) as [weight_start] Inline [
weight_threshold, cost
50, 111
100, 222
250, 333
500, 444
];
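
With the map expanded this way, the lookup needs no rounding logic at all; a short usage sketch (the ceil() call is my assumption, since weight can be fractional while the map only holds integer keys):

Result:
LOAD *,
    // every integer key up to 500 is now in the map; weights above 500
    // fall through to the default null()
    ApplyMap('Map', ceil(weight), null()) as cost
FROM something;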

- Marcus

DavidJG
Contributor III
Author

Neat solution and thanks for the tips!

DavidJG
Contributor III
Author

My biggest weight threshold is 6000 (possibly 10k in the future) and I have a total of 12 thresholds (~18 in the future). My question here is whether it's reasonable to have a mapping table this large (mapping 6k keys to 12 different values). By "reasonable" I am of course thinking about memory/speed/efficiency/nice-looking code.

marcus_sommer

I think you won't get any performance challenge with such a small mapping. Mapping in Qlik is very fast, and AFAIK no other measure will be faster, more efficient, easier to use or more readable than mapping.

I use it quite often, even with millions of records and many mappings, sometimes nested multiple times (applying the next mapping within the non-matching default parameter, as well as string-concatenating the return values and separating them again with subfield()), and it usually works very smoothly. Of course such heavy transformations have a noticeable impact, but compared with the alternatives it's simpler and faster.
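
A minimal sketch of that nesting pattern, with hypothetical map names m1 and m2 (not from the thread): the second mapping only takes effect when the first one does not match, because it sits in the default parameter.

LOAD *,
    // if key is missing from m1, the default expression is used,
    // which tries m2 next; null() is the final fallback
    ApplyMap('m1', key, ApplyMap('m2', key, null())) as value
FROM something;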

Therefore I wouldn't try to optimize this mapping/ApplyMap. Nevertheless, you may try it by combining the above-mentioned class() method with the mapping, maybe by multiplying the iterno() with the bucket size (a spelled-out sketch follows at the end of this reply), like:

... + (iterno() * 50)

and/or combining it with some direct rounding and/or using several mappings which are then nested, maybe:

ApplyMap(pick(ceil(len(weight)/3), 'm1', 'm2', 'm3'), ...)

or similar measures with which you could reduce the mapping size. With around 6k-10k records, which might be reduced to maybe 10% of that, I doubt you will see a difference in the run times, especially as any rounding/clustering/switching of the lookup value also needs processing. Even with millions of mapping records I would only look for optimizations if the overall run time of the whole script becomes the bottleneck.
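
Spelling out the iterno() * 50 idea from above (the 50-wide step and the ceil() lookup are my assumptions): the map then only holds multiples of 50, and the lookup value is rounded up to the next multiple of 50 first.

Map:
Mapping LOAD value, cost;
LOAD *, ([weight_start] + (iterno() * 50)) as value
    While [weight_start] + (iterno() * 50) <= [weight_threshold];
LOAD *, alt(num#(previous([weight_threshold])), 0) as [weight_start] Inline [
weight_threshold, cost
50, 111
100, 222
250, 333
500, 444
];

Result:
LOAD *,
    // ceil(weight/50)*50 rounds up to the next multiple of 50,
    // which is guaranteed to be a key in the coarser map
    ApplyMap('Map', ceil(weight/50) * 50, null()) as cost
FROM something;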

- Marcus