Hi there,
I've got a MAPPING LOAD inside a loop in order to pick up the customer name for transactions made in a range of dates. So, in that loop I've got a MAPPING LOAD for the customer_id, followed by a LOAD with an ApplyMap(). The problem is that the mapping doesn't work properly. I guess the problem is that a mapping within a loop concatenates the data gathered from each iteration. But since it is not possible to drop a mapping table, I don't know how to get around this problem. I've already tried adding a numeric value to the customer_id in each loop, but the tables are so large that performance is not acceptable. Any clues?
Thanks to all of you.
Can you send us the script?
Hi,
Maybe it's better to use a join instead?
- Ralf
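A hedged sketch of what Ralf means, with assumed table, field, file, and variable names (none of these come from Miguel's actual script): instead of rebuilding a mapping table on every pass of a loop, load the transactions for the date range once and LEFT JOIN the customer names onto them.

```
// Hypothetical sketch of the join alternative. vStartDate/vEndDate,
// the QVD file names, and all field names are assumptions.
Transactions:
LOAD transaction_id,
     customer_id,
     transaction_date
FROM transactions.qvd (qvd)
WHERE transaction_date >= '$(vStartDate)'
  AND transaction_date <= '$(vEndDate)';

LEFT JOIN (Transactions)
LOAD customer_id,
     customer_name
FROM customers.qvd (qvd);
```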
Miguel,
I assume you have something like this:
FOR n = 1 to 100
  Map:
  MAPPING LOAD
  ...
  data:
  LOAD
  ApplyMap('Map', ...) ...
  RESIDENT ...
NEXT
I suspect that QlikView is creating maps 'Map-1', 'Map-2', etc. Try this, so you explicitly create a distinct map name on each iteration and refer to it in the ApplyMap():
FOR n = 1 to 100
  Map$(n):
  MAPPING LOAD
  ...
  data:
  LOAD
  ApplyMap('Map$(n)', ...) ...
  RESIDENT ...
NEXT
Regards,
Michael
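Filled in with hypothetical field and table names (the '...' parts of Michael's sketch are not in the original, so everything concrete below is an assumption), the per-iteration naming could look like:

```
// Hypothetical expansion of Michael's sketch; customers_$(n) and
// transactions_$(n) are assumed staging tables, one per iteration.
FOR n = 1 to 100
    // A new map name per iteration, so stale pairs from earlier
    // iterations are never consulted by ApplyMap().
    Map$(n):
    MAPPING LOAD customer_id, customer_name
    RESIDENT customers_$(n);

    data:
    LOAD *,
         ApplyMap('Map$(n)', customer_id) as customer_name
    RESIDENT transactions_$(n);
NEXT
```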
Very clever solution, @Michael!
I don't think that QlikView creates Map-1, Map-2, etc... It keeps building up the same mapping table. I see two possible issues:
1. If the mapping values aren't changing from one iteration of the loop to the next, then the issue is duplicate values. I'd recommend simply adding DISTINCT.
2. If the mapping values should change from one iteration to the next, then the old values need to be discarded, and Michael's solution of numbering the map is perfect!
cheers,
Oleg Troyansky
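For what it's worth, Oleg's first point might look like this inside the loop (field and table names are assumptions, not from Miguel's script):

```
// Hypothetical sketch of Oleg's point 1: DISTINCT avoids piling up
// duplicate customer_id/customer_name pairs on each pass of the loop.
Map:
MAPPING LOAD DISTINCT
    customer_id,
    customer_name
RESIDENT customers;
```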
..then you would need dynamically nested ApplyMap() calls, which is not a very realistic approach.
- Ralf
Hi Michael,
I agree with Oleg. Your solution works fine; I tried it before. But I see a slight problem (or it seems to me there could be one): I've got a huge number of records, so I'm worried about loading so many tables into memory. If I have to loop over a whole year, I would be creating 365 tables of more than 2 million records each. My problem is performance in load time (every 12 hours).
I've been thinking about partial (incremental) reloads in order to release memory after, let's say, every 50 loops.
But I'm still trying to find a work-around.
Regards.
Hi Ralf,
I've already tried a join, but no luck. Poor performance.
Regards
Hi Ariel, the code is pretty much like the one Michael wrote.
Regards.
"I'd recommend to simply add DISTINCT"
Hi Oleg,
Good point, thanks. In fact, I always use DISTINCT with a MAPPING LOAD unless it is inline. Somehow I skipped it in the example above...
Regards,
Michael