Hello,
I have a table with 30 values reported on a single line, and each value is composed of 5 fields, e.g.:
Key1, Key2, Field101, Field201, Field301, Field401, Field501, Field102, Field202, Field302, Field402, Field502, ...Field530
I need to distribute each group of 5 fields into its own row. It is a QVD with several million records. For now I do it with a loop, but it has to read the QVD 30 times and takes very long. Do you know a way to avoid reading it 30 times?
FOR i = 1 to 30
  LET vNum = Num($(i), '00');

  // One pass per group: read the same QVD and rename the group's five
  // columns to a common set of field names. TableName is assumed to
  // exist before the loop, since Concatenate needs an existing target.
  Concatenate (TableName)
  LOAD Key1,
       Key2,
       Field1$(vNum) as Field1,
       Field2$(vNum) as Field2,
       Field3$(vNum) as Field3,
       Field4$(vNum) as Field4,
       Field5$(vNum) as Field5
  FROM [$(vPath)FileName.qvd] (qvd)
  WHERE Field1$(vNum) <> 0;
NEXT
I'm not looking for delta/incremental load solutions.
Thanks,
Regards.
Sorry, I didn't notice that it was really 30 iterations and not just 30 columns. In this case my suggestion would also need 30 iterations.
Besides that, I don't believe there are really faster solutions than this kind of loop. Crosstable solutions are of course thinkable, but they are quite performance-intensive with millions of records, so an optimized QVD load should be faster.
The only way to keep the load optimized is to remove the where-clause (and apply it within a following resident load) or to use a where exists() with only one parameter, for example like the sketch below.
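This is only a rough sketch of the exists() variant, shown for a single group (the Whitelist and Data table names are illustrative assumptions, not your real model):

// Build the list of values that should survive. This load is not
// optimized because of the where-clause, but it reads one column only.
Whitelist:
LOAD Distinct Field101
FROM [$(vPath)FileName.qvd] (qvd)
WHERE Field101 <> 0;

// The wide load stays optimized: a single-parameter Exists() is the
// one where-clause that does not break the optimization.
Data:
LOAD Key1, Key2, Field101, Field201, Field301, Field401, Field501
FROM [$(vPath)FileName.qvd] (qvd)
WHERE Exists(Field101);

// Renaming to the common Field1..Field5 names could then happen in a
// following resident load (or directly in the load above; simple
// renames usually do not break the optimization).
DROP TABLE Whitelist;    // the helper list is no longer needed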
Besides this, if you could change or maybe extend the logic which creates these QVD files, you might be able to skip this task entirely.
- Marcus
Can you share some sample data?
Thanks Michelle, I created some dummy data and the expected result (attached).
Sounds like a variation of a crosstab, except you are doing it in groups of 5 rather than one field at a time. Obviously the built-in crosstable option can't help you here. I will have to think about this one.
Yes, I started thinking of a crosstable but stopped when I realized that I would also need to do it 30 times, at least the way I was doing it.
One thing that comes to mind is to do a single load with some RangeSum() expressions. Something like:
LOAD
    Key1,
    Key2,
    RangeSum(Field101, Field102, Field103, ... Field130) as Field1,
    RangeSum(Field201, Field202, Field203, ... Field230) as Field2,
    ...
    RangeSum(Field501, Field502, Field503, ... Field530) as Field5
FROM [$(vPath)FileName.qvd] (qvd);
However, this does aggregate the values. But since you cannot separate those values anyway (once they are loaded into Field1, you have no way of telling which came from Field101 and which from Field102 on the same source line), this may even be an advantage.
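A toy-scale, runnable illustration of the idea (dummy data, with two group columns standing in for the 30; all names here are made up):

Src:
LOAD * INLINE [
Key1, Field101, Field102
A, 10, 0
B, 0, 25
];

Result:
// RangeSum() returns one value per input row, so the output keeps a
// single row per key, with the group's columns summed together
// (A -> 10, B -> 25).
LOAD Key1,
     RangeSum(Field101, Field102) as Field1
RESIDENT Src;

DROP TABLE Src;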
Have you already tried working with a resident table instead of the QVD?
Hi Michele, thanks for taking the time to look into this.
I think in that case I would have to do 30 loads of the resident table, and the final load time would be pretty much the same, wouldn't it?
I could imagine an approach like this:

for i = 1 to 6
  Let vF1 = 'Field1' & num($(i), '00');
  Let vF2 = 'Field2' & num($(i) + 1, '00');
  Let vF3 = 'Field3' & num($(i) + 2, '00');
  Let vF4 = 'Field4' & num($(i) + 3, '00');
  Let vF5 = 'Field5' & num($(i) + 4, '00');

  Load Key1, Key2,
       $(vF1) as Field1, $(vF2) as Field2, $(vF3) as Field3,
       $(vF4) as Field4, $(vF5) as Field5
  From [$(vPath)FileName.qvd] (qvd) Where $(vF1) <> 0;
next
and if you could remove the where-clause or change it to a where exists(), you could keep the loads optimized.
- Marcus
I didn't explain it completely: those field values are used as a concatenated key and are also shown as selectable codes, so no numeric operations are done with them.
But you gave me the idea of concatenating the fields and then using SubField() to split the values into rows. I'll try something like the sketch below.
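Here is a rough, runnable sketch of that idea at toy scale, with two groups of dummy data standing in for the 30 (all table and field names are assumptions based on the example above; the real script would chain all 30 groups in the same pattern, reading FROM the QVD):

Src:
LOAD * INLINE [
Key1, Key2, Field101, Field201, Field301, Field401, Field501, Field102, Field202, Field302, Field402, Field502
A, 1, 11, 21, 31, 41, 51, 12, 22, 32, 42, 52
B, 2, 13, 23, 33, 43, 53, 0, 0, 0, 0, 0
];

Result:
// Outer (preceding) load: split each group string back into its five
// fields and drop empty groups, mirroring the original where-clause.
LOAD Key1,
     Key2,
     SubField(GroupStr, '|', 1) as Field1,
     SubField(GroupStr, '|', 2) as Field2,
     SubField(GroupStr, '|', 3) as Field3,
     SubField(GroupStr, '|', 4) as Field4,
     SubField(GroupStr, '|', 5) as Field5
Where SubField(GroupStr, '|', 1) <> 0;
// Inner load: build one 'f1|f2|f3|f4|f5' string per group and join the
// groups with ';'. Two-parameter SubField() generates one row per group.
LOAD Key1,
     Key2,
     SubField(
         Field101 & '|' & Field201 & '|' & Field301 & '|' & Field401 & '|' & Field501
         & ';' &
         Field102 & '|' & Field202 & '|' & Field302 & '|' & Field402 & '|' & Field502,
         ';') as GroupStr
RESIDENT Src;

DROP TABLE Src;

The string concatenation makes this a non-optimized load on the real QVD, but it is a single pass instead of 30.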
Thanks Jonathan.