I have an interesting issue I am trying to resolve.
I load a file where each field name is a date, e.g. 20181019. These go back a few years (about 3000 fields). I need to create a table containing only the fields within a certain range (greater than 20081205).
I could write a loop that checks each field name and drops the field if it falls outside that range, but that takes time.
Is there a more efficient solution?
Do you mean that, say, 20181019 is the name of a column?
If so, I would recommend a CrossTable load, so that instead of columns the dates become rows.
Then you can easily filter out the undesired rows with a Where clause.
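If it helps, here is a minimal sketch of what a CrossTable load does, using a hypothetical inline table (the table and field names are made up for illustration):

Raw:
LOAD * INLINE [
Key, 20081201, 20081210
A, 10, 20
B, 30, 40
];

Transposed:
CrossTable (Date, Value, 1) // 1 = number of leading qualifier fields (just Key here)
LOAD * Resident Raw;

Drop Table Raw;

Transposed now holds one row per (Key, Date) pair, e.g. (A, 20081201, 10), so Date can be filtered with an ordinary Where clause instead of dropping fields one by one.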
Ross,
Two options:
1. Through the cross-table wizard (Options -> File Wizard).
2. CrossTable() in the load script:
DateTemp:
CrossTable (Date, Datafield, n) // n = number of qualifier fields; depends on your transformations
LOAD *
FROM [filepath]
Where Date > '20081205'; // you can perform date conversions ahead of this script

Main:
NoConcatenate
LOAD *
Resident DateTemp
Where Date > '20081205';

Drop Table DateTemp;
I eventually wrote a loop to drop the date fields I didn't need:
Let i = 1;
Do While i <= NoOfFields('RawTable')
    Let vQVDFieldName = FieldName($(i), 'RawTable');
    If IsNum('$(vQVDFieldName)') Then
        If Not ('$(vQVDFieldName)' >= 20071203 and '$(vQVDFieldName)' < 20081206) Then
            Drop Field [$(vQVDFieldName)];
            Let i = i - 1; // the field list shifts left after a drop
        End If
    End If
    Let i = i + 1;
Loop
Yes, that can work, but with more date fields it could hurt performance because of the repeated If checks inside the loop.
By the way, a small correction to my code:
DateTemp:
CrossTable (Date, Datafield, n) // n = number of qualifier fields; depends on your transformations
LOAD *
FROM [filepath];

Main:
NoConcatenate
LOAD *
Resident DateTemp
Where Date > '20081205';

Drop Table DateTemp;
Good luck!