INPUT:
TAB1:
Load * Inline [
Id , Type ,Date
01,A,1/1/2018
01,B,1/2/2018
01,C,1/3/2018
01,D,1/4/2018
01,E,1/5/2018
01,F,1/6/2018
25,s,1/1/2018
26,D,1/1/2018
];
OUTPUT:
Without using a Generic load, I need output like this (one row per Id, with the Date under each Type column):
Id  A         B         C         D         E         F         s
01  1/1/2018  1/2/2018  1/3/2018  1/4/2018  1/5/2018  1/6/2018
25                                                              1/1/2018
26                                1/1/2018
Thanks
raja
You need to hardcode the Type values if you don't want to use a Generic load.
Please try the script below.
Tab2:
Load Id,
     if(Type='A', Date) as A,
     if(Type='B', Date) as B,
     if(Type='C', Date) as C,
     1 as Value
Resident TAB1;
Drop Table TAB1;

Tab3:
NoConcatenate
Load Id, Sum(Value) as Value, A, B, C
Resident Tab2
Group by Id, A, B, C;
Drop Table Tab2;
In my case I have 92 fields. How can I do this? Is there a better way?
I made a minor change to the above script; I am not sure about other methods.
TAB1:
Load * Inline [
Id , Type ,Date
01,A,1/1/2018
01,B,1/2/2018
01,C,1/3/2018
];
Tab2:
Load Id,
     if(Type='A', Date(Date#(Date,'DD/MM/YYYY'),'DD/MM/YYYY')) as A,
     if(Type='B', Date(Date#(Date,'DD/MM/YYYY'),'DD/MM/YYYY')) as B,
     if(Type='C', Date(Date#(Date,'DD/MM/YYYY'),'DD/MM/YYYY')) as C
Resident TAB1;
Drop Table TAB1;

Tab3:
NoConcatenate
Load Id, Max(A) as A, Max(B) as B, Max(C) as C
Resident Tab2
Group by Id;
Drop Table Tab2;
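The Group By with Max() is the step that collapses the intermediate table: Tab2 has one row per source record, with the Date in the matching Type column and null everywhere else, and Max() per Id picks the single non-null value in each column. Outside Qlik, the same collapse can be sketched in Python (illustrative only; the dicts stand in for the Tab2 rows, and None stands in for Qlik null):

```python
# Intermediate rows as Tab2 produces them: one row per source record,
# with the Date in the matching Type column and None elsewhere.
tab2 = [
    {"Id": "01", "A": "1/1/2018", "B": None,       "C": None},
    {"Id": "01", "A": None,       "B": "1/2/2018", "C": None},
    {"Id": "01", "A": None,       "B": None,       "C": "1/3/2018"},
]

# Group by Id and keep the non-None value per column, which is what
# Max() does here since each column holds at most one value per Id.
tab3 = {}
for row in tab2:
    agg = tab3.setdefault(row["Id"], {"A": None, "B": None, "C": None})
    for col in ("A", "B", "C"):
        if row[col] is not None:
            agg[col] = row[col]

print(tab3["01"])  # → {'A': '1/1/2018', 'B': '1/2/2018', 'C': '1/3/2018'}
```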
Try the code below:
INPUT:
Load * Inline [
Id , Type ,Date
01,A,1/1/2018
01,B,1/2/2018
01,C,1/3/2018
01,D,1/4/2018
01,E,1/5/2018
01,F,1/6/2018
25,s,1/1/2018
26,D,1/1/2018
];
FinalTable:
LOAD DISTINCT Id
Resident INPUT;

For i = 1 to FieldValueCount('Type')
  LET vField = FieldValue('Type', $(i));

  Left Join (FinalTable)
  LOAD DISTINCT Id,
       Date as [$(vField)]
  Resident INPUT
  Where Type = '$(vField)';
Next

DROP Table INPUT;
In my case the table load takes around 10 minutes. Can anyone suggest a better way?
Why does it take around 10 minutes?
A better way might be to skip this transformation entirely and use another approach to calculate the needed results, for example with some set analysis within the UI expressions.

The reason it is slow is that your approach builds a large crosstable from a data stream. Besides the quite expensive handling of that many fields within other tables, dimensions and expressions, it is a heavy transformation of the data and will always take some time, especially if the number of records is large.

In your case it requires 92 loads of a resident table with a where clause, each joined to the final table. Whether it is done with a loop like the one above, a Generic load, or any other way, it cannot be as fast as a normal data load.
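Marcus's cost argument can be seen in miniature outside Qlik: filtering once per distinct Type scans the data once per column (roughly O(F·N) for F types and N rows), whereas a single-pass pivot visits each row exactly once (O(N)). A hedged Python sketch of the two, using the inline data from the question (both produce the same pivot; only the number of scans differs):

```python
# The inline table from the question as (Id, Type, Date) tuples.
rows = [
    ("01", "A", "1/1/2018"), ("01", "B", "1/2/2018"), ("01", "C", "1/3/2018"),
    ("01", "D", "1/4/2018"), ("01", "E", "1/5/2018"), ("01", "F", "1/6/2018"),
    ("25", "s", "1/1/2018"), ("26", "D", "1/1/2018"),
]

# Loop approach (like the For / Left Join script): one full scan per
# distinct Type -> F passes over N rows, O(F*N).
types = sorted({t for _, t, _ in rows})
pivot_slow = {}
for typ in types:
    for id_, t, date in rows:
        if t == typ:
            pivot_slow.setdefault(id_, {})[typ] = date

# Single-pass pivot: each row is visited exactly once, O(N).
pivot_fast = {}
for id_, typ, date in rows:
    pivot_fast.setdefault(id_, {})[typ] = date

print(pivot_fast["01"]["B"])  # → 1/2/2018
print(pivot_slow == pivot_fast)  # → True
```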
- Marcus