Hi,
Generic load works perfectly when only a few distinct values need to be converted into columns.
But I have a column with thousands of distinct values that need to be converted to columns.
Any suggestions or workarounds?
Currently I am using the code below, but it is taking hours to complete.
// Build a comma-separated, quoted list of the tables created by the Generic load
Set vListOfTables = ;
For vTableNo = 0 to NoOfTables() - 1   // table numbers are 0-based, so stop at count - 1
  Let vTableName = TableName($(vTableNo));
  If Subfield(vTableName, '.', 1) = 'GenericLabel' Then
    Let vListOfTables = vListOfTables & If(Len(vListOfTables) > 0, ',', '') & Chr(39) & vTableName & Chr(39);
  End If
Next vTableNo

// Start the combined table with one row per key, then join each generic table onto it
CombinedGenericTable:
Load Distinct Key From GenericDB;

For each vTableName in $(vListOfTables)
  Left Join (CombinedGenericTable) Load * Resident [$(vTableName)];
  Drop Table [$(vTableName)];
Next vTableName
As marcowedel suggested, you do not need to join the tables created by the Generic load.
It is the join that is causing the slow load, not the generic table processing.
Have you considered an alternative to transforming your table into thousands of columns? I can't imagine that this is really practical.
- Marcus
Petter, I think one potential issue is that we would need an If statement for each new column we are trying to build. Is that true? If the columns change dynamically, with new ones getting added and old ones removed, this might get difficult to maintain.
What is your opinion on this approach -> Re: Any alternate to Generic load? Is it going to be slower than the Group By approach?
Also, as a separate question: we have seen some very poor performance from Group By statements on the server at my company. I am not sure what the reason for the poor server performance is, but all else being equal, are loops better than Group By, or is Group By better than loops? This is for my personal knowledge, as I am working on replacing Group By statements with For loops to improve performance.
I'm not sure, but I suspect almost everything would be faster than a Group By load, although I haven't used it in a use case like this. If there really are performance issues (with Group By or a loop), I would check whether it's possible to build it with loops over the field values - I mean running through the internal tables.
Besides that, I would check whether I really need these processing steps to get the end result I need (see also my comment above).
- Marcus
I have no idea about an alternate approach.
I think Inline would work if I had a small number of values to convert to columns, but in my case I have thousands of values.
How do I get the rows into columns in the example below?
Source Table
KeyID  ColumnID     ColumnValue
1235   ProductName  ABCD Product
1235   QTY          50
1235   Status       Active
1235   ModDate      06/05/2018
1236   ProductName  AAAA Product
1236   QTY          70
1236   Status       Inactive
1236   ModDate      08/05/2018
1240   ProductName  BBB Product
1240   QTY          67
1240   Status       Active
1240   ModDate      10/05/2018

Result Table
KeyID  ProductName   QTY  Status    ModDate
1235   ABCD Product  50   Active    06/05/2018
1236   AAAA Product  70   Inactive  08/05/2018
1240   BBB Product   67   Active    10/05/2018
Any help would be appreciated.
I tried a Generic load, but it duplicates the rows, so I am looking for a different method.
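One way to pivot rows into columns without the Generic prefix is to loop over the distinct ColumnID values and left-join each one back as a new field. This is only a sketch, assuming the field names KeyID, ColumnID and ColumnValue from the sample above:

```
// Sample data, as in the source table above
Source:
LOAD * INLINE [
KeyID, ColumnID, ColumnValue
1235, ProductName, ABCD Product
1235, QTY, 50
1235, Status, Active
1235, ModDate, 06/05/2018
1236, ProductName, AAAA Product
1236, QTY, 70
1236, Status, Inactive
1236, ModDate, 08/05/2018
1240, ProductName, BBB Product
1240, QTY, 67
1240, Status, Active
1240, ModDate, 10/05/2018
];

// Start the wide table with one row per key
Result:
LOAD DISTINCT KeyID Resident Source;

// Add one column per distinct ColumnID value
FOR i = 1 TO FieldValueCount('ColumnID')
  LET vCol = FieldValue('ColumnID', $(i));
  Left Join (Result)
  LOAD KeyID,
       ColumnValue AS [$(vCol)]
  Resident Source
  WHERE ColumnID = '$(vCol)';
NEXT

DROP Table Source;
```

Note that this is the same number of joins as distinct ColumnID values, so with thousands of values it may still be slow; but it avoids the per-table overhead of first creating thousands of generic tables.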
Hi Sunny,
I have data that exists in the following format:
RISK_EXPDEF_NAME | FACILITY_NUMBER | INTERNAL_BUNIT | INTERNAL_LENTITY | T_FIN_STRATEGY | T_PLL_YEAR | TOTAL_LIMIT
Cumulative Loss | FOSP | FO - BU | CL - LE | System Trading | | -1000000
Cumulative Loss | FOSP | FO - BU | CL - LE | Third Party Trading | | -1000000
Cumulative Loss | FOSP | SP - BU | CL - LE | System Trading | | -1000000
Cumulative Loss | FOSP | SP - BU | CL - LE | Third Party Trading | | -1000000
VaR | FOSP | FO - BU | CL - LE | System Trading | | 1000000
VaR | FOSP | FO - BU | CL - LE | Third Party Trading | | 1000000
VaR | FOSP | SP - BU | CL - LE | System Trading | | 1000000
VaR | FOSP | SP - BU | CL - LE | Third Party Trading | | 1000000
I want to move the RISK_EXPDEF_NAME values into columns and list the TOTAL_LIMIT values under them, e.g.
FACILITY_NUMBER | INTERNAL_BUNIT | INTERNAL_LENTITY | T_FIN_STRATEGY | T_PLL_YEAR | Cumulative Loss | VaR
FOSP | FO - BU | CL - LE | System Trading | | -1000000 | 1000000
FOSP | FO - BU | CL - LE | Third Party Trading | | -1000000 | 1000000
FOSP | SP - BU | CL - LE | System Trading | | -1000000 | 1000000
FOSP | SP - BU | CL - LE | Third Party Trading | | -1000000 | 1000000
How can I achieve this particular case with the above-mentioned script?
Regards
Sushant
Here you go
// Sample data
Table:
LOAD * INLINE [
RISK_EXPDEF_NAME, FACILITY_NUMBER, INTERNAL_BUNIT, INTERNAL_LENTITY, T_FIN_STRATEGY, T_PLL_YEAR, TOTAL_LIMIT
Cumulative Loss, FOSP, FO - BU, CL - LE, System Trading, , -1000000
Cumulative Loss, FOSP, FO - BU, CL - LE, Third Party Trading, , -1000000
Cumulative Loss, FOSP, SP - BU, CL - LE, System Trading, , -1000000
Cumulative Loss, FOSP, SP - BU, CL - LE, Third Party Trading, , -1000000
VaR, FOSP, FO - BU, CL - LE, System Trading, , 1000000
VaR, FOSP, FO - BU, CL - LE, Third Party Trading, , 1000000
VaR, FOSP, SP - BU, CL - LE, System Trading, , 1000000
VaR, FOSP, SP - BU, CL - LE, Third Party Trading, , 1000000
];

// Start the final table with one row per distinct combination of the key fields
FinalTable:
LOAD DISTINCT FACILITY_NUMBER,
     INTERNAL_BUNIT,
     INTERNAL_LENTITY,
     T_FIN_STRATEGY,
     T_PLL_YEAR
Resident Table;

// For each distinct RISK_EXPDEF_NAME value, join TOTAL_LIMIT back as a new column
// named after that value
FOR i = 1 to FieldValueCount('RISK_EXPDEF_NAME')
  LET FieldValue = FieldValue('RISK_EXPDEF_NAME', $(i));
  Left Join (FinalTable)
  LOAD FACILITY_NUMBER,
       INTERNAL_BUNIT,
       INTERNAL_LENTITY,
       T_FIN_STRATEGY,
       T_PLL_YEAR,
       TOTAL_LIMIT as [$(FieldValue)]
  Resident Table
  Where RISK_EXPDEF_NAME = '$(FieldValue)';
NEXT

// The original narrow table is no longer needed
DROP Table Table;
Thanks a lot Sunny, it's working well.
My request: if you could add comments to your original code, it would help a lot of users understand the logic quickly.