Hi All,
Greetings of the day.
The ORDER BY clause that I have defined in the script below is consuming more than 5 to process the data. Can someone help me optimize this script?
LOAD CraneID & ' ' & Upper(cp_id) as IMEI_cp_id,
     CraneID as Parameter_IMEI,
     CraneID as Active_Parameter_IMEI,
     Upper(cp_id) as Parameter_cp_id,
     Dayname(datetime) as Parameter_date,
     Dayname(datetime) as Date,
     parameter as Parameter_id,
     value,
     value as Parameter_value_cumulative,
     If(Previous(CraneID) = CraneID AND Previous(cp_id) = cp_id AND Previous(parameter) = parameter, value - Previous(value), 0) as Parameter_value,
     RowNo() as Rowno
RESIDENT Paramater
WHERE Len(CraneID) > 0
ORDER BY CraneID, cp_id, parameter, datetime;
Thanks in Advance,
Rajasekhar P.
Sometimes it's faster to split a larger transformation into several logical steps. In your case that could mean filtering the data first (applying the WHERE clause), ordering the data in a separate load, and perhaps applying the Previous() logic in yet another step. In each case, keep an eye on the Task Manager to check whether enough RAM is available; if not, consider storing the intermediate results into a QVD, dropping the tables, and loading from the QVD again.
I'm not sure whether this makes a significant difference in your case, but you may try it. A bigger effect might be possible if you could combine your ORDER BY fields within the preceding load - all your fields seem to be numeric, so you could develop a logic that adds and multiplies them into a single field. It sounds more complicated than it is - you only need to be careful that the results are always unique and no longer than 14 digits. Here is a practical example from one of my keys:
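Applied to the script in the question, the staged approach might look roughly like this (a sketch only; it reuses the table and field names from the posted load, and the QVD path is a hypothetical placeholder):

```
// Step 1: filter only - no ordering, no inter-record functions.
Filtered:
LOAD CraneID, cp_id, parameter, datetime, value
RESIDENT Paramater
WHERE Len(CraneID) > 0;
DROP TABLE Paramater;

// Optional: if RAM gets tight, park the intermediate result in a QVD,
// drop the table, and reload it before the next step, e.g.:
// STORE Filtered INTO [lib://MyData/Filtered.qvd] (qvd);

// Step 2: one sorted resident pass that applies the Previous() logic.
// Note: ORDER BY only works on RESIDENT loads in Qlik.
Parameter:
LOAD *,
     If(Previous(CraneID) = CraneID AND Previous(cp_id) = cp_id
        AND Previous(parameter) = parameter,
        value - Previous(value), 0) as Parameter_value,
     RowNo() as Rowno
RESIDENT Filtered
ORDER BY CraneID, cp_id, parameter, datetime;
DROP TABLE Filtered;
```

Splitting the work this way keeps each pass simple, and dropping the source table between steps frees memory before the sort.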
([StoreID] * pow(10, 4)) + ([OrderID] * pow(10, 8)) + [OrderlineID]
With each field you can remove from the ORDER BY statement, performance should increase.
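In the script from the question, that idea could look like the following (a sketch; the digit widths are illustrative assumptions - they must be chosen so each field fits its slot, the combined key stays unique, and the result does not exceed 14 digits; it also assumes cp_id and parameter are purely numeric):

```
// Preceding load: collapse three sort fields into one numeric key.
// pow(10, n) shifts each field left to reserve digits for the fields
// to its right; the widths 8 and 4 are placeholders to adjust.
Keyed:
LOAD *,
     (CraneID * pow(10, 8)) + (cp_id * pow(10, 4)) + parameter as SortKey
RESIDENT Paramater
WHERE Len(CraneID) > 0;

// Only two fields remain in the ORDER BY instead of four.
Sorted:
LOAD *
RESIDENT Keyed
ORDER BY SortKey, datetime;
DROP TABLE Keyed;
```

Sorting on one wide numeric field is typically cheaper for the engine than comparing several fields per record, which is where the speed-up comes from.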
- Marcus
If Marcus' post was what you needed, please be sure to close out your thread by using the Accept as Solution button on his post, to give him credit for the help and to let other members know what worked.
Here are a couple of Help links that may be useful too:
Regards,
Brett