Hello,
The following script statement still scans through 50 million records and takes about 15 minutes to reload. Is there anything I can do to filter more effectively, or to retrieve a sample or smaller subset of the records I need, so that testing is faster?
tempDayPoint_Measurement_Summry1:
LOAD
    *
Resident Point_Measurement
Where _kHour_Quarter_ID_Autonumber <= 100;
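(An aside, not from the original post: Qlik's First prefix stops the LOAD after a fixed number of records instead of scanning all 50 million and filtering. The table and field names below are taken from your script; note that the source Point_Measurement table must still be loaded in full beforehand, so this only speeds up the resident pass.)

First 100
tempDayPoint_Measurement_Summry1:
LOAD *
Resident Point_Measurement;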
Hi Kevin, the statement below is for a LOAD, not for SQL. Regards!
LOAD
    A, B, C, D, ...
Where Date >= '01/01/2015' and Date <= '31/12/2015';
With a Where clause you can filter your load down to the specific data you need.
Regards
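(A hedged aside, my addition: comparing a Date field against string literals like '01/01/2015' depends on the field's format matching the app's date settings. A more explicit sketch, assuming the dates are in DD/MM/YYYY format, uses the Date#() interpretation function so the comparison is numeric:)

LOAD
    A, B, C, D, ...
Where Date >= Date#('01/01/2015', 'DD/MM/YYYY')
  and Date <= Date#('31/12/2015', 'DD/MM/YYYY');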
Thank you. This helped the script run 2 minutes faster for testing. I am finding that the sum statements within my script are the bigger culprit for slow outputs.
Thank You!
My run time was 1:08 instead of 15+ minutes when I used the STORE statement and saved the table as a QVD.
My initial load still took 15 minutes, but the table is then saved as a QVD that can be referenced repeatedly in the script when I make other adjustments in the future.
Below is my final code output. Hope it helps someone in the future!
/******* STEP 1 - Join In New Field ********/
tempAgg_Table:
Noconcatenate
Load *
Resident Point_Measurement
;
Left Join (tempAgg_Table)
Load
_kHour_Quarter_ID_Autonumber,
DayDate
Resident Calendar_Table
;
/******* STEP 2 - Aggregate Table ********/
Agg_Point_Measurement:
Load
_kFctFacility_Summry,
DayDate,
_kPoint_ID,
sum(CONSUMPTION) as CONSUMPTION
Resident tempAgg_Table
Group by
_kFctFacility_Summry,
DayDate,
_kPoint_ID
;
/******* STEP 3 - Store Aggregate Table as a QVD ********/
STORE * from Agg_Point_Measurement into [U:\Users\cwarreke\Scratch Docs\Agg_Point_Measurement.qvd] (qvd);
/******* STEP 4 - Substitute This Bit Of Code Instead of running STEP 2 & 3, Comment Out Step 2 & 3 moving forward********/
Agg_Point_Measurement:
Load *
From [U:\Users\cwarreke\Scratch Docs\Agg_Point_Measurement.qvd] (qvd);
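(One more hedged suggestion, my addition rather than part of the original post: the intermediate tempAgg_Table from Step 1 stays in memory after the aggregate is built. Dropping it once Step 2 finishes frees that memory and avoids accidental associations with the temp table:)

/******* Optional - Drop The Temp Table After STEP 2 ********/
DROP Table tempAgg_Table;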