I have the following incremental load script:
//Load Incremental Data
OLEDB CONNECT32 TO [Provider=SQLNCLI11.1;Persist Security Info=False;User ID=sa;Initial Catalog=PharmaMarketAudit;Data Source=localhost\DATAWAREHOUSE;Use Procedure for Prepare=1;Auto Translate=True;Packet Size=4096;Workstation ID=STRATDIGM_1;Initial File Name="";Use Encryption for Data=False;Tag with column collation when possible=False;MARS Connection=False;DataTypeCompatibility=0;Trust Server Certificate=False;Application Intent=READWRITE];
Incremental:
LOAD [Script Number],
[Drug Name],
Date(Floor([Date Dispensed])) as [Date Dispensed],
Year([Date Dispensed]) * 12 + Month([Date Dispensed]) as PeriodID,
Time(Interval([Date Dispensed],'hh:mm:ss'),'hh:mm:ss') as TRx_Time_Dispensed,
if([Script Number]=Previous([Script Number]),'',Time(Interval([Date Dispensed]-Previous([Date Dispensed])),'hh:mm:ss')) as TRx_Traffic,
//Time([Date Dispensed],'h tt') & ' - ' & Time([Date Dispensed] + maketime(3,0,0),'h tt') as TimeBucket,
DateKey,
//Year([Date Dispensed]) * 10000 + Month([Date Dispensed]) * 100 + Day([Date Dispensed]) as [DateKey],
// Directions,
[Repeat or Original],
[Doctor Name] as Prescriber,
QTY*1 as QTY,
[MedAid Amount]*1 as [MedAid Amount],
Shortfall*1 as Shortfall,
RangeSum([MedAid Amount],[Shortfall])*1 as Amount,
[Med Aid] as [Payment Type],
[Supply days],
//[Drug Code]
Num([Drug Code],'000000') as [Drug Code],
//Num(fabs([Drug Code]),'000000') as [Drug Code],
// Fabs(Num([Drug Code],'000000')) as [Drug Code],
Num([Doctor Number],'00000') as [Doctor Number],
Upper([Retail Pharmacy]) as RETAILPHARMACY,
Upper(Branch) as BRANCH,
Date(Date_Modified) as Date_Modified;
SQL SELECT Branch,
DateKey,
"Date Dispensed",
"Date_Modified",
"Doctor Name",
"Doctor Number",
"Drug Code",
"Drug Name",
"Med Aid",
"MedAid Amount",
QTY,
"Repeat or Original",
"Retail Pharmacy",
"Script Number",
Shortfall,
"Supply days"
FROM PharmaMarketAudit.dbo.tblDatawarehouseConsolidatedCleanStage
WHERE [Date Dispensed] > $(vSetLastUpdatedDate) and [Date Dispensed] <= '2015-08-31'
ORDER BY [Retail Pharmacy] asc, [Branch] asc, [Date Dispensed] asc;
//Concatenate with QVD
Concatenate
LOAD [Script Number],
[Drug Name],
[Date Dispensed],
PeriodID,
TRx_Time_Dispensed,
TRx_Traffic,
DateKey,
[Repeat or Original],
Prescriber,
QTY,
[MedAid Amount],
Shortfall,
Amount,
[Payment Type],
[Supply days],
[Drug Code],
[Doctor Number],
RETAILPHARMACY,
BRANCH,
Date_Modified
FROM
$(vFolderSourceData)\QVDs\PrescriptionHistory.qvd
(qvd);
//Replace Old QVD file
STORE Incremental into $(vFolderSourceData)\QVDs\PrescriptionHistory.qvd (qvd);
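One thing I am aware of: if the WHERE clause can ever re-select rows that are already stored in the QVD, the concatenate step needs a guard so old copies of a re-extracted script are dropped rather than duplicated. A sketch of that guard (untested, and it assumes [Script Number] uniquely identifies a prescription row):

```
//Sketch only: drop QVD rows whose key was already re-loaded from the database.
//Assumes [Script Number] is a unique primary key per row.
Concatenate
LOAD * FROM $(vFolderSourceData)\QVDs\PrescriptionHistory.qvd (qvd)
WHERE NOT Exists([Script Number]);
```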
I then need to construct time buckets from the above as follows:
// Establish sort order for TimeBucket
TimeBucketSortTemp:
LOAD * Inline [
TimeBucket
8am-11am
11am-2pm
2pm-5pm
5pm-8pm
8pm-8am
]
;
// Establish sort order for [Hour Dispensed]
HourSortTemp:
LOAD Time(MakeTime(Hour),'ht') as [Hour Dispensed] INLINE [
Hour
7
8
9
10
11
12
13
14
15
16
17
18
19
20
21
22
23
0
1
2
3
4
5
6
]
;
TimeBucketMap:
MAPPING
LOAD * INLINE [
Hour, Bucket
0, 8pm-8am
1, 8pm-8am
2, 8pm-8am
3, 8pm-8am
4, 8pm-8am
5, 8pm-8am
6, 8pm-8am
7, 8pm-8am
8, 8am-11am
9, 8am-11am
10, 8am-11am
11, 11am-2pm
12, 11am-2pm
13, 11am-2pm
14, 2pm-5pm
15, 2pm-5pm
16, 2pm-5pm
17, 5pm-8pm
18, 5pm-8pm
19, 5pm-8pm
20, 8pm-8am
21, 8pm-8am
22, 8pm-8am
23, 8pm-8am
]
;
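As an aside, the 24-row mapping table could also be generated rather than typed out. A sketch that should produce the same buckets as the inline table above (table name is my own, to avoid clashing with TimeBucketMap):

```
//Alternative: derive the hour-to-bucket mapping with AUTOGENERATE.
//RecNo() runs 1..24, so RecNo()-1 gives hours 0..23.
TimeBucketMapGenerated:
MAPPING LOAD
    RecNo() - 1 as Hour,
    If(RecNo() - 1 >= 8  and RecNo() - 1 <= 10, '8am-11am',
     If(RecNo() - 1 >= 11 and RecNo() - 1 <= 13, '11am-2pm',
      If(RecNo() - 1 >= 14 and RecNo() - 1 <= 16, '2pm-5pm',
       If(RecNo() - 1 >= 17 and RecNo() - 1 <= 19, '5pm-8pm',
        '8pm-8am')))) as Bucket
AUTOGENERATE 24;
```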
MainData2:
LOAD *,
ApplyMap('TimeBucketMap',hour(TRx_Time_Dispensed)) as TimeBucket
,Time(MakeTime(hour(TRx_Time_Dispensed)),'ht') as [Hour Dispensed]
,WeekDay([Date Dispensed]) as [Weekday Dispensed]
Resident Incremental
//Resident [Main Data]
;
DROP TABLE Incremental;
//DROP TABLE [Main Data];
STORE MainData2 into $(vFolderSourceData)\QVDs\PrescriptionHistoryUpdated.qvd (qvd);
Loading Resident from the Incremental table makes the load time high. Is there a way to use the QVD created earlier in the //Replace Old QVD file step (STORE Incremental into $(vFolderSourceData)\QVDs\PrescriptionHistory.qvd (qvd);) in place of the resident statement?
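What I have in mind is something like this (untested sketch). Note that because the calculated fields are added in the LOAD, the QVD read cannot stay optimized, so I am not sure how much it would actually save over the resident load:

```
//Sketch: build MainData2 from the stored QVD instead of the resident table.
MainData2:
LOAD *,
     ApplyMap('TimeBucketMap', Hour(TRx_Time_Dispensed)) as TimeBucket,
     Time(MakeTime(Hour(TRx_Time_Dispensed)), 'ht') as [Hour Dispensed],
     WeekDay([Date Dispensed]) as [Weekday Dispensed]
FROM $(vFolderSourceData)\QVDs\PrescriptionHistory.qvd (qvd);

DROP TABLE Incremental; //the resident copy is no longer needed
```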
I added a BUFFER prefix to the load statement to speed up the load time; however, as I read in Rob's QlikView Cookbook, this has to be removed when moving to production. I would appreciate any other solutions.
Regards.