Hi,
I have implemented an incremental load in my project for a big table containing approximately 40 million records. It works fine for Insert, Update, and Delete.
The problem is that the Insert and Update steps do not take much time, but the Delete step compares all primary key values and takes about 35 minutes.
Please find below the sample logic I implemented in my project.
// Record the start time of this execution
LET ThisExTime = now();

// 1) Inserts/Updates: fetch only rows changed since the last run
IncrementalTest:
SQL SELECT CreatedDate,
    "Employee_Id",
    ID,
    Name
FROM QlikViewDev.dbo.TempInsertIncrementalTest
WHERE CreatedDate >= '$(LastExTime)' AND CreatedDate < '$(ThisExTime)';

// 2) Append historical rows from the QVD, skipping keys just reloaded
Concatenate
LOAD CreatedDate, Employee_Id, ID, Name
FROM [..\QVDs\PB\Test\IncrementalTest.qvd] (qvd)
Where not Exists(Employee_Id);

// 3) Deletes: keep only keys that still exist in the source table
Inner Join
SQL SELECT Employee_Id
FROM QlikViewDev.dbo.TempInsertIncrementalTest2;

// Remember this run's start time for the next execution
LET LastExTime = '$(ThisExTime)';
Please let me know if there is any solution to reduce this delete time.
Thanks,
Chandra
Have you already tried a DISTINCT on Employee_Id? You said you have 40 million records, but since the delete operation keys on Employee_Id, I think it would be better to compare only the unique Employee_Id values rather than all records in your inner join.
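To illustrate the suggestion: if the source table can contain repeated Employee_Id values, the delete step only needs the distinct key set. Assuming the same table name as in the original script, the inner join would become:

// Delete handling using only the distinct keys from the source
Inner Join
SQL SELECT DISTINCT Employee_Id
FROM QlikViewDev.dbo.TempInsertIncrementalTest2;

This shrinks the key list QlikView has to match against, though it only helps if the column actually holds duplicates.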
Thanks for your reply.
We already have unique Employee_Id values in our DB.
Hi Sudeep,
There are no other options for reducing the incremental load time beyond the approach you are already following; that is the standard way to do an incremental load.
However, you could try the methodology described here:
https://qlikviewcookbook.com/2013/08/super-fast-method-to-retrieve-qvd-high-value/
Many Thanks
Karthik
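As I understand the linked technique, it retrieves the highest value of a QVD field (for example, the latest CreatedDate) without aggregating over all rows: an optimized single-field load pulls the field in, and the maximum is then taken over the field's distinct symbol values via FieldValue()/FieldValueCount(). A sketch, assuming the QVD path and field name from the original script:

// Optimized load of just the timestamp field (no transformations, so it stays fast)
Temp:
LOAD CreatedDate
FROM [..\QVDs\PB\Test\IncrementalTest.qvd] (qvd);

// Max over the distinct symbol values, not over all 40M rows
MaxDate:
LOAD Max(FieldValue('CreatedDate', RecNo())) AS MaxCreatedDate
AUTOGENERATE FieldValueCount('CreatedDate');

LET LastExTime = Peek('MaxCreatedDate', 0, 'MaxDate');
DROP TABLES Temp, MaxDate;

This derives LastExTime from the QVD itself rather than a stored variable, which also makes the incremental load self-healing after a failed run.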
When the ID is unique, why don't you load the old data and add only the new data for which no ID is known yet?

Data:
LOAD ID, .... FROM OldData;

NewData:
CONCATENATE (Data) LOAD ID, ..... FROM ..... WHERE NOT EXISTS (ID);