Not applicable

Incremental Load Daily

Hi,

I am creating a QVD for each individual table involved in the report, refreshed with an incremental load on a daily basis.

Please suggest a sample script that I can implement.

9 Replies
Not applicable
Author

Hey,

Start with this awesome post from stevedark

http://www.quickintelligence.co.uk/qlikview-incremental-load/

BR,

Kuba

stevedark
Partner Ambassador/MVP

Thanks for the mention Kuba.

Not applicable
Author

Hi,

I am implementing an incremental load for a particular table. I have created a QVW file that contains the load script for this table and placed it in the Workbook folder. I am using an Oracle database.

What is the next step to implement the incremental load?

stevedark
Partner Ambassador/MVP

Follow the link given by Kuba and you will find full instructions for a number of different incremental scenarios.

If you can't get these working, then please post back here.
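As a rough orientation before you dive in: the simplest scenario (insert-only rows with a sequential date) boils down to something like this. Table, field, and path names here are illustrative placeholders, and the date comparison will depend on how your source stores dates:

// Find the high-water mark already stored in the QVD
MaxDate:
LOAD Max(UpdateDate) as MaxDate
FROM [Table1.qvd] (qvd);

LET vMaxDate = Peek('MaxDate', 0, 'MaxDate');
DROP TABLE MaxDate;

// Pull only rows newer than the high-water mark
Table1:
LOAD *;
SQL SELECT *
FROM owner.Table1
WHERE UpdateDate > '$(vMaxDate)';

// Append the history already held in the QVD, then store back
Concatenate (Table1)
LOAD * FROM [Table1.qvd] (qvd);

STORE Table1 INTO [Table1.qvd] (qvd);

The article builds the update and delete variations on top of this basic shape.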

Not applicable
Author

Hi,

I have created the script below for an incremental load (insert and update) of a single table, based on the link provided in the post above. Please advise on the following:

1) My "Workbook" folder contains the full-load QVDs and QVWs for each individual table. The "QVD/Source" folder contains the QVDs and QVWs with the incremental load for each individual table, and the "QVD/Transformed" folder contains the resulting QVDs and QVWs after the transformations are applied. Please suggest whether this folder structure approach is correct or not.

Note: We initially need to copy the QVDs from the Workbook folder into the Source folder (see the sketch after this list).


2) I have implemented the incremental load, but I am not sure how to test it. All tables belong to an Oracle database.


3) I did not understand the delete procedure of the incremental load.
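
For the note under point 1, the initial seed could be a one-off script along these lines (paths are illustrative and would need adjusting):

// One-off seed: copy the full-load QVD from the Workbook folder into Source
Seed:
LOAD * FROM [..\..\Workbooks\Table1.qvd] (qvd);

STORE Seed INTO [..\..\QVD\Source\Table1.qvd] (qvd);
DROP TABLE Seed;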


Incremental Load Script:

Table1:
LOAD
    Field1,
    Field2,
    Field3
FROM [..\..\Workbooks\Table1.qvd] (qvd);

// Latest date already present in the QVD
Last_Updated_Date:
LOAD Max(Field3) as MaxDate
RESIDENT Table1;

// Store it in a variable
LET Last_Updated_Date = Peek('MaxDate', 0, 'Last_Updated_Date');

DROP TABLE Table1;

LET vLoadTime = Num(Now());

// New and updated rows since the last load
Incremental:
LOAD Field1,
     Field2,
     Field3;
SQL SELECT
    Field1,
    Field2,
    Field3
FROM owner.Table1
WHERE Field3 > $(Last_Updated_Date) AND Field3 < $(vLoadTime);

// Append the unchanged history from the existing QVD
Concatenate (Incremental)
LOAD Field1,
     Field2,
     Field3
FROM [Table1.qvd] (qvd)
WHERE NOT Exists(Field1);

// Only persist if the load ran clean
IF ScriptErrorCount = 0 THEN
    STORE Incremental INTO [Table1.qvd] (qvd);
    LET Last_Updated_Date = vLoadTime;
END IF

DROP TABLE Incremental;


stevedark
Partner Ambassador/MVP

Hi Vibu,

That looks great, assuming Field1 is a unique ID and Field3 is a last updated date.

You could simplify if the date is a sequential insertion date, as you can remove the drop and the where exists (you can be confident of no dupes).

If the date is an update date (and old records can be updated) then the steps you have are fine. You can speed things up a little by only loading Field3 the first time around; the other fields are not needed.
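
For example, that first pass can aggregate the date straight from the QVD rather than loading every field resident; a sketch using the same placeholder names:

// Only Field3 is needed to find the last loaded date
Last_Updated_Date:
LOAD Max(Field3) as MaxDate
FROM [..\..\Workbooks\Table1.qvd] (qvd);

LET Last_Updated_Date = Peek('MaxDate', 0, 'Last_Updated_Date');
DROP TABLE Last_Updated_Date;

This also removes the need for the initial resident load and the first DROP TABLE.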

Test by checking the variable values, the number of rows of each part of the load, and the number of rows of the resultant table.

If any test does not give the expected result, post back here.
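
On your third point (deletes): the usual approach is to pull just the current primary keys from the source and inner join them onto the combined table, so that rows deleted in Oracle fall out of the QVD. A rough sketch, with the same placeholder names:

// After the Concatenate step has built the full Incremental table...

// Keep only the keys that still exist in the source
Inner Join (Incremental)
LOAD Field1;
SQL SELECT Field1
FROM owner.Table1;

STORE Incremental INTO [Table1.qvd] (qvd);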

Steve

Not applicable
Author

Hi,

I am getting the value 11628 for Last_Updated_Date, which is not correct. After debugging the script I found that max(Field3) returns null inside the statement below:


Last_Updated_Date:
LOAD Max(Field3) as MaxDate
RESIDENT Table1;
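
As a temporary guard (my own addition, not from the article), the variable could fall back to a seed date whenever the max comes back null:

// Fall back to an early seed date if MaxDate is null
LET Last_Updated_Date = Alt(Peek('MaxDate', 0, 'Last_Updated_Date'), Num(MakeDate(1900, 1, 1)));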

Not applicable
Author

Hi,

I am getting the correct max date now. The issue was caused by a preceding load in the QVD generator in the Workbook folder: the preceding load was dropping some rows. After adding NoConcatenate between the LOAD and the SQL SELECT, it is working now.
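
For reference, the relevant part of the generator now looks roughly like this (simplified; NoConcatenate stops the load from being auto-concatenated onto an earlier table with the same field names):

Table1:
NoConcatenate
LOAD Field1,
     Field2,
     Field3;
SQL SELECT Field1, Field2, Field3
FROM owner.Table1;

STORE Table1 INTO [..\..\Workbooks\Table1.qvd] (qvd);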

stevedark
Partner Ambassador/MVP

Glad it is now working. Thank you for the update.