Anonymous
Not applicable

Incremental load with Primary Key

Hi

I would like to do an incremental load based on a primary key. My requirement is to add updated records to a table based on Key.

Scenario:

I have attached two sample data files with the fields Key, Place, Room.

I will load these files daily, as this is an incremental load.

Initially, all records have Open set to 1, like this:

Tab:
LOAD Key,
     Place,
     Room,
     1 as Open
FROM Incr.txt;

I would like to see how many times each Key has repeated across the daily loads.

Open is set to 1 for the first load because, at that point, each record has appeared once.

I need to load these files daily and, if any records match the previously loaded data, Open should be incremented by 1; otherwise it should keep its previous value.

This should be done in the script.

Final output:

Key   Place   Room    Open
1     U.K     Wing2   2
2     U.S     Wing1   2
3     Aus     Wing2   2

In the above output, Open is 2 because every record has appeared twice.

Thanks,

1 Solution

Accepted Solutions
Anonymous
Not applicable
Author

Hi adamdavi3s

I tried it like the below and it's working fine for me.

// Today's file: start every record with Days = 1
Test1:
LOAD Key,
     Key as match1,
     Place,
     Room,
     1 as Days
FROM
(biff, embedded labels, table is Sheet1$);

// If the history QVD already exists, update it; otherwise create it
if FileTime('C:\Users\Desktop\Incr1.qvd') > 0 then

    // Reload the history; when a key also appears in today's file (match1),
    // add 1 to Days, and default to 1 if Days comes back empty
    Test2:
    LOAD Key,
         Place,
         Room,
         Key as match2,
         if(len(If(Exists(match1, Key), Days + 1, Days)) < 1, 1,
            If(Exists(match1, Key), Days + 1, Days)) as Days
    FROM C:\Users\Desktop\Incr1.qvd (qvd);

    DROP Field match1;

    // Append keys that are new in today's file with their initial Days = 1
    Concatenate (Test2)
    LOAD *
    Resident Test1
    Where Not Exists(match2, Key);

    DROP Field match2;

    STORE Test2 into C:\Users\Desktop\Incr1.qvd (qvd);

else

    // First run: no history yet, so today's file becomes the history
    DROP Field match1;

    STORE Test1 into C:\Users\Desktop\Incr1.qvd (qvd);

ENDIF

DROP Table Test1;
DROP Table Test2;

// Read the updated history back in as the final table
Final:
LOAD Key,
     Place,
     Room,
     Days
FROM C:\Users\Desktop\Incr1.qvd (qvd);




Thanks


7 Replies
Anonymous
Not applicable
Author

I am sorry, I forgot to attach the sample files in the main thread; I have attached them now.

Thanks

adamdavi3s
Master

OK, so I am thinking on the fly here.

I would have a variable to count the reloads, let's say v_reloadcount, and set it to 1 for the first reload of the day (an if(hour(now()) = 1, 1, ...) kind of idea).

I would then split the load into two sections with some trickery:

Firstly, load all your key values into a new field.

The first load should take all values where the key does NOT EXIST in the new field.

Then increment the counter variable by 1.

Then load all the values where the key does exist in the new field.

Drop the new field.

Hope that makes sense a bit; there is a rough sketch below.
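
In load-script terms, a rough sketch of that idea might look like the following. Treat it as an illustration only, not something tested against your files; the names v_reloadcount, PrevKey, History.qvd and Incr.txt are placeholders, and it assumes every repeated key shows up in every reload (true per-key counting needs a stored, per-key Days value instead, as in the solution later in this thread).

// Rough sketch only; all names here are placeholders.

// Reload counter: reset to 1 on the first reload of the day, otherwise add 1.
// Assumes v_reloadcount already holds a number from the previous run.
LET v_reloadcount = If(Hour(Now()) = 1, 1, $(v_reloadcount) + 1);

// Load the keys already seen into a separate field.
PreviousKeys:
LOAD Key as PrevKey
FROM History.qvd (qvd);

// New keys: not present in the previous data, so Open starts at 1.
Data:
LOAD Key,
     Place,
     Room,
     1 as Open
FROM Incr.txt
Where Not Exists(PrevKey, Key);

// Repeated keys: take the incremented reload counter instead.
Concatenate (Data)
LOAD Key,
     Place,
     Room,
     $(v_reloadcount) as Open
FROM Incr.txt
Where Exists(PrevKey, Key);

// Drop the helper table (and its PrevKey field) once the split is done.
DROP Table PreviousKeys;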

Anonymous
Not applicable
Author

Hi Adam,

Thanks for the help. Let me give it a try.

Thanks

Anonymous
Not applicable
Author

Any thoughts?

Thanks

adamdavi3s
Master

Any thoughts on what? Did you try out my suggestion?

Anonymous
Not applicable
Author

Hi adamdavi3s

I tried it like the below and it's working fine for me.

// Today's file: start every record with Days = 1
Test1:
LOAD Key,
     Key as match1,
     Place,
     Room,
     1 as Days
FROM
(biff, embedded labels, table is Sheet1$);

// If the history QVD already exists, update it; otherwise create it
if FileTime('C:\Users\Desktop\Incr1.qvd') > 0 then

    // Reload the history; when a key also appears in today's file (match1),
    // add 1 to Days, and default to 1 if Days comes back empty
    Test2:
    LOAD Key,
         Place,
         Room,
         Key as match2,
         if(len(If(Exists(match1, Key), Days + 1, Days)) < 1, 1,
            If(Exists(match1, Key), Days + 1, Days)) as Days
    FROM C:\Users\Desktop\Incr1.qvd (qvd);

    DROP Field match1;

    // Append keys that are new in today's file with their initial Days = 1
    Concatenate (Test2)
    LOAD *
    Resident Test1
    Where Not Exists(match2, Key);

    DROP Field match2;

    STORE Test2 into C:\Users\Desktop\Incr1.qvd (qvd);

else

    // First run: no history yet, so today's file becomes the history
    DROP Field match1;

    STORE Test1 into C:\Users\Desktop\Incr1.qvd (qvd);

ENDIF

DROP Table Test1;
DROP Table Test2;

// Read the updated history back in as the final table
Final:
LOAD Key,
     Place,
     Room,
     Days
FROM C:\Users\Desktop\Incr1.qvd (qvd);




Thanks

adamdavi3s
Master

Excellent

I just thought, from your comment above, that you might not have got it working.

Please remember to mark this as helpful or the correct answer if I have helped you or answered your question.