Anonymous
Not applicable

Incremental Load Issue

I thought I had my incremental load working fine, but I am having an issue with the "where not exists" clause. It appears the load isn't returning all the data I expect.

I have a table in memory called PPTemp. This contains the current day's data, including changes to already reported data. I.e.

%Key                Synd_Line_No    Amount
20120101_12345_1    1               1000
20120101_12345_1    2               1000
20120101_12345_1    3               1000
20120101_12345_1    4               1000
20120101_12345_1    5               1000

I then have a QVD file containing all data processed on previous days:

%Key                Synd_Line_No    Amount
20120101_12345_1    1               500
20120101_12345_1    2               500
20120101_12345_1    3               500
20120101_12345_1    4               500
20120101_12345_1    5               500
20120101_67890_1    1               2000
20120101_67890_1    2               2000
20120101_67890_1    3               2000
20120101_67890_1    4               2000
20120101_67890_1    5               2000
20120131_54321_1    1               5000
20120131_54321_1    2               5000
20120131_54321_1    3               5000
20120131_54321_1    4               5000
20120131_54321_1    5               5000

So as you can see, the data I am loading contains transaction 20120101_12345_1 again, because the amount has changed.

I have the following script:

Concatenate (PPTemp) LOAD *
From $(vDriveLetter)$(vOutputQvd)$(vFileName)(qvd)
Where not Exists(%Key);

I would expect 10 rows to be concatenated onto PPTemp from the QVD file, but I am only getting 2. I am getting each distinct %Key that doesn't already exist, but only one Syndicate Line of data for each.
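
For what it's worth, here is an inline sketch of the same scenario, with inline tables standing in for my real extract and the history QVD (illustrative only, not my actual script):

// Today's data: key 20120101_12345_1 reloaded with the new amounts.
PPTemp:
LOAD * INLINE [
%Key, Synd_Line_No, Amount
20120101_12345_1, 1, 1000
20120101_12345_1, 2, 1000
20120101_12345_1, 3, 1000
20120101_12345_1, 4, 1000
20120101_12345_1, 5, 1000
];

// History rows in place of the QVD. Only the 10 rows for keys
// 20120101_67890_1 and 20120131_54321_1 should come through the where clause.
Concatenate (PPTemp)
LOAD * INLINE [
%Key, Synd_Line_No, Amount
20120101_12345_1, 1, 500
20120101_12345_1, 2, 500
20120101_12345_1, 3, 500
20120101_12345_1, 4, 500
20120101_12345_1, 5, 500
20120101_67890_1, 1, 2000
20120101_67890_1, 2, 2000
20120101_67890_1, 3, 2000
20120101_67890_1, 4, 2000
20120101_67890_1, 5, 2000
20120131_54321_1, 1, 5000
20120131_54321_1, 2, 5000
20120131_54321_1, 3, 5000
20120131_54321_1, 4, 5000
20120131_54321_1, 5, 5000
] Where not Exists(%Key);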

Has anyone seen this problem before, and can you suggest a way to go about fixing it?

Many thanks

2 Replies
rwunderlich
Partner Ambassador/MVP

The problem arises because not Exists() on a single field picks up only one row for each key from the QVD: each %Key value is added to the field as soon as its row is loaded, so every subsequent row with the same key then fails the not Exists() test. The workaround is to copy the delta keys into a separate field and use the two-argument form of Exists() against that field. For example:

// Copy today's keys into a field of their own. The QVD load below never
// adds values to %tempKey, so the test stays stable for every row.
keys_temp:
LOAD %Key as %tempKey RESIDENT PPTemp;

// Append every historic row whose %Key is not among today's keys.
Concatenate (PPTemp) LOAD *
From $(vDriveLetter)$(vOutputQvd)$(vFileName)(qvd)
Where not Exists(%tempKey, %Key);

DROP TABLE keys_temp;

Or use Qlikview Components http://qlikviewcomponents.org like this:

SET Qvc.Loader.v.KeyFieldIsUnique = 0;

CALL Qvc.UpdateQvd ('PPTemp', '$(vDriveLetter)$(vOutputQvd)$(vFileName)', '%Key');

-Rob

http://robwunderlich.com

Anonymous
Not applicable
Author

Many thanks, I'll give this a try next time. As I was up against it somewhat, I reversed the logic for this exercise.

I.e.

- Stored the working data into a temporary QVD

- Created a unique list of 'old' keys that hadn't been worked on

- Loaded from the QVDs where these 'old' keys exist

- Concatenated the working data back in again

Seems to be working fine and I am not losing any of the optimisation.
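
In script terms, the reversed approach looks roughly like this. The file and table names below are placeholders rather than the exact production script:

// 1. Park today's worked data in a temporary QVD.
STORE PPTemp INTO [$(vDriveLetter)$(vOutputQvd)PPTemp_Working.qvd] (qvd);

// 2. Build a unique list of the 'old' keys, i.e. keys in the history QVD
//    that were not worked on today. PPTemp is still in memory here, so
//    not Exists(%Key) filters out today's keys.
OldKeys:
LOAD DISTINCT %Key
From $(vDriveLetter)$(vOutputQvd)$(vFileName)(qvd)
Where not Exists(%Key);

// 3. Drop today's data so the %Key field holds only the 'old' keys.
DROP TABLE PPTemp;

// 4. Reload history for those keys only. A single-field Exists() is the
//    one where clause that keeps a QVD load optimised.
PP:
LOAD *
From $(vDriveLetter)$(vOutputQvd)$(vFileName)(qvd)
Where Exists(%Key);

DROP TABLE OldKeys;

// 5. Bring today's worked data back in (fully optimised, no where clause)
//    and store the combined table back to the history QVD.
Concatenate (PP)
LOAD *
From [$(vDriveLetter)$(vOutputQvd)PPTemp_Working.qvd] (qvd);

STORE PP INTO $(vDriveLetter)$(vOutputQvd)$(vFileName)(qvd);

The only non-optimised read is the small DISTINCT key list; both full-width QVD loads stay optimised.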