I am trying to tweak the reference guide's incremental load example (case 4: insert, update and delete - as on p512 of the attached reference manual) as follows:
Instead of a database table with records, I want to use a folder of Excel files. The incremental load should read in only new/updated files, and any files deleted since the last execution should not be loaded. For the moment I'm just trying to get a single QV table with two fields: FileName() as File and FileTime() as ModifiedDate. So my equivalent of the PrimaryKey field is FileName(), my equivalent of DB_TABLE is (roughly) the folder of files, and instead of the DB field "ModificationTime" I'm using FileTime().
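To make that concrete, here is roughly the shape of the initial full-list load I'm aiming for. (The folder path and the names Files / Files.qvd / vFile are placeholders for illustration, not necessarily what's in my attached script.)

```
// First run: build the complete file list and store it as a QVD.
// FileName()/FileTime() with no argument only work inside a load
// from a file, so here I loop over the folder with FileList()
// and pass each path explicitly.
FOR EACH vFile IN FileList('C:\SourceFiles\*.xls*')
    Files:
    LOAD
        '$(vFile)'           AS File,
        FileTime('$(vFile)') AS ModifiedDate
    AUTOGENERATE 1;
NEXT vFile

STORE Files INTO Files.qvd (qvd);
```

The repeated LOADs auto-concatenate into one table because the field lists are identical.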
Later on, once this simple file-list load works, I'm going to loop over each file in the list and read in the data from each one (I've done this before successfully, but without the incremental load). For now, though, I can't even get the incremental load working on the file list itself.
My script is attached. I started the process off by commenting out lines 23-24, 32-36 and 47-51, so a reload just gets the complete file list (and creates an initial QVD with the full list). I then uncommented 23-24 and 32-36, and this works: it picks up any new files (18-26) plus any "old" files that haven't been updated or added since the last execution. After such a reload I get the QVD I've attached.
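For reference, this is my understanding of what that uncommented part should be doing, sketched with placeholder names (CurrentFiles, Files, and an assumed variable vLastExecTime holding the previous reload time; my actual line numbers/names may differ, see the attached script):

```
// Scan the folder into its own table. I use distinct field names
// (ScanFile/ScanDate) so that the Exists() test further down only
// matches rows already taken into the Files table.
FOR EACH vFile IN FileList('C:\SourceFiles\*.xls*')
    CurrentFiles:
    LOAD
        '$(vFile)'           AS ScanFile,
        FileTime('$(vFile)') AS ScanDate
    AUTOGENERATE 1;
NEXT vFile

// New/updated: files modified since the last execution.
Files:
NoConcatenate
LOAD ScanFile AS File, ScanDate AS ModifiedDate
RESIDENT CurrentFiles
WHERE ScanDate >= '$(vLastExecTime)';

// Old: rows from the previous QVD not already loaded above.
CONCATENATE (Files)
LOAD File, ModifiedDate
FROM Files.qvd (qvd)
WHERE NOT Exists(File);
```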
My problem comes when I uncomment lines 47-52, the inner join, which I believe is supposed to cut out any files that have been deleted since the last script execution: the inner join returns the intersection of the current full file list with the new/updated/old set, so deleted files (which can only have come in via the "old" rows from the QVD) are removed. When I reload with this part uncommented there are no errors, but all my data disappears, as if the intersection were empty. That can't be right, though: I've run the script with everything commented out EXCEPT 48-52 (the table we are inner joining), and that gives the same set of data as running with ONLY 47-52 commented out.
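In case it helps pin down where I'm going wrong, this is the effect I expect from that inner join, continuing the placeholder names from above (CurrentFiles being the full current folder scan):

```
// Deletion handling: keep only files still present in the folder.
// The inner join on File should drop any row in Files whose file
// no longer appears in the current scan.
INNER JOIN (Files)
LOAD ScanFile AS File
RESIDENT CurrentFiles;

DROP TABLE CurrentFiles;
STORE Files INTO Files.qvd (qvd);
```

One thing I'm unsure about is whether a field-name mismatch between the joined table and Files could silently produce an empty (or cartesian) result rather than an error.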
I hope this makes sense.
I've attached the source files I've been using too so you can test it yourself.
Thanks very much for your help - any explanation would be really appreciated!