
Incremental Load with WHERE NOT EXISTS() - QVD files

Jesse Maitland

Hello,

I am trying to do an incremental load (insert and update) from a SQL Server 2008 database combined with an existing QVD file. I have copied the syntax from here:

https://help.qlik.com/en-US/sense/1.1/Subsystems/Hub/Content/LoadData/UsingQVDFilesIncrementalLoad.htm

I'm running into an issue with WHERE NOT EXISTS(). My load script looks as follows:

Table_name:
SELECT
    ID,
    Value1,
    Value2
FROM
    Table
WHERE
    UpdateTime >= '$(LastUpdateTime)';

CONCATENATE LOAD
    ID,
    Value1,
    Value2
FROM
    [lib://MyQVD/Table_name.qvd](qvd)
WHERE NOT EXISTS(ID);

STORE Table_name INTO [lib://QVD/Table_name.qvd](qvd);
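
In case it matters: $(LastUpdateTime) is set earlier in my script (not shown here). For a self-contained test it could simply be hard-coded, e.g.:

// Hypothetical stand-in for however LastUpdateTime is actually derived
LET LastUpdateTime = '2016-09-01 00:00:00';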

       

       

I know for a fact that Table_name returned from the database contains 23 records. I first created the QVD file by omitting the CONCATENATE portion of the above script, and all 23 records were successfully stored in the QVD.
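
For reference, that initial load was just the SQL and STORE portions of the script above, i.e. roughly:

Table_name:
SELECT
    ID,
    Value1,
    Value2
FROM
    Table
WHERE
    UpdateTime >= '$(LastUpdateTime)';

STORE Table_name INTO [lib://QVD/Table_name.qvd](qvd);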

       

The problem is that when I include the CONCATENATE block and run the load script again, it loads all 23 rows from the QVD file, even though I know for a fact that those primary key ID values already exist in the first set loaded from the SQL query. This leads me to believe there is a problem with either my WHERE clause or the EXISTS() function.

       

I also sometimes get a strange number of rows back from the QVD. For example, I have run the above script when the SQL query returned 0 rows, yet the CONCATENATE LOAD read only 7 rows from the QVD file, when in that case it should have read all 23.
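
(For anyone trying to reproduce this: the row counts can be double-checked with a trace after the load; a minimal sketch, where vRows is just a scratch variable:)

// Hypothetical sanity check: report how many rows the table holds after the load
LET vRows = NoOfRows('Table_name');
TRACE Table_name now contains $(vRows) rows;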

       

Perhaps I don't understand how the EXISTS() function works?
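
My working assumption is that Exists(ID) checks each incoming ID against every ID value loaded so far in the script, across all tables. A toy sketch (not my real data) of the behaviour I would expect:

// Toy example of the behaviour I expect from Exists()
IDs:
LOAD * INLINE [
ID
1
2
];

NewIDs:
NoConcatenate
LOAD * INLINE [
ID
2
3
]
WHERE NOT EXISTS(ID); // only ID 3 should survive, since 1 and 2 are already loaded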

       

I created the QVD files with Qlik Sense 3.1.

       

Thank you,

Jesse