userid128223
Creator

Update with new time stamp

Hi Everyone.

I have a SOURCE QVD with the following type of records.

ID     Name      Amount    update_date
123    james     $300      Feb-01-2014
127    John      $234      Feb-01-2014
547    Alex      $555      Feb-02-2014

A new file from sales comes in on a daily basis with new and corrected records. Therefore I want to update any corrected records and append new records to the QVD above.

For example, today's sales file contains old and corrected records with an update date.

ID     Name      Amount    update_date
123    james     $300      Feb-01-2014    // gets dropped, as a newer entry for the same ID exists
123    james     $500      Feb-03-2014
127    John      $234      Feb-01-2014
547    Alex      $555      Feb-02-2014
432    Judy      $800      Feb-10-2014    // gets added as a new entry

My source should end up with all the new records plus all the updated records with the latest update date.

Final result:

ID     Name      Amount    update_date
123    james     $500      Feb-03-2014
127    John      $234      Feb-01-2014
547    Alex      $555      Feb-02-2014
432    Judy      $800      Feb-10-2014

Thanks.

1 Solution

Accepted Solutions
maxgro
MVP

Final:
NoConcatenate LOAD *
Resident Source                   // 1*
Where Peek(ID) <> ID              // 2*
Order By ID, update_date desc;    // 3*


1*: read from a previously loaded table (Resident = already in memory)

2* and 3*:

order by: I want to read the records sorted by ID and, for the same ID, by update_date descending

where Peek(ID) <> ID: filter that discards a row if its ID is the same as the previously read record's


example

ID    update_date    Peek(ID)
1     31-12-2010                  keep
1     10-10-2009     1
1     1-1-2007       1
2     1-1-2010       1            keep
2     1-1-2009       2
2     1-1-2008       2
3     12-10-2014     2            keep
4     10-10-2007     3            keep

result: the 4 rows marked "keep"
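The same "keep the first row per ID" logic can be sketched in Python (an illustration of the technique, not QlikView script; field names follow the example above):

```python
# Sketch of the Order By + Peek() technique: sort by ID ascending and
# update_date descending, then keep a row only when its ID differs from
# the previously read row's ID (what "where Peek(ID) <> ID" does).
rows = [
    {"ID": 123, "Amount": 300, "update_date": "2014-02-01"},
    {"ID": 123, "Amount": 500, "update_date": "2014-02-03"},
    {"ID": 127, "Amount": 234, "update_date": "2014-02-01"},
    {"ID": 547, "Amount": 555, "update_date": "2014-02-02"},
    {"ID": 432, "Amount": 800, "update_date": "2014-02-10"},
]

# "order by ID, update_date desc" -- ISO dates sort correctly as strings;
# two stable sorts emulate the compound sort key.
rows.sort(key=lambda r: r["update_date"], reverse=True)
rows.sort(key=lambda r: r["ID"])

final = []
prev_id = None  # plays the role of Peek(ID)
for r in rows:
    if r["ID"] != prev_id:  # "where Peek(ID) <> ID"
        final.append(r)     # first row per ID = latest update_date
    prev_id = r["ID"]

print([(r["ID"], r["Amount"]) for r in final])
# → [(123, 500), (127, 234), (432, 800), (547, 555)]
```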

from online help

peek(fieldname [ , row [ , tablename ] ] )

Returns the contents of the fieldname in the record specified by row in the internal table tablename. Data are fetched from the associative QlikView database.

Fieldname must be given as a string (e.g. a quoted literal).

Row must be an integer. 0 denotes the first record, 1 the second and so on. Negative numbers indicate order from the end of the table. -1 denotes the last record read.

If no row is stated, -1 is assumed.


18 Replies
maxgro
MVP

A simple way could be:

1) concatenate the SOURCE and DAILY files into one table

2) read that table ordered by ID and date descending, and filter using the Peek function (keep only the records where the ID changes)

userid128223
Creator
Author

Can you please share an example?

maxgro
MVP

SCRIPT

Source:
LOAD * INLINE [
ID, Name, Amount, update_date
123, james, 300, feb-01-2014
127, John, 234, feb-01-2014
547, Alex, 555, feb-02-2014
];

Concatenate
LOAD * INLINE [
ID, Name, Amount, update_date
123, james, 300, feb-01-2014
123, james, 500, feb-03-2014
127, John, 234, feb-01-2014
547, Alex, 555, feb-02-2014
432, Judy, 800, feb-10-2014
];

Final:
NoConcatenate LOAD *
Resident Source
Where Peek(ID) <> ID
Order By ID, update_date desc;

Drop Table Source;



RESULT

ID     Name     Amount    update_date
123    james    500       feb-03-2014
127    John     234       feb-01-2014
432    Judy     800       feb-10-2014
547    Alex     555       feb-02-2014
swuehl
MVP

So you get a daily delta file, right?

Like for today:

DATA:
LOAD * INLINE [
ID, Name, Amount, update_date
123, james, $500, Feb-03-2014
432, Judy, $800, Feb-10-2014
];


So, load the new data first, then concatenate your old data from the source.qvd, but exclude the IDs that already exist in the new DATA:


LOAD * INLINE [
ID, Name, Amount, update_date
123, james, $300, Feb-01-2014
127, John, $234, Feb-01-2014
547, Alex, $555, Feb-02-2014
] Where not exists(ID);

Then store the resulting table back to your source.qvd.

[Instead of the inline tables used for demonstration, use the QVD and your DB source as data input, i.e. replace the INLINE loads with FROM loads.]
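The not exists() approach boils down to simple set logic. A Python sketch of the idea (for illustration, not QlikView script; the sample values are taken from the thread):

```python
# Delta-load sketch: load the new/corrected daily rows first, then append
# only those old rows whose ID is not already present in the daily data
# (the role of "where not exists(ID)").
daily = [
    {"ID": 123, "Name": "james", "Amount": "$500", "update_date": "Feb-03-2014"},
    {"ID": 432, "Name": "Judy",  "Amount": "$800", "update_date": "Feb-10-2014"},
]
old = [
    {"ID": 123, "Name": "james", "Amount": "$300", "update_date": "Feb-01-2014"},
    {"ID": 127, "Name": "John",  "Amount": "$234", "update_date": "Feb-01-2014"},
    {"ID": 547, "Name": "Alex",  "Amount": "$555", "update_date": "Feb-02-2014"},
]

seen = {r["ID"] for r in daily}  # IDs already loaded from the daily file
merged = daily + [r for r in old if r["ID"] not in seen]

print(sorted((r["ID"], r["Amount"]) for r in merged))
# → [(123, '$500'), (127, '$234'), (432, '$800'), (547, '$555')]
```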

Not applicable

Hi,

If I understand correctly, you can try Where not exists(), something like this:

Sales:
Load
    ID,
    Name,
    Amount,
    update_date
From newinformation;

Load
    ID,
    Name,
    Amount,
    update_date
From information.qvd    // old information
Where not exists(ID, ID);

The example above keeps the current information and drops the old.

Best regards

userid128223
Creator
Author

Swuehl,

I have tried what you suggested. However, since the daily file contains one ID with records at two different time stamps, it will fetch both into the merged file.

So Daily has:

ID,  Name,  Amount, update_date
123, james, $300,   Feb-01-2014
123, james, $500,   Feb-03-2014
432, Judy,  $800,   Feb-10-2014

So if you try where not exists(ID), it will keep both records for James.
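A quick sketch (Python for illustration, not QlikView script) of why not exists(ID) alone is not enough here: the filter only excludes old rows, so both rows for ID 123 inside the daily file survive.

```python
# The daily file itself contains two rows for ID 123; "not exists" only
# filters the old data, so the duplicate inside the daily file remains.
daily = [
    {"ID": 123, "Amount": "$300", "update_date": "Feb-01-2014"},
    {"ID": 123, "Amount": "$500", "update_date": "Feb-03-2014"},
    {"ID": 432, "Amount": "$800", "update_date": "Feb-10-2014"},
]
old = [
    {"ID": 123, "Amount": "$300", "update_date": "Feb-01-2014"},
    {"ID": 127, "Amount": "$234", "update_date": "Feb-01-2014"},
    {"ID": 547, "Amount": "$555", "update_date": "Feb-02-2014"},
]

seen = {r["ID"] for r in daily}
merged = daily + [r for r in old if r["ID"] not in seen]

print(sum(1 for r in merged if r["ID"] == 123))  # → 2, both James rows kept
```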

Not applicable

Hope this helps you.

Table:
Load *
From QVD;

Concatenate
Load *
From source
Where not exists(PrimaryKey);   // your primary key field

maxgro
MVP

I assumed he has a duplicated ID (123) in the daily data as well.

userid128223
Creator
Author

Yes, you are correct. The daily data comes with duplicate IDs and new info, including an updated time stamp.