My data is:

ID | SalesPerson | Country | City
11 | Kalam       | India   | Guntur
2  | Kalam       | India   | Guntur
3  | Kalam       | India   | Guntur
4  | Kalam       | India   | Guntur
5  | Kalam       | India   | Vijayawada
6  | Kalam       | India   | Vijayawada
7  | Kalam       | India   | Vijayawada
8  | Kalam       | India   | Vijayawada
1  | Kalam       | India   | Bangalore
11 | Kalam       | India   | Bangalore
10 | Kalam       | India   | Bangalore
Now, how do I do an incremental load on this?
Hi Gireesh,
You can achieve incremental loading by concatenating all columns from both the input feed and the QVD file, and then joining on that concatenated value. Please see below and let me know.
Thanks,
Sreeman.
An incremental load doesn't base its activity on the QVD creation date per se, but on the "increment" concept: add new records and replace whatever has changed. That's what you need a datetime value for: to know which records are more recent, and to find out which existing records have been modified.
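With such a datetime field, the usual pattern looks something like the sketch below. This is only an illustration: the table name Sales, the field ModifiedDate, the file Sales.qvd, and the variable vLastExecTime are assumptions, and vLastExecTime would have to be saved from the previous reload (for example with LET vLastExecTime = ReloadTime()).

// Hedged sketch: insert + update using a datetime field.
// Assumes a source table "Sales" with a unique ID and a ModifiedDate,
// and an existing history file Sales.qvd.

Sales:
SQL SELECT ID, SalesPerson, Country, City, ModifiedDate
FROM Sales
WHERE ModifiedDate >= '$(vLastExecTime)';   // new and changed rows only

// Append the unchanged history. WHERE NOT Exists(ID) skips any ID already
// loaded above, so the fresher copy of a modified row wins.
Concatenate (Sales)
LOAD ID, SalesPerson, Country, City, ModifiedDate
FROM Sales.qvd (qvd)
WHERE NOT Exists(ID);

STORE Sales INTO Sales.qvd (qvd);

Reading from the QVD with WHERE NOT Exists() on a single field keeps the load optimized, which is the main reason this pattern is preferred over joins.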
If you don't have this datetime information, you can only remove deleted rows and add rows that weren't present before the current reload. The former can be accomplished by performing an INNER JOIN on the ID field; the latter can be done by first loading all rows with IDs larger than the highest one in the history QVD, and then adding everything from that same history QVD.
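Those two steps can be sketched as follows. Again, the names (Sales, Sales.qvd, vMaxID) are placeholders, and this only works if ID is numeric and ever-increasing:

// Hedged sketch without a datetime field: handles additions and deletions,
// but cannot detect modified rows.

// 1. Find the highest ID already stored in the history QVD.
MaxKey:
LOAD Max(ID) AS MaxID FROM Sales.qvd (qvd);
LET vMaxID = Peek('MaxID', 0, 'MaxKey');
DROP TABLE MaxKey;

// 2. Load only rows newer than the history.
Sales:
SQL SELECT ID, SalesPerson, Country, City
FROM Sales
WHERE ID > $(vMaxID);

// 3. Add the history back.
Concatenate (Sales)
LOAD ID, SalesPerson, Country, City FROM Sales.qvd (qvd);

// 4. Remove rows deleted from the source: keep only IDs that still exist there.
Inner Join (Sales)
SQL SELECT ID FROM Sales;

STORE Sales INTO Sales.qvd (qvd);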
If you do have a real primary key field in your source data, you can make your incremental load complete by using this primary key to do the same. But row modifications won't be easy to spot without a datetime field.
Peter
PS: if you concatenate all your fields and check on that concatenated value against your source data, you might as well do a full reload every time.