Hi,
I need to load daily data into .QVD files. It's important to load day by day, as new days could contain identical IDs from old files that need to be updated.
What I do currently as pseudo-code:
FOR Every Day
Table:
LOAD Daily_File
CONCATENATE .qvd WHERE ID isn't in File
Store into .qvd
Drop Table
NEXT Day
This works just fine initially to load many days into the .qvd. However, as the .qvd grows bigger and bigger, it becomes very inefficient to load, save and flush the .qvd 30 times just to load one month's worth of data.
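Written out as actual script, one pass of that loop looks roughly like this (the file names, field names and day list below are only placeholders for my real ones):

FOR Each vDay in '20240101', '20240102', '20240103'   // placeholder list of days

  Table:
  LOAD ID, Value
  FROM [Daily_$(vDay).csv] (txt, utf8, embedded labels, delimiter is ',');

  // append the existing history, keeping only IDs not already in today's file
  IF Not IsNull(QvdCreateTime('History.qvd')) THEN
    Concatenate (Table)
    LOAD ID, Value
    FROM History.qvd (qvd)
    WHERE Not Exists(ID);
  END IF

  STORE Table INTO History.qvd (qvd);
  DROP TABLE Table;

NEXT vDay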
My idea would be to do the above for day 1 only and the following from then on:
FOR Every Day
Table:
LOAD Daily_File
CONCATENATE RESIDENT Table WHERE ID isn't in File
NEXT Day
Store into .qvd
The issue with that is that it creates "Table-1", "Table-2", etc., because the table name already exists. Is there a way to open an existing table, or a better way to solve my problem?
Thanks!
Oli
I suppose that if your code is something like this:
FOR Every Day
Table:
LOAD Daily_File
WHERE Not Exists(ID)
NEXT Day
Store into .qvd
It should work, because QV will automatically concatenate each load into the same table and only include the IDs that don't yet exist. But the auto-concatenate will only happen if the daily files' structure (columns) is identical.
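A rough sketch of what I mean, assuming the daily files are CSVs with identical columns (all names below are placeholders, not your real ones). Note the reverse order: since Not Exists keeps the first occurrence of an ID, loading the newest day first means an updated ID is taken from the most recent file:

// newest day first, so an ID that was updated keeps the row from the newest file
FOR Each vDay in '20240103', '20240102', '20240101'

  Table:
  LOAD ID, Value
  FROM [Daily_$(vDay).csv] (txt, utf8, embedded labels, delimiter is ',')
  WHERE Not Exists(ID);    // skip IDs already loaded from a newer day

NEXT vDay

STORE Table INTO Table.qvd (qvd);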
Perhaps look at doing an incremental load.
You didn't specify what data source you're working with, so for simplicity's sake I'm going to assume an SQL table.
This process requires a modification date/time field so you can tell when a record was last updated.
Table:
SQL SELECT ID, Daily_File
FROM DB_TABLE
WHERE ModificationTime >= #$(vLastExecTime)#;
Concatenate LOAD
ID,
Daily_File
FROM Table.qvd (qvd)
WHERE Not Exists(ID);

STORE Table INTO Table.qvd (qvd);
** vLastExecTime is the last reload's timestamp (e.g. yesterday's date), and can be derived by taking max(ModificationTime) from the existing qvd file (i.e. Table.qvd).
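For example, something along these lines, run before the SQL SELECT, could populate it (same assumed names as above; format the value to match whatever literal your database expects):

// read the newest modification time already stored in the QVD
MaxTime:
LOAD Max(ModificationTime) AS MaxModTime
FROM Table.qvd (qvd);

LET vLastExecTime = Peek('MaxModTime', 0, 'MaxTime');
DROP TABLE MaxTime;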
This process does not handle deletions when records are deleted from the source data. An additional inner join will need to be performed if this is required.
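Roughly, inserted just before the STORE statement above (again using the assumed names):

// keep only the IDs that still exist in the source; anything deleted there falls out of the QVD
Inner Join (Table)
SQL SELECT ID
FROM DB_TABLE;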
S.
Sorry, Shawn & Fernando, for interrupting this thread.
I have the same issue — the loads are not concatenating, but new records keep being added. However, I'm only working from CSV files to QVD, not SQL Server.
Please help me fix this issue.
Thanks
Nandhakumar