Hi,
In my case I have a QVD database of 3M rows (shipments).
Every day the status of some of these shipments changes and needs to be updated in the QVD file.
At the moment I do an incremental load of a large file and update the rows. This works, but it takes far too long (600k rows a day, while only 200k of them are actually updated).
My table contains 25 fields; the update file contains 7 fields (6 fields need to be updated, 1 is the key, the same as in the QVD file).
In Excel it would be INDEX('status', MATCH('Id', 'Id', 0)).
Database:
LOAD Id,
[Forwarder Ref],
[Forwarder Track],
[Customer Ref],
[Project Name],
Performance,
[SLA Pickup Date],
[Actual Pickup Date],
[Actual Pickup Time],
[Pickup Postal Code],
[Pickup City],
[Pickup Country],
Forwarder,
[SLA Delivery Date],
[Actual Delivery Date],
[Actual Delivery Time],
[Delivery Name],
[Delivery Postal Code],
[Delivery City],
[Delivery Country],
Status,
[Last Event Date],
[Last Event],
Rootcause,
Responsible
FROM ****;
Update:
LOAD Shipment_ID as Id,
Actual_Delivery_Date as [Actual Delivery Date],
Performance,
Actual_Pickup_Date as [Actual Pickup Date],
Shipment_Status as Status,
Event as [Last Event],
Event_Date as [Last Event Date]
FROM ***;
Could someone please help me?
First load the records that need to be updated.
Then left join the other 18 (25 - 7) fields from the database onto those records.
Finally, concatenate the 25 fields from the database using WHERE NOT EXISTS(Id) (see the sketch below).
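A minimal sketch of those three steps in load-script form. The file names Updates.qvd and Shipments.qvd, and the table and field names Shipments, UpdatedKeys and UpdatedId, are placeholders, and it assumes Id is unique per shipment. Because Exists() looks at every value loaded so far (which can include the Ids read during the join), the sketch keeps the updated keys in a separate field and uses the two-argument form of Exists() to be safe:

// Keep the updated keys under their own field name for the Exists() check later.
UpdatedKeys:
LOAD Shipment_ID as UpdatedId
FROM [Updates.qvd] (qvd);

// 1. Load the records that need to be updated (the 7 fields from the update file).
Shipments:
LOAD Shipment_ID as Id,
     Actual_Delivery_Date as [Actual Delivery Date],
     Performance,
     Actual_Pickup_Date as [Actual Pickup Date],
     Shipment_Status as Status,
     Event as [Last Event],
     Event_Date as [Last Event Date]
FROM [Updates.qvd] (qvd);

// 2. Left join the remaining 18 fields from the existing QVD onto those records.
LEFT JOIN (Shipments)
LOAD Id, [Forwarder Ref], [Forwarder Track], [Customer Ref], [Project Name],
     [SLA Pickup Date], [Actual Pickup Time], [Pickup Postal Code], [Pickup City],
     [Pickup Country], Forwarder, [SLA Delivery Date], [Actual Delivery Time],
     [Delivery Name], [Delivery Postal Code], [Delivery City], [Delivery Country],
     Rootcause, Responsible
FROM [Shipments.qvd] (qvd);

// 3. Concatenate the unchanged records: keep only Ids that were not updated.
CONCATENATE (Shipments)
LOAD *
FROM [Shipments.qvd] (qvd)
WHERE NOT EXISTS(UpdatedId, Id);

DROP TABLE UpdatedKeys;

// Write the result back so the next run starts from the updated QVD.
STORE Shipments INTO [Shipments.qvd] (qvd);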
If the left join takes too much time to calculate, you could split your problem into two separate tables: one containing only the fields that need to be updated (first load the updated records, then the 'old' records), and one with the other fields loaded straight from the database (see the second sketch below).
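A sketch of that two-table variant, with the same caveat that the file and table names (Updates.qvd, Shipments.qvd, ShipmentStatus.qvd, ShipmentStatus, ShipmentDetails) are placeholders and that Id is assumed unique. The volatile fields live in one table, the static fields in another, and the two associate on Id without any join:

// ShipmentStatus.qvd holds the status values from the previous run; on the very
// first run it can be created once from the corresponding fields in Shipments.qvd.
ShipmentStatus:
LOAD Shipment_ID as Id,
     Actual_Delivery_Date as [Actual Delivery Date],
     Performance,
     Actual_Pickup_Date as [Actual Pickup Date],
     Shipment_Status as Status,
     Event as [Last Event],
     Event_Date as [Last Event Date]
FROM [Updates.qvd] (qvd);

// Then the 'old' records for every shipment that was not updated today.
CONCATENATE (ShipmentStatus)
LOAD Id, [Actual Delivery Date], Performance, [Actual Pickup Date],
     Status, [Last Event], [Last Event Date]
FROM [ShipmentStatus.qvd] (qvd)
WHERE NOT EXISTS(Id);

STORE ShipmentStatus INTO [ShipmentStatus.qvd] (qvd);

// The 18 fields that never change are loaded straight from the big QVD;
// the two tables associate on Id in the data model, so no join is needed.
ShipmentDetails:
LOAD Id, [Forwarder Ref], [Forwarder Track], [Customer Ref], [Project Name],
     [SLA Pickup Date], [Actual Pickup Time], [Pickup Postal Code], [Pickup City],
     [Pickup Country], Forwarder, [SLA Delivery Date], [Actual Delivery Time],
     [Delivery Name], [Delivery Postal Code], [Delivery City], [Delivery Country],
     Rootcause, Responsible
FROM [Shipments.qvd] (qvd);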
Kind Regards
Jan