Hi,
I have a table in the data warehouse with a datetime field (upload_datetime).
As it stands, I have just done a full insert of the data, so every record currently has the same value: 2022-06-20 20:30:00.
This table has c. 500k records.
What I need to do on the next run is update or insert, but I don't want to process the full data set again if at all possible. What I was hoping to do is only update or insert rows where the datetime field is greater than that last datetime (i.e. 2022-06-20 20:30:00).
Is that possible, or if not, how can I improve the performance of each refresh? This table is refreshed every half an hour, and checking back through that many records each time is too time-consuming.
This is my job design: TDBInput ---------> TMap ---------> TDBOutput
Any help will be much appreciated.
Hi
At the beginning of the job, select the last datetime (i.e. 2022-06-20 20:30:00) from the target table and store the value in a context variable. This value can then be used as the filter condition on the input data, provided the input data also has a datetime field.
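In case it helps to see the pattern outside the Talend components, here is a rough JDBC sketch of the same idea. The connection URL, credentials and the table names (source_table, target_table) are placeholders, not your actual schema. In the job itself, step 1 corresponds to a small subjob (tDBInput + tJavaRow or tSetGlobalVar) that assigns the value to a context variable, step 2 is the main tDBInput whose query references that variable in its WHERE clause, and the tDBOutput's "Action on data" is set to insert or update.

```java
import java.sql.*;

public class IncrementalLoadSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details - replace with your own database settings.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/dwh", "user", "password")) {

            // Step 1: read the high-water mark (last loaded datetime) from the target table.
            Timestamp lastLoaded;
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery(
                         "SELECT MAX(upload_datetime) FROM target_table")) {
                rs.next();
                lastLoaded = rs.getTimestamp(1); // null if the target table is empty
            }

            // Step 2: pull only source rows newer than the high-water mark.
            // If the target table is empty, fall back to a full load.
            String sql = (lastLoaded == null)
                    ? "SELECT * FROM source_table"
                    : "SELECT * FROM source_table WHERE upload_datetime > ?";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                if (lastLoaded != null) {
                    ps.setTimestamp(1, lastLoaded);
                }
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        // ... map and insert-or-update each row into the target table
                    }
                }
            }
        }
    }
}
```

With this approach each half-hourly run only reads the rows added since the previous run, instead of re-scanning all 500k records.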
Regards
Shong