Hi team,
In my database, the data flow is: Input table --> staging table --> final dim table. When I do a full reload, the data goes directly to the staging table.
My doubt is: when I do an advanced reload, does the data flow the same way as a full reload, or does it behave like a normal update?
Could anybody please help with this?
Thanks
Ramu.
Hi @ramu123 ,
I assume that you are inquiring about the Advanced Run Options.
If you choose Advanced Run Options --> Date and Time, the Replicate task will perform a metadata refresh and gather only the changes from that specific point in time. Starting from a timestamp carries the risk of missing changes or receiving duplicate data.
To prevent the aforementioned problems, please follow these steps:
Ensure that you select the correct timestamp. This means using the timestamp when the task was stopped, accounting for any latency at that particular moment.
To prevent duplicates, it is recommended to enable UPSERT.
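To illustrate why UPSERT mode prevents duplicates when changes are re-delivered after resuming from a timestamp, here is a minimal sketch. It uses SQLite as a stand-in for the target table (Replicate's actual apply logic is internal to the product; the table and column names here are hypothetical):

```python
import sqlite3

# Hypothetical target dimension table, keyed on customer_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT)")

def apply_change(customer_id, name):
    # UPSERT: an INSERT that falls back to an UPDATE on a key conflict,
    # so re-delivering an already-applied change does not create a duplicate row.
    conn.execute(
        """INSERT INTO dim_customer (customer_id, name) VALUES (?, ?)
           ON CONFLICT(customer_id) DO UPDATE SET name = excluded.name""",
        (customer_id, name),
    )

# Simulate resuming from a timestamp that re-delivers an old change.
apply_change(1, "Alice")
apply_change(1, "Alice")   # duplicate delivery - harmless under UPSERT
apply_change(1, "Alicia")  # later update - applied in place

rows = conn.execute("SELECT customer_id, name FROM dim_customer").fetchall()
print(rows)  # a single row holding the latest value
```

With a plain INSERT, the second delivery would either fail on the primary key or (without a key) create a duplicate row; the UPSERT makes re-applied changes idempotent.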
Thanks,
Swathi
Hello @ramu123 ,
Thanks for the post.
You are right. A full reload (Reload Target) and the advanced reload (Advanced Run Options --> Reload Target) start the same way: the data flows to the same destination tables.
Hope this helps.
Regards,
John.
Hi John,
You're stating that when I go for the advanced resume option and select "from date", the data will flow directly to the staging tables, not the Input tables? Correct me if I'm wrong.
Sometimes the data is not in sync, or there is a mismatch between source and destination. Is there any solution to frequently check for or avoid this situation?
Could you please help with this?
Note: We are using SQL Server for both source and destination.
Thanks
Ramu.
Hello @ramu123
Problems where the data is not in sync, or where there is a mismatch between source and destination, require log analysis to find the cause. If you have a specific use case, please raise a support case and the TSEs will help you with this.
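In addition to log analysis, a simple recurring check is to compare row counts (or per-key checksums) between the source and target tables. A minimal sketch, using SQLite in place of the two SQL Server databases and a hypothetical `orders` table:

```python
import sqlite3

# Stand-ins for the source and destination databases (SQL Server in practice).
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
for db in (source, target):
    db.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")

source.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0), (3, 30.0)])
target.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])  # one row missing

def row_count(db, table):
    # A cheap first-pass consistency check: compare total row counts.
    return db.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

src_count = row_count(source, "orders")
tgt_count = row_count(target, "orders")
in_sync = src_count == tgt_count
print(f"source={src_count} target={tgt_count} in_sync={in_sync}")
```

On SQL Server you would run the equivalent `COUNT(*)` queries on both sides, and for a key-level diff you could use an `EXCEPT` query between the two tables; scheduling such a check gives an early warning of drift between source and destination.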
Thanks,
Sushil Kumar