Hello support team,
I hope you are doing well.
Previously, you provided the following guidance as a precaution when performing Reload Target on a Log Stream Staging task:

"It has also been observed that the LogStream task is frequently being executed in “Fresh Start” mode (Reload Target). When performing differential replication, it is recommended to run the LogStream task in “Resume” mode, unless there is a specific reason not to do so."

Would it be possible for you to explain the reasoning behind this recommendation?
Best Regards.
It is generally not necessary to "reload" the parent/staging task unless tables have been added or removed, or there has been metadata corruption or loss in the log stream files.
When you say that you are frequently performing a fresh start / reload, do you mean the parent/staging task or the child/replication task? What are the circumstances in which you feel this is needed?
Thanks,
Dana
In Log Stream, Resume mode continues processing from the last saved stream position (LSN or SCN); the task keeps a checkpoint that tells it where to start when it resumes.
Fresh Start/Reload on this task refreshes the metadata and re-reads the logs from the beginning or from a defined timestamp.
So, as Dana recommended above, a fresh start is not needed unless there are changes to your metadata in the source.
Thanks
Naren
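To illustrate the checkpoint pattern Naren describes, here is a minimal Python sketch. It is a generic illustration only, not Replicate's actual mechanism or on-disk format, and the file name is hypothetical:

```python
import json
import os

CHECKPOINT_FILE = "stream_position.json"  # hypothetical file name

def save_position(position):
    """Persist the last confirmed stream position (e.g. an LSN or SCN)."""
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"position": position}, f)

def load_position(default=0):
    """Return the saved position, or a default when no checkpoint exists yet."""
    if not os.path.exists(CHECKPOINT_FILE):
        return default
    with open(CHECKPOINT_FILE) as f:
        return json.load(f)["position"]

# Resume: continue reading the source log from the saved checkpoint.
start_at = load_position()

# Fresh Start/Reload: ignore the checkpoint, refresh metadata, and re-read
# from the beginning of the log (or from a user-supplied timestamp).
```

The key point is that Resume never re-reads or skips anything: it simply continues from wherever the checkpoint left off.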
Hi @Dana_Baldwin, @narendersarva,
Thank you for your response.
The specific reasons may vary from customer to customer, so I'm afraid I cannot provide a concrete answer.
I understand that it is generally not necessary to reload the Log Stream Staging task frequently, but I would like to be able to explain to the customer what potential issues or drawbacks might occur if the task is reloaded repeatedly.
I would appreciate it if you could share any details regarding this.
Best Regards.
I'm not sure it causes direct problems with the tasks, but others might be able to comment as well. One thing to note is that a new timeline folder is created each time you "reload" the staging task, which I suspect leaves the older timeline folders and files on disk, wasting space.
Thanks,
Dana
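If you want to gauge how much disk the leftover timeline folders are consuming, a quick scan like the Python sketch below can help. The LOG_STREAM path and task name here are placeholders; the actual location depends on how your Log Stream target endpoint is configured:

```python
import os

# Placeholder path -- the real location depends on how the Log Stream
# target endpoint is configured in your environment.
LOG_STREAM_DIR = r"C:\Attunity\Replicate\data\tasks\MY_TASK\LOG_STREAM"

def folder_size_bytes(path):
    """Total size of all files under a folder (recursive)."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
    return total

# Report each timeline folder with its size, oldest first, so stale ones stand out.
for entry in sorted(os.scandir(LOG_STREAM_DIR), key=lambda e: e.stat().st_mtime):
    if entry.is_dir():
        size_mb = folder_size_bytes(entry.path) / (1024 * 1024)
        print(f"{entry.name}\t{size_mb:,.1f} MB")
```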
Hi @iti-attunity-sup ,
A "Staging Task" replicates changes from the source endpoint to a LogStream file. If you reload the staging task, it will run in "fresh start mode." This means the staging task will begin capturing changes from the current time, skipping any previous changes, which results in data loss.
Regards,
Desmond
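To make the data-loss window concrete, here is a toy Python sketch (purely illustrative, not Replicate internals). It models source changes as numbered log positions and compares what Resume and Fresh Start would capture:

```python
# Toy model: source changes as a stream of (position, payload) events.
changes = [(lsn, f"change-{lsn}") for lsn in range(1, 11)]

saved_checkpoint = 6   # last position the staging task confirmed before stopping
current_position = 9   # where the source log is "now" when the task restarts

# Resume: pick up at the saved checkpoint -- nothing is skipped.
resumed = [c for c in changes if c[0] > saved_checkpoint]

# Fresh start: begin at the current position -- events 7..9 are never captured.
fresh = [c for c in changes if c[0] > current_position]

print("resume captures:     ", [c[0] for c in resumed])   # [7, 8, 9, 10]
print("fresh start captures:", [c[0] for c in fresh])     # [10]
print("lost on fresh start: ",
      [c[0] for c in changes if saved_checkpoint < c[0] <= current_position])
```

Changes 7 through 9 fall in the gap between the old checkpoint and the restart time; under a fresh start they are simply never captured.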
Hi Dana,
Thank you for your response.
I found several Timeline folders under the following path:
Attunity\Replicate\data\tasks\<TASK_NAME>\LOG_STREAM
Could you please confirm if this is the correct location for the timeline folders?
Also, are there any concerns with manually deleting the old timeline folders?
Best Regards.
It depends entirely on how the Log Stream target endpoint is configured; that is how you can tell where the log stream files reside. There is no default location. FYI, it is usually a best practice to keep them on a separate disk from the data directory to avoid I/O contention.
There are no concerns with deleting the old timeline folders, as long as you know reloads have happened since they were created.
Thanks,
Dana
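If the old folders need cleaning up regularly, something like the following Python sketch could help. It is only a hedged example under the assumptions above: the path is a placeholder, it keeps the most recent timeline folder, and it defaults to a dry run so you can verify the list before anything is deleted:

```python
import os
import shutil

# Placeholder path -- use the location from your Log Stream endpoint settings.
LOG_STREAM_DIR = r"C:\Attunity\Replicate\data\tasks\MY_TASK\LOG_STREAM"
DRY_RUN = True  # flip to False only after verifying the dry-run output

# Sort timeline folders oldest-first by modification time.
timelines = sorted(
    (e for e in os.scandir(LOG_STREAM_DIR) if e.is_dir()),
    key=lambda e: e.stat().st_mtime,
)

# Delete everything except the most recent folder, which the task may still use.
for entry in timelines[:-1]:
    print(("Would delete: " if DRY_RUN else "Deleting: ") + entry.path)
    if not DRY_RUN:
        shutil.rmtree(entry.path)
```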