Hello guys
Are there any drawbacks to using Log Stream? I know it is mainly used to relieve stress on the source when replicating to multiple databases, but it also brings other benefits, such as keeping the CDC changes stored on the server for a configurable period. So my question is: are there any drawbacks to using it? Should I always consider it, even when replicating to just two different databases?
Are there any drawbacks performance- or memory-wise?
Kind regards!
Hello @guilherme-matte,
Thanks for reaching out to Qlik Support.
We highly recommend going through the user guide for more information on how the Log Stream task works and its limitations.
Yes, you are right that it is efficient and low-impact: a Log Stream task reads the transaction logs of the source database, which minimizes the impact on the source system's performance. It doesn't require full table scans or triggers on tables, resulting in efficient data replication.
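To make that fan-out concrete, here is a minimal sketch (plain Python for illustration; the target count is an assumption, and this is not Qlik code): without Log Stream, every replication task opens its own log reader on the source, while with Log Stream only the Staging task does.

```python
# Illustrative only: how Log Stream reduces load on the source.
# Direct replication: each of N tasks reads the source transaction log.
# Log Stream: one Staging task reads it once; N child tasks read the
# local staging folder on the Replicate server instead.

NUM_TARGETS = 5  # assumption: five target endpoints

direct_source_readers = NUM_TARGETS      # one source log reader per task
logstream_source_readers = 1             # only the Staging task reads the source
staging_folder_readers = NUM_TARGETS     # child tasks read local staging files

print(f"Direct:     {direct_source_readers} log reader(s) on the source")
print(f"Log Stream: {logstream_source_readers} log reader(s) on the source, "
      f"{staging_folder_readers} local reader(s) on the staging folder")
```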
Below are a few points to consider; you can find the full list of limitations in the user guide:
Increased resource requirements: The Log Stream task captures changes in real time, which requires additional system resources, such as CPU, memory, and disk space for the staging folder, to process and replicate the data. Consider these requirements and ensure the Replicate server has enough capacity to handle the workload.
Compatibility limitations: While Qlik Replicate supports various databases, certain versions or editions might have compatibility limitations with the Log Stream task. It's important to verify compatibility before implementing it.
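On the resource-sizing point above, a rough back-of-the-envelope sketch can help estimate the staging folder's disk footprint. Every number here is an assumption for illustration, not Qlik guidance:

```python
# Rough staging-folder sizing (illustrative assumptions throughout).
# The staging folder must hold the captured CDC changes for the whole
# configured retention window.

change_rate_mb_per_hour = 500   # assumed average change volume from the source
retention_hours = 48            # assumed Log Stream retention period
compression_ratio = 0.5         # assumed compression of staging files

staging_gb = change_rate_mb_per_hour * retention_hours * compression_ratio / 1024
print(f"Estimated staging folder size: ~{staging_gb:.1f} GB")
# Leave headroom for peak periods such as batch jobs or month-end loads.
```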
It's important to evaluate your specific requirements, database environment, and available resources to determine whether a Log Stream task in Qlik Replicate is the right solution for your data replication needs.
We hope the above was helpful and remain at your disposal for any other questions, doubts, or concerns you may have.
Thanks & Regards,
Arun
Hello Team,
The drawback we see with Log Stream is an increase in complexity: replication is split into two stages (source to staging, then staging to target) instead of running directly from source to target.
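A minimal sketch of what that means operationally (a simplified model, assuming each stage's latency is measured independently; not a Qlik metric): end-to-end lag is now the sum of two tasks, and both have to be monitored:

```python
# Simplified model: with Log Stream, a change passes through the
# Staging task and then a child task, so source-to-target latency is
# roughly the sum of the two stages.

def end_to_end_latency(staging_latency_s: float, child_latency_s: float) -> float:
    """Combine the independently measured latencies of the two stages."""
    return staging_latency_s + child_latency_s

# Example: 5 s on the Staging task plus 20 s on a child task.
print(end_to_end_latency(5.0, 20.0), "seconds source-to-target")
```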
Regards,
Shivananda
Thank you Arun for the details. We had a use case where my source is EDB PostgreSQL with multiple targets. In Log Streaming, the Staging task defaults to transactional apply, whereas Batch Optimized Apply is generally recommended for PostgreSQL.
-> If I change Batch Optimized Apply to Transactional Apply in my child task, I run into latency issues; if I keep Batch Optimized Apply in the child task, I may encounter duplicate issues, since the Staging task is in transactional apply.
-> Do we have any performance techniques for a Log Streaming task with referential integrity (FK and PK relations)? We noticed some deletes are not happening.
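For readers following along, here is a generic, simplified illustration of why a batch-optimized (grouped, per-table) apply can clash with FK relations. This is assumed, generic CDC behavior, not Qlik's actual apply algorithm: when changes are regrouped by table instead of replayed in commit order, a parent delete can be attempted while the dependent child row still exists, and the target's FK constraint rejects it:

```python
# Generic illustration (assumed behavior, not Qlik's apply algorithm):
# transactional apply preserves commit order across tables, while a
# batch-optimized apply groups changes per table, which can apply a
# parent delete before the dependent child delete.

changes = [  # commit order as read from the source log
    ("DELETE", "child",  {"id": 10, "parent_id": 1}),
    ("DELETE", "parent", {"id": 1}),
]

# Transactional apply: replay in commit order -> child removed first, FK holds.
for op, table, row in changes:
    print("transactional:", op, table, row)

# Batch-optimized apply: regroup by table; if the 'parent' batch happens
# to be applied first, the FK on the target rejects the parent delete.
by_table = {}
for op, table, row in changes:
    by_table.setdefault(table, []).append((op, row))

for table in ("parent", "child"):  # arbitrary per-table batch order
    for op, row in by_table[table]:
        print("batch:", op, table, row)
```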