guilherme-matte
Partner - Creator

LogStream Usage

Hello guys,

Are there any drawbacks to using LogStream? I know it is mainly used to relieve stress on the source when replicating to multiple different databases, but it also brings other benefits, such as having the CDC changes stored on the server for a configurable period. So my question is: are there any drawbacks to using it? Should I always consider it, even when replicating to just two different databases?

Are there drawbacks performance- or memory-wise?

Kind regards!

1 Solution

Accepted Solutions
Arun_Arasu
Support

Hello @guilherme-matte ,

 

Thanks for reaching out to Qlik Support.

We highly recommend going through the user guide below for more information on how a Log Stream task works and its limitations:

https://help.qlik.com/en-US/replicate/May2023/Content/Replicate/Main/Log%20Stream%20Staging/intro.ht...

Yes, you are right that it is efficient and low-impact: a Log Stream task reads the transaction logs of the source database, which minimizes the impact on the source system's performance. It doesn't require full table scans or triggers on tables, resulting in efficient data replication.
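To make the "capture once, replay many" idea concrete, here is a toy sketch of the log-stream staging pattern. This is purely illustrative (it is not Qlik Replicate's API or internals): one capture pass reads the source change log and stages it, and any number of replication tasks then consume the staged stream at independent offsets, so the source is only read once.

```python
# Conceptual sketch (NOT Qlik Replicate's actual implementation):
# a log-stream staging pattern — capture source changes once,
# retain them, and let multiple replication tasks replay them.

from collections import defaultdict

class LogStreamStaging:
    """Staging area: changes are captured once and retained for replay."""
    def __init__(self):
        self._staged = []                 # retained change records
        self._offsets = defaultdict(int)  # per-task read position

    def capture(self, source_log):
        """Single pass over the source change log — the only read of the source."""
        self._staged.extend(source_log)

    def consume(self, task_name, batch_size=100):
        """Each replication task reads independently from the staging area."""
        start = self._offsets[task_name]
        batch = self._staged[start:start + batch_size]
        self._offsets[task_name] += len(batch)
        return batch

staging = LogStreamStaging()
staging.capture([("INSERT", "orders", 1), ("UPDATE", "orders", 1)])

# Two targets, one source read — each task keeps its own offset:
print(staging.consume("to_postgres"))   # both changes
print(staging.consume("to_snowflake"))  # same changes, independent offset
```

A new task added later can still replay staged changes, which mirrors the benefit mentioned above of keeping CDC changes on the server for a configurable period.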

Below are some limitations, which you can find in the user guide:

  • The Bidirectional replication profile is not supported.
  • In the Target Metadata tab of the Task Settings dialog box, if you want to enable the Replicate LOB columns option, the Limit LOB size to value must be the same as the value specified in the Log Stream Staging task. Note however that if the Replicate LOB columns option is disabled for the Log Stream Staging task, it cannot be enabled in the Replication task.
  • In the Table Settings window, most tabs are available and functional with the exception of the LOB Column Handling tab which is not available (at table level).
  • The source_lookup Data Enrichment function is not supported. For more information on this function, see Data Enrichment functions.
  • The Source change position (e.g. SCN or LSN) Advanced Run option is only supported with the Oracle source endpoint.
  • When the Use all table partitions Parallel Load method is enabled, source data in a partition that was created after the Log Stream Staging task started, will not be replicated to the target defined for the Replication task.

 

Below are a few points which need to be considered:

It's important to evaluate your specific requirements, database environment, and available resources to determine if the Logstream task in Qlik Replicate is the right solution for your data replication needs.

Increased resource requirements: The Log Stream task captures changes in real time, which requires additional system resources, such as CPU and memory, to process and replicate the data. Consider the resource requirements and ensure the system has enough capacity to handle the workload.

Compatibility limitations: While Qlik Replicate supports various databases, certain versions or editions of databases might have compatibility limitations with the Log Stream task. It's important to ensure compatibility before implementing it.

We hope the above was helpful and remain at your disposal for any other questions, doubts or concerns you may have.

Thanks & Regards,

Arun


3 Replies
kng
Support

Hello Team,

 

The drawback we see with using Log Stream is increased complexity, because replication is split into two stages instead of going directly from source to target.

 

Regards,

Shivananda


nareshkumar
Contributor III

Thank you Arun for the details. We had a use case where the source is EDB Postgres with multiple targets. In Log Stream, the staging task defaults to transactional apply, whereas "batch optimized apply" is generally recommended for PG.

-> If I change the child task from batch optimized apply to transactional apply, I run into latency issues. If I keep batch optimized apply in the child task, I may encounter duplicate issues, since the staging task runs in transactional apply.

-> Do we have any performance techniques for a Log Stream task with referential integrity (FK & PK relations)? We noticed that some deletes are not happening.
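To illustrate the trade-off behind the batch vs. transactional apply question above, here is a toy sketch (again, not Qlik Replicate internals): transactional apply replays every change in commit order, while batch optimized apply collapses changes per primary key into one net operation per batch. The collapsing is what makes batch mode faster, and it is also why strict cross-table ordering (e.g. FK parent/child delete order) can be lost when changes are reordered per key.

```python
# Illustrative sketch (NOT Qlik Replicate internals) of two apply modes.

def transactional_apply(target, changes):
    """Replay each change one by one, preserving source commit order."""
    for op, pk, row in changes:
        if op == "DELETE":
            target.pop(pk, None)
        else:  # INSERT and UPDATE both upsert by primary key here
            target[pk] = row

def batch_optimized_apply(target, changes):
    """Collapse to the last (net) change per key, then apply once per key."""
    net = {}
    for op, pk, row in changes:
        net[pk] = (op, row)          # later changes to the same key win
    for pk, (op, row) in net.items():
        if op == "DELETE":
            target.pop(pk, None)
        else:
            target[pk] = row

changes = [
    ("INSERT", 1, {"qty": 5}),
    ("UPDATE", 1, {"qty": 7}),
    ("INSERT", 2, {"qty": 3}),
    ("DELETE", 2, None),
]

t1, t2 = {}, {}
transactional_apply(t1, changes)
batch_optimized_apply(t2, changes)
assert t1 == t2 == {1: {"qty": 7}}   # same end state on a single table
```

On a single table with a primary key the end state matches, but batch mode issues far fewer statements per key; the per-key reordering is what can interact badly with FK relations and with streams that carry duplicates, which matches the latency/duplicates trade-off described in the question.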