tjoshua52
Contributor

LogStream to SQL Server

I have a Log Stream task set up that pulls from SQL Server. It is currently 197 hours behind. I have checked the memory and disk on the Qlik server and see no issues: 50% of RAM is free and there is no disk queuing. Is there any tuning that needs to be done on the Qlik side?

It has been processing 124,076,911 commands for over a week.

 

Thanks,

Josh

2 Replies
David_Fergen
Former Employee

Hi tjoshua52,

 

We would be happy to look into your issue! Could you let us know the target? Also, can you set the SOURCE_CAPTURE, TARGET_APPLY, and PERFORMANCE logging components to TRACE for about 10 minutes? You will need to look for "one-by-one" entries in the TARGET_APPLY log and make sure they match. In addition, are you in Transactional Apply or Batch Optimized Apply mode?
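Once the trace window has been captured, the log can be scanned for "one-by-one" apply events without reading it by hand. A minimal sketch in Python; the log path in the usage note is an assumption, so point it at your own task's log file:

```python
def count_one_by_one(log_path):
    """Return the number of log lines mentioning one-by-one apply mode.

    A frequent fall-back to one-by-one application (instead of bulk
    apply) is a common cause of target-side latency.
    """
    count = 0
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            # Case-insensitive match, since casing varies across log lines.
            if "one-by-one" in line.lower():
                count += 1
    return count
```

Usage would be something like `count_one_by_one("reptask_MyLogStream.log")` (a hypothetical file name); a large count during the 10-minute trace window points at the apply side rather than the capture side.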

Let us know once you have tried this and gathered the information for us!

Thanks,

David

Bill_Steinagle
Support

Josh,

Hi, and thank you for posting to the QDI Forums. When a task gets that far behind, it will most likely never catch up, and the quickest way to sync the source and target is a full reload of the tables. That means the Replicate tasks reading from this Log Stream perform a Full Load, and the Log Stream task then starts processing from the end of that load: use the "Start from now" advanced run option on the Log Stream task, then do the same on the Replicate task and run a Full Reload to sync the source and target. Keep in mind that if the TLOG and the backups are no longer available after the latency period on the task, you would have missing data, so the reload is the best option.
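A quick back-of-the-envelope check supports the "never catch up" point: a task only closes its backlog when its apply rate exceeds the source's incoming change rate. A sketch of that arithmetic, using the totals reported above; the incoming-rate figures in the test values are illustrative assumptions, not measurements:

```python
def catchup_hours(backlog_hours, apply_rate, incoming_rate):
    """Estimated hours to clear the backlog, or None if it can never clear.

    backlog_hours: current latency, measured in hours of source changes.
    apply_rate / incoming_rate: commands per second applied vs. generated.
    """
    if apply_rate <= incoming_rate:
        return None  # the backlog grows forever; only a reload will sync
    # Each hour of wall time clears (apply_rate - incoming_rate) worth of
    # backlog relative to the rate at which new changes arrive.
    return backlog_hours * incoming_rate / (apply_rate - incoming_rate)

# ~124,076,911 commands over roughly 7 days gives the observed apply rate.
apply_rate = 124_076_911 / (7 * 86400)
print(f"observed apply rate: {apply_rate:.0f} commands/sec")
```

If the observed apply rate is at or below the rate new changes arrive, the 197-hour gap cannot close on its own, which is why the full reload with "Start from now" is the practical fix.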

Thanks!

Bill