Hello Team,
I created a Qlik Replicate task (CDC_POC) to capture only the changes (inserts/updates/deletes) from the table WIP_TRANSACTION_ACCOUNTS and store them in the __ct change table, so the task has only the Store Changes option enabled.
It has already been 15 minutes since I started the task, but I don't see any changes being captured in the __ct table, and the latency keeps increasing.
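For reference, this is roughly how I am checking whether anything has landed in the change table on the target. It is only a sketch: it assumes the default __ct suffix and header__ column prefix, and a hypothetical target schema QLIK_TGT, so the names (and identifier case) may need adjusting to the actual task settings.

-- Count captured changes by operation type (I/U/D) in the Replicate change table
SELECT header__change_oper, COUNT(*) AS change_rows
FROM   QLIK_TGT.WIP_TRANSACTION_ACCOUNTS__ct
GROUP  BY header__change_oper;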
Both the source and target endpoints are Oracle.
However, the source endpoint is a standby DR database with "Use archived redo logs only" enabled. Could this be the reason the changes are not being captured? Do we have to use the live (primary) database for CDC with the Store Changes option?
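In case it helps, this is how I am checking on the standby whether archived redo logs are arriving at all. Again just a sketch, assuming the Replicate user has SELECT access to V$ARCHIVED_LOG:

-- Most recent archived log per redo thread and how long ago it was completed
SELECT thread#,
       MAX(sequence#) AS last_seq,
       ROUND((SYSDATE - MAX(completion_time)) * 24 * 60) AS minutes_since_last_archive
FROM   v$archived_log
GROUP  BY thread#;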
I have attached the task log and a screenshot of the task.
Could anyone help me on this? @Dana_Baldwin @john_wang
Hello @harikesh_1991,
Thank you for opening the post.
We do see that Qlik Replicate has started processing archived redo log files, for example:
00289322: 2026-01-28T09:34:20 [SOURCE_CAPTURE ]I: Start processing archived Redo log sequence 16425 thread 1 name +RECOC2/WWOPRDCPP1/ARCHIVELOG/2026_01_28/thread_1_seq_16425.8126.1223717649 (oradcdc_redo.c:1034)
However, since the task logging level is not set to Verbose, it is difficult to determine additional details from the uploaded task log. To proceed, we recommend the following:
Open a support ticket and include the relevant background information.
Set SOURCE_CAPTURE and TARGET_APPLY logging levels to Verbose, rerun the task for approximately 30 minutes, and then collect the Task Diagnostics Package.
Collect the complete task log file, decrypt it manually, and upload the decrypted task log as well.
Please confirm that the source tables received committed DML operations during this period; a quick way to generate and surface a test change is sketched right after this list.
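As a minimal sketch of such a test, assuming a harmless test update is acceptable in your environment, that you run it on the primary database (the standby is read-only), and that you have the ALTER SYSTEM privilege. The column name below is purely illustrative:

-- Generate a committed change on the PRIMARY database
UPDATE wip_transaction_accounts
SET    last_update_date = SYSDATE   -- hypothetical column; use any updatable column
WHERE  ROWNUM = 1;
COMMIT;

-- With "Use archived redo logs only", the change only becomes visible to Replicate
-- after the current online redo log is archived and shipped to the standby
ALTER SYSTEM ARCHIVE LOG CURRENT;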
Finally, please avoid attaching task log files in the community forum in the future; the forum is publicly accessible and task logs may contain confidential information.
Best Regards,
John.
Hello @john_wang,
Sorry, I was a bit too quick to raise the query. The task did in fact work as expected.
Since the source endpoint is set to "Use archived redo logs only", there was a delay of about 30 minutes before the changes started being picked up.
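For anyone who hits the same behaviour: the lag roughly matched our log-switch cadence. A quick way to check that cadence (a sketch, assuming access to V$ARCHIVED_LOG) is:

-- Archived logs completed in the last hour; the gap between completion times is
-- roughly the worst-case capture latency in "archived redo logs only" mode
SELECT thread#, sequence#,
       TO_CHAR(completion_time, 'YYYY-MM-DD HH24:MI:SS') AS completed
FROM   v$archived_log
WHERE  completion_time > SYSDATE - 1/24
ORDER  BY completion_time DESC;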
Also noted; I will make sure not to attach task logs here in the future.
Regards,
Harikesh
Glad to hear it works for you! Thank you so much for your support, @harikesh_1991.