Dileep_41
Contributor II

Queries on Qlik Replicate Audit table and task log errors

Hi, I have a Qlik Replicate unidirectional CDC task that loads data from a SQL Server source to a Snowflake on Azure target in batch processing mode. I have enabled the Store Changes option and am capturing the results in an audit table. Could you please help me with the questions below?
1. Currently the audit table that Qlik Replicate pushes changes into has the columns below:
"task_name"
"stream_position"
"change_seq"
"change_oper"
"schema_name"
"table_name"
"operation"
"transaction_id"
"timestamp"
"change_record"
"bu_change_record"
I want to load only particular fields into the target audit table. Can we exclude the unnecessary columns, such as the "change_record" and "bu_change_record" fields?
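If the answer is to drop the columns directly on the target, a minimal sketch of generating the Snowflake DDL follows. The schema path and the table name are assumptions (Replicate's default audit table is commonly named attrep_audit_table, but verify what your task actually created):

```python
# Sketch: generate Snowflake ALTER TABLE statements that drop unwanted
# columns from the Replicate audit table. Names are illustrative only.

UNWANTED = ["change_record", "bu_change_record"]

def drop_column_ddl(table: str, columns: list[str]) -> list[str]:
    """Return one ALTER TABLE ... DROP COLUMN statement per column."""
    return [f'ALTER TABLE {table} DROP COLUMN "{col}";' for col in columns]

# Hypothetical fully qualified name; replace with your database/schema.
for stmt in drop_column_ddl("PROD.REPLICATE.attrep_audit_table", UNWANTED):
    print(stmt)
```

Run the emitted statements in Snowflake yourself; whether Replicate tolerates the narrower table depends on the task settings discussed in the accepted answer below.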
 
2. Suppose I am running the task with a latency of 5 minutes, and I need the change processing statistics for all the tables in the task for each run. Can we push those statistics to the audit table whenever latency drops to zero, i.e. once the batch has been applied to the target?
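As a stopgap, per-table statistics can be derived from the audit table itself by grouping on the columns listed in question 1. A rough Python sketch (the sample rows and the exact encoding of change_oper are assumptions; check them against your audit data):

```python
# Sketch: summarize change counts per table from audit-table rows.
from collections import Counter

def change_stats(rows):
    """Count change_oper values per (task_name, schema_name, table_name).

    `rows` are dicts shaped like the audit-table columns from the post.
    """
    stats = {}
    for r in rows:
        key = (r["task_name"], r["schema_name"], r["table_name"])
        stats.setdefault(key, Counter())[r["change_oper"]] += 1
    return stats

# Illustrative rows only; real values come from querying the audit table.
rows = [
    {"task_name": "PROD_LOAD", "schema_name": "dbo",
     "table_name": "Employee", "change_oper": "INSERT"},
    {"task_name": "PROD_LOAD", "schema_name": "dbo",
     "table_name": "Employee", "change_oper": "UPDATE"},
    {"task_name": "PROD_LOAD", "schema_name": "dbo",
     "table_name": "Employee", "change_oper": "UPDATE"},
]
print(change_stats(rows))
```

The same grouping can be done as a SQL aggregate in Snowflake if you prefer to keep the statistics in the database.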
 
3. Qlik Replicate sometimes throws the following errors while the task is running. What do they actually mean, and what is the recommended resolution and handling approach when we face similar issues in the future?
a) Transaction aborted when accessing versioned row in table 'cdc.lsn_time_mapping' in database 'PROD'. Requested versioned row was not found because the readable secondary access is not allowed for the operation that attempted to create the version. This might be timing related, so try the query again later. Is this a recoverable error?
b) Failed to send table 'dbo.Employee' (10) events to changes table
c) Error executing data handler. The Transaction sorter. Cannot forward transaction. Cannot move transaction to file D:\Program Files\Attunity\Replicate\data\tasks\PROD_LOAD/sorter/ars_swap_tr_00000000000000015853.tswp

 

4. I have a Qlik Replicate unidirectional CDC task running in batch processing mode from a SQL Server source to a Snowflake on Azure target (UAT database). Now I want to create a new task with a new target, the Snowflake on Azure PROD database, but resume the new task from where the old task was stopped.
I am aware of the advanced run options. Could you please advise how to proceed so that we do not miss any transactions? Also, if the date-and-time option is to be used, which time should we enter: Qlik server time or local time?
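One way to pick the "start processing changes from" value for the new task is to take the last timestamp the old task applied (visible in the audit table) and rewind by a small overlap so in-flight transactions are re-captured. This is a sketch under assumptions, not Qlik guidance: re-captured rows may produce duplicates depending on your apply-conflict settings, and whether Replicate interprets the value as server time or local time should be confirmed for your installation:

```python
# Sketch: derive a safe resume timestamp from audit-table `timestamp`
# values applied by the old task. Overlap size is an assumption.
from datetime import datetime, timedelta

def resume_timestamp(applied_timestamps, overlap_minutes=5):
    """Return the latest applied timestamp minus a safety overlap."""
    latest = max(applied_timestamps)
    return latest - timedelta(minutes=overlap_minutes)

# Illustrative values only.
ts = [datetime(2024, 10, 28, 10, 0), datetime(2024, 10, 28, 10, 4)]
print(resume_timestamp(ts))
```

The resulting value would then be entered manually in Advanced Run Options; verify the behavior on a test task before touching PROD.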


Accepted Solutions
john_wang
Support

Hello @Dileep_41 ,

1. Yes, you can delete the unnecessary columns from the target audit table; just make sure the audit table is not set to be recreated in the task settings, e.g.:

[screenshot: john_wang_0-1730127189594.png]

2. Sorry, I did not understand this question well.

3. It seems the data folder storage is full. Please open a support ticket and provide the task Diagnostics Package.

4. Please see my comments in the article Switching from Direct Replication Path to Logstream Without Reloading Data.

BTW, please open a dedicated thread for each issue where possible, rather than mixing different issues in a single post.

Hope this helps.

John.

 

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
