<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Queries on Qlik Replicate Audit table and task log errors</title>
    <link>https://community.qlik.com/t5/Qlik-Replicate/Queries-on-Qlik-Replicate-Audit-table-and-task-log-errors/m-p/2489101#M13056</link>
    <description>&lt;DIV&gt;Hi, I have a Qlik Replicate unidirectional CDC task loading data from a SQL Server source to a Snowflake on Azure target in batch processing mode. I have enabled the Store Changes option and am capturing the results in an audit table. Could you please help me with the questions below?&lt;/DIV&gt;
&lt;DIV&gt;1. Currently, the audit table to which Qlik Replicate pushes the changes has the columns below:&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "task_name"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "stream_position"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "change_seq"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "change_oper"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "schema_name"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "table_name"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "operation"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "transaction_id"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "timestamp"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "change_record"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "bu_change_record"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;I want to load only particular fields into the target audit table. Can we exclude unnecessary columns such as "change_record" and "bu_change_record"?&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;2. Let's say I am running the task with a latency of 5 minutes, and I need the change processing statistics for all the tables in the task for each run. Can we push those statistics to the audit table whenever latency drops to zero, i.e. when the batch has been applied to the target?&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;3. Qlik Replicate sometimes throws the following errors while running the task. What do they actually mean? Please also advise on the resolution and a problem-handling approach for when we face similar issues in the future.&lt;/DIV&gt;
&lt;DIV&gt;a) Transaction aborted when accessing versioned row in table 'cdc.lsn_time_mapping' in database 'PROD'. Requested versioned row was not found because the readable secondary access is not allowed for the operation that attempted to create the version. This might be timing related, so try the query again later. Is it a recoverable error?&lt;/DIV&gt;
&lt;DIV&gt;b) Failed to send table 'dbo.Employee' (10) events to changes table&lt;/DIV&gt;
&lt;DIV&gt;c) Error executing data handler The Transaction sorter. Cannot forward transaction&lt;/DIV&gt;
&lt;DIV&gt;Cannot move transaction to file D:\Program Files\Attunity\Replicate\data\tasks\PROD_LOAD/sorter/ars_swap_tr_00000000000000015853.tswp&amp;nbsp;&lt;/DIV&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;4. I have a Qlik Replicate unidirectional CDC task running in batch processing mode from a SQL Server source to a Snowflake on Azure target UAT database. Now I want to create a new task whose target is the Snowflake on Azure PROD database, but I want to resume the new task from where the old task was stopped.&lt;BR /&gt;I am aware of the advanced run options; could you please advise how to proceed so that we do not miss any transactions? Also, if the date and time option is to be used, which time should we use: Qlik server time or local time?&lt;/P&gt;</description>
    <pubDate>Fri, 25 Oct 2024 10:36:54 GMT</pubDate>
    <dc:creator>Dileep_41</dc:creator>
    <dc:date>2024-10-25T10:36:54Z</dc:date>
    <item>
      <title>Queries on Qlik Replicate Audit table and task log errors</title>
      <link>https://community.qlik.com/t5/Qlik-Replicate/Queries-on-Qlik-Replicate-Audit-table-and-task-log-errors/m-p/2489101#M13056</link>
      <description>&lt;DIV&gt;Hi, I have a Qlik Replicate unidirectional CDC task loading data from a SQL Server source to a Snowflake on Azure target in batch processing mode. I have enabled the Store Changes option and am capturing the results in an audit table. Could you please help me with the questions below?&lt;/DIV&gt;
&lt;DIV&gt;1. Currently, the audit table to which Qlik Replicate pushes the changes has the columns below:&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "task_name"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "stream_position"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "change_seq"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "change_oper"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "schema_name"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "table_name"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "operation"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "transaction_id"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "timestamp"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "change_record"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;&lt;SPAN&gt; "bu_change_record"&lt;/SPAN&gt;&lt;/DIV&gt;
&lt;DIV&gt;I want to load only particular fields into the target audit table. Can we exclude unnecessary columns such as "change_record" and "bu_change_record"?&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;2. Let's say I am running the task with a latency of 5 minutes, and I need the change processing statistics for all the tables in the task for each run. Can we push those statistics to the audit table whenever latency drops to zero, i.e. when the batch has been applied to the target?&lt;/DIV&gt;
&lt;DIV&gt;&amp;nbsp;&lt;/DIV&gt;
&lt;DIV&gt;3. Qlik Replicate sometimes throws the following errors while running the task. What do they actually mean? Please also advise on the resolution and a problem-handling approach for when we face similar issues in the future.&lt;/DIV&gt;
&lt;DIV&gt;a) Transaction aborted when accessing versioned row in table 'cdc.lsn_time_mapping' in database 'PROD'. Requested versioned row was not found because the readable secondary access is not allowed for the operation that attempted to create the version. This might be timing related, so try the query again later. Is it a recoverable error?&lt;/DIV&gt;
&lt;DIV&gt;b) Failed to send table 'dbo.Employee' (10) events to changes table&lt;/DIV&gt;
&lt;DIV&gt;c) Error executing data handler The Transaction sorter. Cannot forward transaction&lt;/DIV&gt;
&lt;DIV&gt;Cannot move transaction to file D:\Program Files\Attunity\Replicate\data\tasks\PROD_LOAD/sorter/ars_swap_tr_00000000000000015853.tswp&amp;nbsp;&lt;/DIV&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;4. I have a Qlik Replicate unidirectional CDC task running in batch processing mode from a SQL Server source to a Snowflake on Azure target UAT database. Now I want to create a new task whose target is the Snowflake on Azure PROD database, but I want to resume the new task from where the old task was stopped.&lt;BR /&gt;I am aware of the advanced run options; could you please advise how to proceed so that we do not miss any transactions? Also, if the date and time option is to be used, which time should we use: Qlik server time or local time?&lt;/P&gt;</description>
      <pubDate>Fri, 25 Oct 2024 10:36:54 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Qlik-Replicate/Queries-on-Qlik-Replicate-Audit-table-and-task-log-errors/m-p/2489101#M13056</guid>
      <dc:creator>Dileep_41</dc:creator>
      <dc:date>2024-10-25T10:36:54Z</dc:date>
    </item>
    <item>
      <title>Re: Queries on Qlik Replicate Audit table and task log errors</title>
      <link>https://community.qlik.com/t5/Qlik-Replicate/Queries-on-Qlik-Replicate-Audit-table-and-task-log-errors/m-p/2489468#M13068</link>
      <description>&lt;P&gt;Hello&amp;nbsp;&lt;a href="https://community.qlik.com/t5/user/viewprofilepage/user-id/220728"&gt;@Dileep_41&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;
&lt;P&gt;1. Yes, you can delete the unnecessary columns from the target audit table; just make sure the audit table is not recreated in the task settings, e.g.:&lt;/P&gt;
&lt;P&gt;&lt;span class="lia-inline-image-display-wrapper lia-image-align-inline" image-alt="john_wang_0-1730127189594.png" style="width: 400px;"&gt;&lt;img src="https://community.qlik.com/t5/image/serverpage/image-id/173474iD80B062FDAE0D94A/image-size/medium?v=v2&amp;amp;px=400" role="button" title="john_wang_0-1730127189594.png" alt="john_wang_0-1730127189594.png" /&gt;&lt;/span&gt;&lt;/P&gt;
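If helpful, a minimal sketch of what step 1 might look like on the Snowflake side, assuming the default audit table name attrep_audit_table and a placeholder schema MY_SCHEMA (verify the actual names in your environment before running anything):

```sql
-- Sketch only: drop the unneeded columns from the target audit table.
-- MY_SCHEMA is a placeholder; attrep_audit_table is the default audit table name.
ALTER TABLE MY_SCHEMA."attrep_audit_table"
  DROP COLUMN "change_record", "bu_change_record";
```

Remember to leave the audit table creation option unchecked afterwards, so Replicate does not recreate the dropped columns.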
&lt;P&gt;2. Sorry, I did not fully understand the question.&lt;/P&gt;
&lt;P&gt;3. It seems the data folder storage is full. Please open a support ticket and provide the task Diagnostics Packages.&lt;/P&gt;
&lt;P&gt;4. Please check my comments in article: &lt;A title="Switching from Direct Replication Path to Logstream Without Reloading Data" href="https://community.qlik.com/t5/Qlik-Replicate/Switching-from-Direct-Replication-Path-to-Logstream-Without/m-p/2484596#M12862" target="_blank" rel="noopener"&gt;&lt;SPAN&gt;Switching from Direct Replication Path to Logstream Without Reloading Data&lt;/SPAN&gt;&lt;/A&gt;.&lt;/P&gt;
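On question 2: as far as I know, Replicate does not push per-run statistics into the audit table itself, but since every audit row carries the table name and a timestamp, similar per-interval counts can be derived from the audit table with a query. A hedged sketch (placeholder table name; it assumes the "timestamp" column is a TIMESTAMP type):

```sql
-- Sketch: per-table change counts in 5-minute windows, derived from the audit table.
-- MY_SCHEMA."attrep_audit_table" is a placeholder/default name; adjust to your target.
SELECT "schema_name",
       "table_name",
       "change_oper",
       TIME_SLICE("timestamp", 5, 'MINUTE') AS batch_window,
       COUNT(*) AS change_count
FROM MY_SCHEMA."attrep_audit_table"
GROUP BY 1, 2, 3, 4
ORDER BY batch_window, "table_name";
```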
&lt;P&gt;BTW, please open dedicated threads if possible rather than mixing different issues in a single one.&lt;/P&gt;
&lt;P&gt;Hope this helps.&lt;/P&gt;
&lt;P&gt;John.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 28 Oct 2024 15:17:14 GMT</pubDate>
      <guid>https://community.qlik.com/t5/Qlik-Replicate/Queries-on-Qlik-Replicate-Audit-table-and-task-log-errors/m-p/2489468#M13068</guid>
      <dc:creator>john_wang</dc:creator>
      <dc:date>2024-10-28T15:17:14Z</dc:date>
    </item>
  </channel>
</rss>

