Gaurav2
Contributor II

Qlik Replicate: Deletion of table from source

We have a scenario where we have to drop a few tables from the source and add them back with the same name, but the column names and datatypes can be different.

We are using LSS.
In the task settings, "When source table is dropped:" is set to ignore the drop.

So this is what we decided to do:

1) Drop the table from source manually
2) Drop the table and its store change table from target manually

After this, the replication task hung; once we stopped and resumed it, it worked fine.

3) Add the table in source

After adding the table, we saw the issue was in the LSS task itself; we restarted the task from the SCN where it stopped, and it worked fine.

What is the best process to drop and re-create a table with Qlik Replicate?

3 Replies
Heinvandenheuvel
Specialist II

Hmm, as a first impression you should simply set "When source table is dropped:" to 'drop'; that way Replicate knows what to do.

>>> We are using LSS.

What is LSS? Log Stream? If so, there are always at least two tasks: a parent and one or more children, if you like.

>>> "When source table is dropped:", ignored the drop.

In both tasks?

>>> After this, replication task was hung and once we stopped and resumed

Which task source reader or logstream reader? Both?

There is 'hung', and there is 'appeared to hang but was actively trying something and failing over and over'.

Were there any messages in the log, perhaps after temporarily increasing logging for 'everything' to verbose?

Were the 'task is running' messages (from the sorter) still appearing, or the 'performance trace' messages (highly recommended!)?

As always please provide source DB (type and version), Target DB (type, version less important), Replicate version.

 

hth,

Hein

 

 

Gaurav2
Contributor II
Author

Hi Hein,

Yes, we are using Log Stream (LSS).

The "When source table is dropped: ignore the drop" setting is only in the replication task. There is no such setting in the Log Stream task.

 

>>> After this, replication task was hung and once we stopped and resumed --> Replication task

Source and target are Oracle.

 

I am just looking for the best practice to drop the table from the source and target, so a new table (with the same name but different columns) can be created in the source and Qlik will replicate it to the target.

Dimitri_Remez
Support

Hi Gaurav,

You faced these issues with your tasks because they were missing the table metadata after you dropped and re-created the tables.

This problem occurs when a DDL change is performed, or when a table is dropped and then recreated, while the task is down/stopped.

Basically, when you drop a table, change its structure, and recreate it, it is a new table even if the name is the same, and the new metadata has to be picked up by performing a Reload or by resuming from the SCN.

You did the right thing resuming the task from the last stopped stream position. Below are the instructions for the users who may be experiencing the same problem:

Usually, you'll see the below error message in your logs:

----------------
[SOURCE_CAPTURE ]E: Unable to get table definition for table '0'
----------------

The simplest way to resume the task is by starting it from a specific SCN or timestamp; this causes Replicate to re-read the metadata for all the tables.

You need to look for the following message in the logs, where you'll find the last stopped stream position:
----------------
[SORTER ]I: Final saved task state. Stream position 00000xxx.00000xxx.00000000.0000.00.0000:00000.000000.00, Source id 000000000000, next Target id 000000000000, confirmed Target id 000000000000, last source timestamp 000000000000000000
----------------

The last stopped stream position is: 00000xxx.00000xxx

You can start the task from the last stopped stream position via Advanced run options -> "Tables are already loaded. Start processing changes from:" -> "Source change position (e.g. SCN or LSN)".

Below are the steps to start the task from the stream position after checking the SCN number:

1) First check the last stopped stream position (00000xxx.00000xxx).

2) Then remove the middle dot and convert the resulting two-block hex value to decimal.

3) The converted decimal number is the corresponding Oracle SCN.

4) You can cross-verify the SCN by converting it into a timestamp with the command below:

select scn_to_timestamp(yourSCN) as timestamp from dual;

5) Then you can resume your task from that particular SCN, or from the converted timestamp, in the Advanced run options.
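As a sketch, steps 1-3 above can be automated. The helper name and the example stream position value are made up for illustration; they are not part of Replicate itself.

```python
def stream_position_to_scn(stream_position: str) -> int:
    """Convert a Replicate Oracle stream position to a decimal SCN.

    Takes the first two dot-separated hex blocks, joins them
    (i.e. removes the middle dot), and parses the result as hex.
    """
    first_two_blocks = stream_position.split(".")[:2]
    return int("".join(first_two_blocks), 16)

# Example with a made-up stream position from a SORTER log line:
pos = "00000000.00abcdef.00000000.0000.00.0000:00000.000000.00"
print(stream_position_to_scn(pos))  # prints 11259375
```

The resulting decimal number is what you would paste into the "Source change position" field, or feed to Oracle's scn_to_timestamp() for cross-verification.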

Once the log stream task is running fine, you can resume all your child tasks with the same timestamp.

You can add the internal parameter traceDdlChanges to your source endpoint. This way Replicate can track DDL changes from the source while the task is running.

Kind regards,

Dimitri

If the issue is solved please mark the answer with Accept as Solution.