amat_attunity
Contributor

Table column length increased at source and the table is part of a CDC task

Hi All,

 

Recently, the length of a column in one of our tables was increased at the source from 60 to 200, and that table is part of a CDC task replicating data from an Oracle source to a Hadoop target. Since the task is CDC, will the column length change made at the source be automatically replicated to the Hadoop CT table?
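For context, the kind of source-side change being described would look something like this (the table and column names here are placeholders, not from the actual environment):

alter table my_table modify my_column varchar2(200); -- previously varchar2(60)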

5 Replies
SwathiPulagam
Support

Hi @amat_attunity ,

 

Below is the limitation mentioned in the user guide for using Hadoop as a Target endpoint:

Dropping columns and changing column data types or the data type length is not supported and will suspend the table in all cases except for data types that are mapped to STRING. Changes to the data type length of a data type mapped to STRING (e.g. VARCHAR(50) to VARCHAR(100)) will simply be ignored.
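To make that concrete: if the source column's data type is one that the Hadoop endpoint maps to Hive STRING, the target column has no declared length, so there is nothing to alter and replication simply continues. A hypothetical illustration (invented names, not from this thread):

alter table orders modify customer_name varchar2(100); -- source: was varchar2(50)
-- if the target column was created as STRING, there is no equivalent DDL to apply,
-- so Replicate ignores the change instead of suspending the table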

 

Below is the user guide link for your reference:

https://help.qlik.com/en-US/replicate/November2023/Content/Replicate/Main/Hadoop/limitations_hadoop....

 

Thanks,

Swathi

john_wang
Support

Hello @amat_attunity ,

In addition to @SwathiPulagam's comment: the DDL is ignored, with the below warning written to the task log file (confirmed with Qlik Replicate 2023.5.0.516 + Oracle 12c source + Cloudera 7.1.3 Hadoop target):

2024-01-19T22:13:34:619864 [TARGET_APPLY ]W: Alter table 'TESTCOL', modify column 'NOTES' is not supported (hadoop_apply.c:1167)

The source-side DDL (enlarging the CHAR column length from 60 to 200) looks like this:

alter table testcol modify notes char(200);

The __ct table's column length was not changed while the task was running.
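If you want to confirm this on the Hive side yourself, a quick check could look like the following (assuming the table name from the test above and the default __ct suffix; adjust the schema and names to your environment):

describe testcol;     -- target table: 'notes' still shows the original definition
describe testcol__ct; -- change table: likewise unchanged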

Hope this helps.

John.

Help users find answers! Do not forget to mark a solution that worked for you! If already marked, give it a thumbs up!
SushilKumar
Support

Hello Team,

Could you please test the following: stop the Qlik Replicate task, log in as the Attunity/Replicate user, run the DDL manually on the __CT table and the target table, and then resume the task. A sketch of the DDL is below.
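What that manual DDL might look like, assuming the target columns were created as Hive VARCHAR rather than STRING (if they are STRING there is no length to change), and using the table and column names from John's test above:

alter table testcol change notes notes varchar(200);
alter table testcol__ct change notes notes varchar(200);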

Please share your feedback on whether this helps.

Regards,

Sushil Kumar


amat_attunity
Contributor
Author

Hi Sushil,

We are not authorized to apply DDL manually on the Hadoop target, so we can't test this. Also, even if applying the DDL manually works, we have already noticed that a few records loaded with truncated data, so we will have to go with a Full Load.