Yves_s
Contributor III

The task does not reload the table

Hello,

We created multiple tasks behind a Log Stream staging task.

All of these tasks use the same endpoints upstream and downstream.

Most of them work fine, but 2 do not load the tables:

00006232: 2021-12-06T07:38:51 [TABLES_MANAGER ]I: The 'Kafka' target endpoint does not support segmented full load. The table 'IFCX'.'PAKCGE' will be loaded normally (tasktablesmanager.c:1137)

Before, we had tasks without the Log Stream staging task and they worked fine.

I don't understand why these tables don't load.
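For what it's worth, this informational message is easy to scan for programmatically. Here is a minimal sketch that pulls the affected endpoint and table names out of a task log; the log format is assumed from the sample line above, and the function name is my own:

```python
import re

# Matches the TABLES_MANAGER informational line shown above and captures
# the endpoint, schema, and table names (format assumed from the sample).
PATTERN = re.compile(
    r"\[TABLES_MANAGER\s*\]I: The '(?P<endpoint>[^']+)' target endpoint "
    r"does not support segmented full load\. The table "
    r"'(?P<schema>[^']+)'\.'(?P<table>[^']+)' will be loaded normally"
)

def find_unsegmented_tables(log_text):
    """Return (endpoint, schema, table) tuples for every match in the log."""
    return [(m.group("endpoint"), m.group("schema"), m.group("table"))
            for m in PATTERN.finditer(log_text)]

sample = ("00006232: 2021-12-06T07:38:51 [TABLES_MANAGER ]I: The 'Kafka' "
          "target endpoint does not support segmented full load. The table "
          "'IFCX'.'PAKCGE' will be loaded normally (tasktablesmanager.c:1137)")
print(find_unsegmented_tables(sample))  # → [('Kafka', 'IFCX', 'PAKCGE')]
```

Note the message is informational (`]I:`), not an error: the table should still load, just without segmentation.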

 

 

Labels (2)
1 Solution

Accepted Solutions
Yves_s
Contributor III
Author

OK, I checked with the DBA: these tables are an exception, so this is normal.


3 Replies
john_wang
Support

Hello @Yves_s ,

From line #6 of the task log file:

2021-12-06T07:33:21 [TASK_MANAGER    ]I:  Task 'ZH_PAKCGE' running full load and CDC in resume mode

the run mode is RESUME. If the Full Load completed in a previous run, Replicate continues the CDC stage from the previous stream position instead of reloading. You can re-start the full load via Task --> Run --> Reload Target. See the Run Options documentation for more details.

Regarding the message: it means Kafka does not support Segmented Parallel Load. The supported source and target endpoints are listed in the documentation. DB2z is a supported source endpoint, but Kafka is not a supported target endpoint, which is why you got that informational message.

Feel free to let me know if you need any additional information.

Regards,

John.

 

Yves_s
Contributor III
Author

My bad, I gave you the wrong log. I am launching a reload, but nothing is loaded.

I do the same for 16 other tasks with the same Log Stream staging task and the same Kafka endpoint, and those work.

 

 
