Dear Community,
Can someone please explain why this error occurs:
- Failed to publish schema message for table PRICETABLE_OLD.
- Failed to publish new schema to subject 'XXXX.PRICETABLE_OLD-value'.
- Confluent server returned error code 40301: 'User is denied operation Write on Subject: XXXX.PRICETABLE_OLD-value'
Oracle DB is defined as the source and Kafka as the target. The task contains 217 tables.
The table PRICETABLE_OLD is not in scope at all, only PRICETABLE is.
If the table was renamed, would Qlik Replicate then be unable to replicate it?
Many thanks and best Regards,
Helene
Hi,
It seems that you are facing a permission problem. Please open a case and provide the task diagnostic package, including the log with the error, so we can get more details about the cause of the problem.
Thanks & regards,
Orit
Many thanks for the quick feedback! Unfortunately, the log files are no longer available. This error produced so many logs that the disk filled up and Qlik aborted all tasks. The service provider then deleted all logs and, unfortunately, also the transaction file, so we had to reload all tables. We want to prevent this from happening in the future, so my question is: what situation could lead to this error?
The table PRICETABLE is defined in the task. PRICETABLE_OLD was mentioned in the error message.
The following probably happened in the source database:
1. Table PRICETABLE was renamed to PRICETABLE_OLD
2. A new table PRICETABLE was created
3. Table PRICETABLE_OLD was dropped
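For clarity, the suspected source-side sequence written out as Oracle DDL (a sketch only; the real CREATE statement and column list are not known from the thread, so a placeholder column is used). Notably, in Oracle an `ALTER TABLE ... RENAME TO` keeps the table's internal OBJECT_ID, which may be why Replicate still associated PRICETABLE_OLD with the task:

```shell
# Sketch of the suspected source-side DDL (Oracle syntax). Table names
# come from the thread; the CREATE column list is a placeholder. The
# statements are held in a variable and printed rather than executed.
DDL='ALTER TABLE PRICETABLE RENAME TO PRICETABLE_OLD;   -- OBJECT_ID is kept
CREATE TABLE PRICETABLE (ID NUMBER /* placeholder */);  -- new table, new OBJECT_ID
DROP TABLE PRICETABLE_OLD;'
printf '%s\n' "$DDL"
```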
How can we prevent Qlik Replicate from aborting replication in such cases? Which task configuration can prevent such errors?
Many thanks and best regards,
Helene
Hi Helen,
Thanks for the additional information. From your description I suspect that renaming the table PRICETABLE to PRICETABLE_OLD did not change its internal object id; this may be the cause of the access-denied error, and Replicate may have entered a loop of recovery retries.
I also suspect this may fall under the limitation listed in the Replicate documentation:
"The Drop and Create table Target Table Preparation option is not supported"
(https://help.qlik.com/en-US/replicate/May2022/Content/Replicate/Main/Kafka/limitations_kafka.htm#Lim...)
Of course, reloading the task started everything from scratch and solved the problem. If you want, you can try this scenario in a test environment, or, if you wish us to test it, please open a case so we can allocate time, try to recreate the error you were facing, and see whether this is indeed the cause.
Thanks & regards,
Orit
Hello @HeleneExner, cc @OritA,
Agree with Orit. Another possibility is that the schema versions conflict under the same subject. You may remove the subject; a sample command (if run from the Kafka server):
$ curl -X DELETE http://localhost:8081/subjects/PRICETABLE_OLD-value
where "localhost" is the host name of the Schema Registry and "8081" is the Schema Registry port number.
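If it helps, here is a sketch for inspecting the subject before deleting anything (it assumes the Confluent Schema Registry is reachable on localhost:8081, which may differ in your environment; the REST calls are shown as comments so nothing is removed by accident):

```shell
# Assumes the Confluent Schema Registry listens on localhost:8081;
# adjust REGISTRY and SUBJECT to your environment before running.
REGISTRY="http://localhost:8081"
SUBJECT="PRICETABLE_OLD-value"

# List all registered subjects:   curl -s "$REGISTRY/subjects"
# List this subject's versions:   curl -s "$REGISTRY/subjects/$SUBJECT/versions"
# Soft-delete the subject:        curl -s -X DELETE "$REGISTRY/subjects/$SUBJECT"
# Hard-delete (permanent; only allowed after a soft delete):
#                                 curl -s -X DELETE "$REGISTRY/subjects/$SUBJECT?permanent=true"

echo "target: $REGISTRY/subjects/$SUBJECT"
```

Checking the versions first shows whether an older, incompatible schema is still registered under the subject before you decide to delete it.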
Regards,
John.
Many thanks for the quick reply!