Hello,
We are replicating from Db2 z/OS (v12) to Db2 LUW v11.5.8 on RHEL 8.
One of our tables has two unique indexes.
As soon as the table moves into CDC mode after the initial load, we start getting SQL0803N errors.
The table is highly active on the source side and receives constant DML operations.
We are using UPSERT and Batch Optimized apply mode.
To get around the error, we have recreated the table with the same indexes defined as non-unique.
We suspect that the columns in the unique indexes are being updated, which is leading to the issue.
We would like to know if there is a way to fix the error without removing the uniqueness, or whether this is a limitation of the ODBC target.
Error detail below:
SQL0803N One or more values in the INSERT statement, UPDATE statement, or
foreign key update caused by a DELETE statement are not valid because the
primary key, unique constraint or unique index identified by "1" constrains
table "TB1" from having duplicate values for the index key.
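The suspected failure mode above (updates to unique-index columns applied row by row) can be sketched with a small, self-contained example. This is an illustration only: SQLite stands in for Db2, and the table and column names (`tb1`, `id`, `code`) are made up, but the duplicate-key mechanics are the same. When a source transaction swaps unique values between two rows, applying the first UPDATE in isolation creates a transient duplicate and fails:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tb1 (id INTEGER, code TEXT)")
con.execute("CREATE UNIQUE INDEX ux1 ON tb1(id)")    # stands in for unique index 1
con.execute("CREATE UNIQUE INDEX ux2 ON tb1(code)")  # stands in for unique index 2
con.executemany("INSERT INTO tb1 VALUES (?, ?)", [(1, "A"), (2, "B")])

# On the source, one transaction swapped the codes: row 1 -> 'B', row 2 -> 'A'.
# Applied one statement at a time, the first UPDATE collides with row 2's
# still-unchanged value, just as SQL0803N reports on the Db2 target:
try:
    con.execute("UPDATE tb1 SET code = 'B' WHERE id = 1")
except sqlite3.IntegrityError as exc:
    print("duplicate key:", exc)  # UNIQUE constraint failed on tb1.code
```

The source database never saw a duplicate because it enforced uniqueness at the end of the whole transaction, which is why the error appears only on the apply side.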
Hello @Div ,
Welcome to Qlik Community forum and thanks for reaching out here!
First of all, this is a data violation error rather than a limitation of the ODBC target. However, I'm confused about the uniqueness of the tables on the source and target sides. Could you please elaborate on the table's PK and/or indexes on both sides?
BTW, if the source table has no PK but has more than one unique index, Qlik Replicate will use the first one (by index name, in alphabetical order). I'm guessing one of the unique indexes is hitting duplicate rows, for example on an INSERT operation. I'm assuming UPSERT mode works against one of the unique indexes, but you are facing a data violation on the other one.
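To make the point above concrete, here is a minimal sketch of how an UPSERT keyed on one unique index can still violate a second unique index. Again this is an assumption-laden illustration: SQLite replaces Db2, and the names (`tb1`, `k1`, `k2`) are invented, but it mirrors an apply that resolves conflicts only on the first index:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tb1 (k1 TEXT, k2 TEXT)")
con.execute("CREATE UNIQUE INDEX ux1 ON tb1(k1)")  # the index the UPSERT keys on
con.execute("CREATE UNIQUE INDEX ux2 ON tb1(k2)")  # the second unique index
con.executemany("INSERT INTO tb1 VALUES (?, ?)", [("a1", "b1"), ("a2", "b2")])

# The conflict target is k1 only. The incoming row matches k1='a1', so the
# UPSERT takes the UPDATE path, but the new k2 value duplicates row ('a2','b2')
# and the second unique index rejects it:
try:
    con.execute(
        "INSERT INTO tb1 VALUES ('a1', 'b2') "
        "ON CONFLICT(k1) DO UPDATE SET k2 = excluded.k2"
    )
except sqlite3.IntegrityError as exc:
    print("second index violated:", exc)  # UNIQUE constraint failed on tb1.k2
```

This matches the pattern where one unique index is satisfied by the UPSERT logic while the other one raises the data violation.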
Also, please note that the ODBC endpoint requires Professional Services engagement per the Qlik Replicate support policy and User Guide.
Hope this helps.
John.
Hi John, thanks for the warm welcome.
Apologies for not being clear.
The source table has two unique indexes (no PK) and we intend to keep them on the target as well.
We are seeing errors reported for both constraints. The INSERT statements are reporting problems with constraint (index) 2, while the UPDATE statements report issues with constraint (index) 1.
Attaching the error log for your reference.
Hello @Div ,
I deleted the attached task log file as it may contain sensitive information. Please obfuscate sensitive information as needed in the future, or open a support ticket and attach the Diag Packages and task log files there.
Please allow me a few minutes; I will get back to you shortly.
Regards,
John.
Hello @Div ,
You are right, the error reported on both unique indexes. We need additional information to understand the issue:
1- Please provide the source table creation DDL, including the two unique index definitions
2- A sample DML statement (an UPDATE or INSERT) that reproduces the error
3- Set source_capture/target_apply logging to Verbose, recreate the error, and download the Diag Packages. If the task log file exceeds 10240 KB, please compress it before attaching.
4- Decrypt the task log file and attach it as well
Our support team will help you with the issue. BTW, if you can recreate the error with a smaller table containing fewer rows, the task log file will be smaller and easier for us to analyze.
Thanks,
John.
@john_wang - Just to be clear, you want us to open a support case and provide the above information in the case, not in this forum, right?
Hello @Div ,
You are right. The task Diag Packages and task log files may contain sensitive information, e.g. the hostname. Please open a support case in Salesforce (rather than here).
Thank you,
John.
Hello team,
If our response has been helpful, please consider clicking "Accept as Solution". This will assist other users in easily finding the answer.
Regards,
Sushil Kumar