PapaC
Contributor II

Qlik Replicate Oracle table to SQL Server

I have set up a task to copy from the logstream of an Oracle database to a SQL Server database. It is throwing the following error:

Failed to commit 10000 rows to target 'tablename'
RetCode: SQL_SUCCESS SqlState: 23000 NativeError: 2627 Message: [Microsoft][ODBC Driver 17
for SQL Server][SQL Server]Violation of PRIMARY KEY constraint 'pk_constraint.
Cannot insert duplicate key in object 'tablename'. The duplicate key value is

I checked the data for the duplicate key value in the Oracle table across all 7 columns to see if there were actual duplicates. It returned two rows which were identical except for the last column, which is a TIMESTAMP(6).
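
For reference, the kind of check I mean is roughly this (schema, table, and column names are placeholders, not the real ones; the real key has 7 columns):

-- Group on the non-timestamp key columns to see rows that repeat
-- and differ only in the TIMESTAMP(6) column (placeholder names).
SELECT col1, col2, col3, col4, col5, col6,
       COUNT(*)    AS row_count,
       MIN(ts_col) AS min_ts,
       MAX(ts_col) AS max_ts
FROM   my_schema.my_table
GROUP  BY col1, col2, col3, col4, col5, col6
HAVING COUNT(*) > 1;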

The timestamp that it is trying to load is different from what is in the source table for the logstream.

Example: 8/26/2009 5:15:28.503000 AM is showing up in SQL Server as: 1753-01-01 00:00:00.000

The Qlik task was set up to create its own tables and got the above error. It created the timestamp column as datetime2(6). I dropped the table and manually recreated it with the column as plain datetime, then changed the task settings to not drop the table, just truncate it before a full load. It returned the same error.
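
For clarity, the generated target definition looked roughly like this (placeholder names, reduced to three columns for illustration):

-- Approximation of the table Qlik Replicate generated on the target.
CREATE TABLE dbo.my_table (
    id      int           NOT NULL,
    code    varchar(10)   NOT NULL,
    ts_col  datetime2(6)  NOT NULL,
    CONSTRAINT pk_my_table PRIMARY KEY (id, code, ts_col)
);
-- The manual variant only changed ts_col to plain datetime;
-- both definitions failed the same way.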

I did something similar in another project and manually created the table first in the SQL Server database, and it works, but I do not see a difference in how I set up the task.

5 Replies
SwathiPulagam
Support

Hi @PapaC ,

 

Oracle is case sensitive and SQL Server is not. Please check whether the Oracle table has two similar PK values that differ only in case.
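
One way to check that, assuming the PK is a character column (the names below are placeholders):

-- Flags PK values that are distinct in Oracle but collapse to the same
-- value under a case-insensitive comparison (placeholder names).
SELECT UPPER(pk_col) AS pk_upper, COUNT(*) AS row_count
FROM   my_schema.my_table
GROUP  BY UPPER(pk_col)
HAVING COUNT(*) > 1;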

 

Thanks,

Swathi

PapaC
Contributor II
Author

Sorry, Swathi, but I'm not sure that I understand what you are telling me to check. The PK in the Oracle database is named the same as the PK in the SQL Server database. Both appear to be in upper case.

As a side note, I tried loading a different table where the timestamp is not part of the primary key. All of the data was copied from the logstream and inserted into the table in SQL Server, but the column with the date and time is wrong. Example: 8/26/2009 5:15:28.503000 AM is showing up in SQL Server as 1753-01-01 00:00:00.000.

So I don't think this problem is related to the primary key. It seems to be a conversion problem from the Oracle TIMESTAMP(6) to SQL Server. Qlik created the table with a datetime2(6) column; it did not work. I then manually created the table with the column as datetime. Both column types returned the same data: every row came back with 1753-01-01 00:00:00.000 in the datetime column.
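
Something like this on the SQL Server side shows how widespread the bad value is (placeholder names again):

-- Count target rows carrying the sentinel date versus the total
-- (placeholder table and column names).
SELECT COUNT(*) AS total_rows,
       SUM(CASE WHEN ts_col = '1753-01-01' THEN 1 ELSE 0 END) AS sentinel_rows
FROM   dbo.my_table;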

SwathiPulagam
Support

Hi @PapaC ,

 

Now I understand your issue. The Oracle table has a composite PK, and the DateTime column is also part of the PK. The PK values are differentiated by the DateTime column in Oracle (as all the other columns in the PK have the same values), but the DateTime column gets the value "1753-01-01 00:00:00.000" in SQL Server, causing the constraint violations. If this is correct, then I will create a case and work on it, as it requires more troubleshooting.

 

Thanks,

Swathi

PapaC
Contributor II
Author

Yes, that is the situation, but I also get the weird dates in the datetime column for a table that does not include it in the primary key.
Heinvandenheuvel
Specialist III

You have to focus on the failing conversion from the source Oracle timestamp (with time zone?) to MS SQL datetime2.

You should isolate that with a simple sample table with just a few rows, perhaps with ID, TimeStamp, and 'Comment' fields. Put the timestamp value as text in the comment column so you know what is expected. Make a task with just that table. Does it work? If it fails, please provide the task JSON export, the source table create SQL plus a sample row load, and the reptask_xx.log for the task.
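
For example, something along these lines on the Oracle source would do (names and the sample value are just an example):

-- Minimal repro table on the Oracle source (example names and values).
CREATE TABLE repro_ts_test (
    id        NUMBER(10)    NOT NULL,
    ts_value  TIMESTAMP(6)  NOT NULL,
    remark    VARCHAR2(40),
    CONSTRAINT pk_repro_ts_test PRIMARY KEY (id)
);

-- One row whose expected value is spelled out in the remark column.
INSERT INTO repro_ts_test (id, ts_value, remark)
VALUES (1, TIMESTAMP '2009-08-26 05:15:28.503000',
        'expect 2009-08-26 05:15:28.503000');
COMMIT;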

datetime is wrong, deprecated. You need datetime2. Google is your friend.

If you google "1753-01-01 00:00:00" you'll soon find it is the minimum (not null) value for an MS SQL datetime. Therefore it seems the transformation is completely failing. Maybe there is missing information, like a timezone? Are there any warning lines in the task log?
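
You can see where that value comes from directly in T-SQL; 1753-01-01 is the lower bound of the old datetime type, while datetime2 goes all the way back to year 1:

-- Lower bounds of the two types: a conversion that falls back to the
-- datetime minimum shows up as 1753-01-01.
SELECT CAST('1753-01-01 00:00:00.000' AS datetime)  AS datetime_minimum,
       CAST('0001-01-01 00:00:00.000' AS datetime2) AS datetime2_minimum;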

 

In the original post you wrote, "I did something similar in another project and manually created the table first in the SQL Server database and it works, but I do not see a difference in how I set up the task." This suggests something is wrong in the new task. Please reduce it to the simplest possible form that shows the problem and share the task JSON. That probably means removing all but one table.

Hein.