
Narendra2
Contributor

Manage data insert / update due to Postgres Constraint Errors

Hi,

I read input from MongoDB and push formatted data to Postgres tables. I have constraints set on my tables, viz. unique, character length for varchar (e.g. varchar(64)), foreign key, etc.

Currently, if an error occurs due to any of these constraints, the entire batch of input data is skipped from being inserted / updated into the table and the ETL job exits.

Is there a way to skip just the erroneous records and insert / update the remaining input data?

10 Replies
jlolling
Creator III

In Open Studio you can do the following:

Write your new records into a staging table without any constraints.
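
A minimal sketch of that staging step, assuming hypothetical table and column names (orders_staging, order_id, customer_id, note) and a plain JDBC connection; in a real Talend job the same statements would typically live inside the job's own database components:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class PrepareStaging {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details, table and column names -- adjust to your job.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "etl_user", "secret");
             Statement st = con.createStatement()) {
            // The staging table mirrors the target's columns but carries none of its
            // constraints (no primary key, no UNIQUE, no foreign keys), so every
            // record coming from MongoDB can be loaded without constraint errors.
            st.execute("DROP TABLE IF EXISTS orders_staging");
            st.execute("CREATE TABLE orders_staging ("
                     + "order_id bigint, "
                     + "customer_id bigint, "
                     + "note varchar(64))");
        }
    }
}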

Now use the component tPostgresTableTransfer: https://github.com/jlolling/talendcomp_tDBTableTransfer/releases

Documentation: https://github.com/jlolling/talendcomp_tDBTableTransfer/blob/master/doc/tPostgresqlTableTransfer.pdf

The purpose of this component is to transfer data from one table into another very quickly.

It can use any database as the source, but it needs PostgreSQL as the target.

You will find the option "On Conflict", which defines what should be done when records already exist in the target.

Here you can choose to raise an error, ignore, or update (see the sketch at the end of this reply for the equivalent plain SQL).

The component expects the source to have the same columns as the target; if some are missing, they will be ignored. If you want to make sure every column is transferred, enable Strict Mode.
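
For illustration, the ignore / update choices correspond to PostgreSQL's INSERT ... ON CONFLICT clause. Below is a minimal sketch of such a staging-to-target transfer with plain JDBC (not the component itself), using the hypothetical orders / orders_staging tables from above and assuming order_id is the unique key:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class TransferStagingToTarget {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details and table names -- adjust to your job.
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "etl_user", "secret");
             Statement st = con.createStatement()) {

            // "ignore": rows whose order_id already exists in the target are skipped;
            // the remaining rows are inserted and the statement does not fail as a whole.
            int inserted = st.executeUpdate(
                  "INSERT INTO orders (order_id, customer_id, note) "
                + "SELECT order_id, customer_id, note FROM orders_staging "
                + "ON CONFLICT (order_id) DO NOTHING");
            System.out.println(inserted + " new rows inserted");

            // "update" (upsert): existing rows are updated instead of skipped.
            // st.executeUpdate(
            //       "INSERT INTO orders (order_id, customer_id, note) "
            //     + "SELECT order_id, customer_id, note FROM orders_staging "
            //     + "ON CONFLICT (order_id) DO UPDATE SET "
            //     + "customer_id = EXCLUDED.customer_id, note = EXCLUDED.note");
        }
    }
}

Note that ON CONFLICT only handles unique / primary key violations; rows that would break a foreign key or exceed varchar(64) still have to be filtered or truncated in the SELECT before they reach the target table.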