Hi,
I use MongoDB to read the input and push formatted data to Postgres tables. The tables have constraints, viz. unique, varchar length limits (e.g. varchar(64)), foreign keys, etc.
Currently, if an error occurs due to any of these constraints, the entire batch of input data is skipped instead of being inserted/updated into the table, and the ETL job exits.
Is there a way to skip just the erroneous records and insert/update the remaining input data?
In Open Studio you can do the following:
Write your new records into a staging table without any constraints, as sketched below.
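For illustration, such a staging table might look like this (a minimal sketch; "customers_staging" and its columns are hypothetical names, adjust them to your schema):

```sql
-- No constraints and loose text columns, so every incoming record
-- can be loaded here without errors:
CREATE TABLE customers_staging (
    id         text,
    email      text,
    country_id text
);
```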
Then use the tPostgresTableTransfer component: https://github.com/jlolling/talendcomp_tDBTableTransfer/releases
Documentation: https://github.com/jlolling/talendcomp_tDBTableTransfer/blob/master/doc/tPostgresqlTableTransfer.pdf
The purpose of this component is to transfer data from one table into another very fast.
It can use any database as the source but needs PostgreSQL as the target.
You will find the option "On Conflict", which defines what should be done when a record already exists in the target.
Here you can choose to raise an error, ignore the record, or update it.
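This corresponds to PostgreSQL's INSERT ... ON CONFLICT clause. If you wanted the same behavior in plain SQL, a sketch of the "ignore" case could look like the following (the tables "customers", "customers_staging" and "countries" are hypothetical, and the filters show how the varchar(64) and foreign-key errors from the question can be skipped per record):

```sql
-- Assumes a unique constraint on customers.email.
INSERT INTO customers (id, email, country_id)
SELECT s.id::bigint, s.email, s.country_id::int
FROM customers_staging s
WHERE length(s.email) <= 64                    -- skip over-length values
  AND EXISTS (SELECT 1 FROM countries c
              WHERE c.id = s.country_id::int)  -- skip foreign-key violations
ON CONFLICT (email) DO NOTHING;                -- skip duplicates instead of failing the batch
-- For the "update" behavior, replace the last line with e.g.:
-- ON CONFLICT (email) DO UPDATE SET country_id = EXCLUDED.country_id;
```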
The component expects the source to have the same columns as the target; if some are missing, they will be ignored. If you want to make sure every column is transferred, enable Strict Mode.