Hi All,
I have created a job to load MySQL table data into a PostgreSQL table. The table has a unique constraint on the email column. If the unique constraint fails on one insert row, the entire batch of data following that row fails to load. Please suggest a solution. I'm loading 1.4 million rows, and I'm using Talend Open Studio for Data Integration.
Hi,
Why don't you load the entire data set first into a stage table in PostgreSQL? Then you can do an inner join with the target table to identify the duplicate records, using a tPostgreSQLRow or Input component. Once you have identified the duplicate records, you can delete them either in stage or in target, depending on your use case. Then you should be able to pump the data easily into the target table.
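To make the stage-and-dedup idea concrete, here is a minimal sketch of the SQL involved. It uses Python's built-in sqlite3 purely for a self-contained illustration (the real job would run equivalent statements against PostgreSQL, e.g. via tPostgresqlRow); the table and column names (`stage`, `target`, `email`) are hypothetical stand-ins for your own schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Target table with a unique constraint on email,
# mirroring the PostgreSQL target described in the thread.
cur.execute("CREATE TABLE target (id INTEGER, email TEXT UNIQUE)")
cur.execute("INSERT INTO target VALUES (1, 'a@example.com')")

# Stage table with NO unique constraint: every source row loads,
# so a single duplicate cannot fail the whole batch.
cur.execute("CREATE TABLE stage (id INTEGER, email TEXT)")
cur.executemany(
    "INSERT INTO stage VALUES (?, ?)",
    [(1, "a@example.com"), (2, "b@example.com"), (3, "c@example.com")],
)

# Inner join identifies rows already present in the target...
dupes = cur.execute(
    "SELECT s.id, s.email FROM stage s JOIN target t ON s.email = t.email"
).fetchall()

# ...which can then be removed from stage before the final load.
cur.execute("DELETE FROM stage WHERE email IN (SELECT email FROM target)")
cur.execute("INSERT INTO target SELECT * FROM stage")

loaded = cur.execute("SELECT COUNT(*) FROM target").fetchone()[0]
```

After the dedup step, `dupes` holds the one conflicting row and the final insert into `target` succeeds for the remaining rows.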
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time on your query. If your query is answered, please mark the topic as resolved.
Although formally, in my opinion, this is not a bug: AUTO_INCREMENT = xxx refers to table metadata, not data. I think what was needed was not a bug report but a feature request to add a -no-autoincrement-value option, and the problem would have been solved long ago.
Hi fdenis,
Thank you for the answer.
Can you please elaborate on the solution, such as which component to use? I'm new to Talend.