I have created a CSV-to-database-table job where multiple columns in the table make up a composite primary key.
There are multiple CSV files to be inserted into the same table. When my job runs and a duplicate record occurs within a single file, that record is not inserted and all remaining records are inserted. But when duplicate records occur across two different CSV files, the job stops at the duplicate record. I want the job to transfer all the records in all the CSV files while ignoring duplicates as defined by the table's primary key: it should skip the duplicate record and continue with the rest.
@roshan_wani, I suggest you do a lookup between the CSV file and the target table, use an inner join, and then take the inner join rejects and load those to the table.
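Not part of the original reply, but as a rough illustration of that lookup/inner-join-reject idea outside the Talend canvas, the Java sketch below loads the existing composite keys from the target table and keeps only the CSV rows whose key is not found. The connection string, file name, table `orders`, and key columns `col_a`/`col_b` are all placeholder assumptions.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.HashSet;
import java.util.Set;

public class InnerJoinRejectSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details; adjust for your environment.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/service", "user", "pass")) {

            // Step 1: load the existing composite keys from the target table
            // (this plays the role of the lookup flow).
            Set<String> existingKeys = new HashSet<>();
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT col_a, col_b FROM orders")) {
                while (rs.next()) {
                    existingKeys.add(rs.getString(1) + "|" + rs.getString(2));
                }
            }

            // Step 2: keep only the CSV rows whose composite key is NOT in the
            // table -- these are the "inner join rejects" you would load.
            try (BufferedReader in = new BufferedReader(new FileReader("input.csv"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] fields = line.split(",");
                    String key = fields[0] + "|" + fields[1]; // key columns assumed first
                    if (!existingKeys.contains(key)) {
                        System.out.println("new record to insert: " + line);
                    }
                }
            }
        }
    }
}
```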
@roshan_wani, if you are using the Enterprise edition, you can route the error (reject) flow from tOracleOutput to a file.
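If you go the reject-flow route, the behaviour it gives you is roughly the following (a hypothetical sketch, not Talend's generated code; `orders`, `col_a`, `col_b`, `amount`, and the rejects writer are assumptions): each insert is attempted individually, and a primary-key violation sends the row to a rejects file instead of stopping the job.

```java
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLIntegrityConstraintViolationException;

public class RejectFlowSketch {
    // Hypothetical helper: attempt one insert; on a primary-key violation,
    // append the row to a rejects writer and carry on with the next record.
    public static void insertOrReject(Connection conn, String colA, String colB,
                                      String amount, PrintWriter rejects) throws Exception {
        String sql = "INSERT INTO orders (col_a, col_b, amount) VALUES (?, ?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, colA);
            ps.setString(2, colB);
            ps.setString(3, amount);
            ps.executeUpdate();
        } catch (SQLIntegrityConstraintViolationException duplicateKey) {
            // Duplicate composite key: record it and continue instead of failing.
            rejects.println(colA + "," + colB + "," + amount);
        }
    }
}
```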
Hi manodwhb,
Thank you for your prompt answer, but it would be great if you could show with an example how to get the inner join rejects, as I want to store the CSV records that are not present in the lookup table (as per the primary key).
Hi,
If you follow the method @manodwhb described (the lookup with inner join rejects), you will take only the new records, i.e. those which are not already present in the DB.
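For comparison, the same "only new records" effect can also be enforced on the database side. This is a sketch under the assumption of an Oracle target (hence `FROM dual`) with the same placeholder table `orders` and key columns `col_a`/`col_b`; it is an equivalent guard expressed in SQL, not the Talend design itself.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;

public class InsertIfNewSketch {
    // Insert a row only when its composite key (col_a, col_b) is absent,
    // so duplicates are silently skipped by the statement itself.
    public static void insertIfNew(Connection conn, String colA, String colB,
                                   String amount) throws Exception {
        String sql =
            "INSERT INTO orders (col_a, col_b, amount) " +
            "SELECT ?, ?, ? FROM dual " +
            "WHERE NOT EXISTS (SELECT 1 FROM orders WHERE col_a = ? AND col_b = ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setString(1, colA);
            ps.setString(2, colB);
            ps.setString(3, amount);
            ps.setString(4, colA);
            ps.setString(5, colB);
            ps.executeUpdate();
        }
    }
}
```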
Warm Regards,
Nikhil Thampi
Please appreciate our Talend community members by giving Kudos for sharing their time for your query. If your query is answered, please mark the topic as resolved 🙂
@roshan_wani, what @nthampi has provided is exactly what I explained.
Since in your case the first occurrence of a key is the valid one for insert or update, I would prefer to merge all the CSV files, use tAggregateRow/tUniqRow to keep the first entry for each key, and then use tDBOutput or the bulk components. A sketch of that step follows below.
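As a plain-Java sketch of what that merge-and-take-first step does (assuming the composite key is the first two CSV columns and the file names are placeholders), this mirrors the effect of tUniqRow on the merged input:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.util.LinkedHashMap;
import java.util.Map;

public class MergeAndDedupeSketch {
    public static void main(String[] args) throws Exception {
        // Keep the FIRST occurrence of each composite key across all files,
        // preserving input order -- the effect tUniqRow gives on a merged flow.
        Map<String, String> firstPerKey = new LinkedHashMap<>();
        for (String file : new String[] {"file1.csv", "file2.csv", "file3.csv"}) {
            try (BufferedReader in = new BufferedReader(new FileReader(file))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] fields = line.split(",");
                    String key = fields[0] + "|" + fields[1]; // assumed key columns
                    firstPerKey.putIfAbsent(key, line);       // later duplicates ignored
                }
            }
        }
        // firstPerKey.values() now holds the deduplicated rows, ready for a
        // tDBOutput-style insert or a bulk load.
        firstPerKey.values().forEach(System.out::println);
    }
}
```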