Duplicate Check between Source File and Target Table
I have a flow in which I need to identify duplicates within the same file, across files, and against the target table, and mark the duplicate records with a "Failed" status.
Currently, I am able to identify duplicates within the same file and across files. Here is my flow:
I have three files in the source directory. First, I read the files one by one, capture the respective file names, and write an output file with all three files merged into one.
Then I use tUniqRow to direct unique records to the target table with status "Success" and duplicate records to the same target table with status "Failed".
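For clarity, here is a rough Java sketch of the rule the job currently applies after the merge (the key name customerId is only a placeholder for whatever key tUniqRow is configured on):

```java
import java.util.HashSet;
import java.util.Set;

// Sketch of the current behaviour: the first occurrence of a key passes as
// "Success", any repeat within the merged data is tagged "Failed".
public class DedupSketch {

    public static String statusFor(String customerId, Set<String> seenKeys) {
        // Set.add returns false if the key was already seen -> duplicate
        return seenKeys.add(customerId) ? "Success" : "Failed";
    }

    public static void main(String[] args) {
        Set<String> seen = new HashSet<>();
        String[] mergedKeys = {"A100", "A101", "A100"}; // sample merged rows
        for (String k : mergedKeys) {
            System.out.println(k + " -> " + statusFor(k, seen));
        }
        // Prints: A100 -> Success, A101 -> Success, A100 -> Failed
    }
}
```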
I also need to check whether a source record already exists in the target table and, if so, mark it as a "Failed" record. How do I do that?
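This is roughly the logic I am after, sketched in plain Java against a hypothetical JDBC target table target_customers with key column customer_id (table, column, and connection details are placeholders, not my real schema); in the Talend job I assume this would become some kind of lookup, but I am not sure how to wire it:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.HashSet;
import java.util.Set;

// Load the keys that already exist in the target, then any incoming row whose
// key is in that set should be written with status "Failed" instead of "Success".
public class TargetLookupSketch {

    static Set<String> loadExistingKeys(Connection conn) throws Exception {
        Set<String> existing = new HashSet<>();
        try (Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("SELECT customer_id FROM target_customers")) {
            while (rs.next()) {
                existing.add(rs.getString(1));
            }
        }
        return existing;
    }

    static String statusFor(String key, Set<String> existingKeys) {
        return existingKeys.contains(key) ? "Failed" : "Success";
    }

    public static void main(String[] args) throws Exception {
        // Placeholder connection string; a matching JDBC driver must be on the classpath.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/mydb", "user", "password")) {
            Set<String> existing = loadExistingKeys(conn);
            System.out.println("A100 -> " + statusFor("A100", existing));
        }
    }
}
```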
Also, I want a new merged file to be created every time I run the flow. How can I achieve this?
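What I have in mind for this second point is appending a run timestamp to the output file name so each run writes a fresh file, something like the plain Java below (directory and prefix are placeholders); I believe Talend's TalendDate.getDate routine could produce the same timestamp inside the file-name expression, but I am not certain of the exact pattern:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

// Build a per-run output file name, e.g. /data/out/merged_20240115_093045.csv
public class MergedFileName {
    public static void main(String[] args) {
        String timestamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
        String outputPath = "/data/out/merged_" + timestamp + ".csv";
        System.out.println(outputPath);
    }
}
```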