I have a job that reads data from a CSV file and then processes it, comparing it with the data in my database. At the moment the job processes 700 rows of the CSV file in 2 minutes, which is too slow because I have files with more than 20,000 rows. My job looks like this:
tFileOutputDelimited > tReplace > 2 DB components to get additional data to the flow > tFilterRow > tMap > DB INSERT
Does anybody know how to improve the performance of the job? When I read the file using only the tFileOutputDelimited component, it reads the whole file in 3 seconds.
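Since reading the file alone takes 3 seconds, the slowdown almost certainly comes from the two per-row DB lookups (and the row-by-row insert), not from the CSV parsing. A common fix is to load the reference data once and join in memory, which in Talend corresponds to a tMap lookup set to "Load once" instead of "Reload at each row". A minimal sketch of the two patterns, using sqlite3 and hypothetical table and column names (not from the original post):

```python
import sqlite3

# Hypothetical reference table; names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, region TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "EU"), (2, "US"), (3, "APAC")])

csv_rows = [{"customer_id": 1, "amount": 10},
            {"customer_id": 3, "amount": 25}]

# Slow pattern: one query per CSV row -- roughly what two lookup
# DB components executing on every row amount to.
def enrich_per_row(rows):
    out = []
    for row in rows:
        region = conn.execute(
            "SELECT region FROM customers WHERE id = ?",
            (row["customer_id"],)).fetchone()[0]
        out.append({**row, "region": region})
    return out

# Faster pattern: read the lookup table once into a dict, then join
# in memory (comparable to a tMap lookup with "Load once").
def enrich_preloaded(rows):
    lookup = dict(conn.execute("SELECT id, region FROM customers"))
    return [{**row, "region": lookup[row["customer_id"]]} for row in rows]

print(enrich_preloaded(csv_rows))
```

Both functions produce the same enriched rows; the preloaded version issues one query total instead of one per row, so the gap widens as the file grows.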
Thanks in advance
@Tech Eight, maybe you can try the tDBBulk component to insert into the DB.
Check the links below:
https://help.talend.com/reader/aMa3LeRerDnYLmJvEPq0bw/oFG_MwrzFFCtU1MJVK8Pdw
https://help.talend.com/reader/tXRG~nTonRYUwbOJscDgxw/KS~ToADRI4boTFy9BN2GPA
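The reason a bulk component helps: a plain per-row DB INSERT pays a round trip (and often a commit) for every row, while a bulk load sends the rows in large batches inside one transaction. A minimal sketch of the difference, using sqlite3 with an illustrative table (the real job would target your own database):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, val TEXT)")
rows = [(i, f"row-{i}") for i in range(20000)]

# Slow pattern: one statement and one commit per row,
# i.e. one round trip and one transaction each time.
def insert_per_row(data):
    for r in data:
        conn.execute("INSERT INTO target VALUES (?, ?)", r)
        conn.commit()

# Fast pattern: all rows in one batch and one transaction --
# the effect a bulk component achieves at scale.
def insert_bulk(data):
    conn.executemany("INSERT INTO target VALUES (?, ?)", data)
    conn.commit()

insert_bulk(rows)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # → 20000
```

In Talend you get a similar effect either with the tDBBulk/bulk-exec components or by raising the batch size and commit interval on the DB output component.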
Thanks,
Manohar