Tech8
Contributor III

Faster .csv read processing

I have a job that reads data from a CSV file and then processes it, comparing it with the data in my database. At the moment the job processes 700 rows of the CSV file in 2 minutes, which is too slow for me because I have files with 20,000+ rows. My job looks like this:

tFileInputDelimited > tReplace > 2 DB components to get additional data into the flow > tFilterRow > tMap > DB INSERT

Does anybody know how to improve the performance of the job? When I read the file using only tFileInputDelimited, it reads the whole file in 3 seconds.

Thanks in advance

1 Reply
manodwhb
Champion II

@Tech8, maybe you can try the tDBBulk component to insert into the DB.

 

Check the links below:

 

https://help.talend.com/reader/aMa3LeRerDnYLmJvEPq0bw/oFG_MwrzFFCtU1MJVK8Pdw

https://help.talend.com/reader/tXRG~nTonRYUwbOJscDgxw/KS~ToADRI4boTFy9BN2GPA
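To illustrate why bulk/batched loading helps: inserting row by row pays a round-trip and statement overhead for every record, while batching amortizes that cost over many rows at once. Here is a minimal, hypothetical sketch of that principle in Python with an in-memory SQLite database (not Talend code, and not the actual tDBBulk implementation; the table, column names, and batch size are made up for the example):

```python
import sqlite3

def batch_insert(rows, batch_size=500):
    """Insert rows in batches instead of one statement per row.

    This mirrors the idea behind bulk-load components: fewer, larger
    operations against the database instead of 20,000 tiny ones.
    """
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE target (id INTEGER, name TEXT)")
    cur = conn.cursor()
    for start in range(0, len(rows), batch_size):
        # One executemany() call per batch of 500 rows,
        # rather than one execute() call per row.
        cur.executemany(
            "INSERT INTO target VALUES (?, ?)",
            rows[start:start + batch_size],
        )
    conn.commit()  # a single commit, not one per row
    count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
    conn.close()
    return count

rows = [(i, "row%d" % i) for i in range(20000)]
print(batch_insert(rows))  # 20000
```

The same idea applies inside a Talend job: committing once per batch (or using a dedicated bulk-load component, which typically stages the data in a file and loads it with the database's native bulk loader) is usually far faster than a default per-row insert.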

 

Thanks,

Manohar