Anonymous
Not applicable

tfilefetch - java.io.FileNotFoundException (Too many open files)

Hi,
I have a job (TIS 4.0.1 - Java) which is meant to retrieve approximately 85,000 images from URLs. I'm using the tFileFetch component to do this; however, after 34,764 of the 85,000 it stopped with the error below saying there are too many open files. The job server is set to allow 1024 open files, and the timeout on the tFileFetch component is 5000.
I was hoping not to simply increase the number of allowed open files to get around the problem. Is the component not closing the open files properly? Would reducing the component's timeout help at all?
I also don't know why it reports defaultfilename.txt as the filename, as the job was never configured to fetch any file called defaultfilename.txt.
Exception in component tFileFetch_1
java.io.FileNotFoundException: /mnt/datafeeds/step02/6108/defaultfilename.txt (Too many open files)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.(FileOutputStream.java:179)
at java.io.FileOutputStream.(FileOutputStream.java:131)
at spitfire_dev.sf007_dealeredit_0_1.SF007_dealeredit$1tFileFetch_1Thread.run(SF007_dealeredit.java:47951)
at routines.system.ThreadPoolWorker.runIt(TalendThreadPool.java:159)
at routines.system.ThreadPoolWorker.runWork(TalendThreadPool.java:150)
at routines.system.ThreadPoolWorker.access$0(TalendThreadPool.java:145)
at routines.system.ThreadPoolWorker$1.run(TalendThreadPool.java:122)
at java.lang.Thread.run(Thread.java:619)
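For reference (this is not tFileFetch's actual internals, which aren't visible here), the stack trace shows the failure happening when opening a FileOutputStream — the classic cause of "Too many open files" is a download loop where streams are not closed on every code path, so each iteration leaks a descriptor. A minimal sketch of the leak-proof pattern using try-with-resources, with an in-memory stream standing in for a real URL connection:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class FetchSketch {

    // Copy a stream to a file, guaranteeing that both handles are
    // closed even when an exception occurs (try-with-resources).
    static void saveTo(InputStream in, Path target) throws IOException {
        try (InputStream src = in;
             OutputStream out = Files.newOutputStream(target)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = src.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        } // both streams are closed here, releasing their descriptors
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("fetch", ".bin");
        // Stand-in for the response body of one fetched URL.
        saveTo(new ByteArrayInputStream("hello".getBytes()), tmp);
        System.out.println(Files.size(tmp));
        Files.delete(tmp);
    }
}
```

If a component (or custom tJava code) opens the output stream outside such a block and an exception is thrown before close(), the descriptor stays open until the JVM exits, which matches the symptom of the count climbing steadily with each iteration.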
Thanks in advance for your help.
12 Replies
Anonymous
Not applicable
Author

Hi dtournant
I can't reproduce the problem at the moment. Can you simplify your job and help us reproduce it?
Best regards
Shong
Anonymous
Not applicable
Author

Ok Shong, thanks.
I have tried increasing the iterate value, and it looks better; the same works for tFileCopy but not for tFileFetch.
If some URLs are bad, could that be what keeps the problem from going away?
Anonymous
Not applicable
Author

I have built a simple job and launched it.
On the left I have a file with 2,500 URLs. Around row 1002 I get the "too many open files" error again.
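Failing at roughly the same row count on a much smaller input suggests a per-iteration descriptor leak rather than a limit that is genuinely too low. One way to confirm is to watch how many descriptors the job's JVM holds while it runs. A sketch assuming Linux; `$$` here is the current shell's pid, so substitute the Talend JVM's pid when diagnosing the actual job:

```shell
# Soft limit on open file descriptors for this process
ulimit -n

# Number of descriptors currently open (via Linux /proc; $$ is
# this shell's pid -- use the job's JVM pid in practice)
ls /proc/$$/fd | wc -l
```

If the second number climbs by one per fetched URL and never drops, the component (or the job's own code) is leaking descriptors, and raising `ulimit -n` would only delay the failure.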