Rosanero4Ever
Contributor

java.lang.OutOfMemoryError and tFileInputExcel

Hi all,
I'm using a tFileInputExcel component to import an Excel file (about 25k rows and 10 columns) into my DB.
I'm using TOS 5.1.2 on a Windows 2008 R2 machine with 8 GB of RAM.
My memory configuration is the following:
-vmargs
-Xms64m
-Xmx1536m
-XX:MaxPermSize=512m
-Dfile.encoding=UTF-8
Despite this, I get the following exception:
Exception in thread "main" java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space at
I use the store-data-on-disk option of tMap and commit my data every 1000 rows, but I still have this problem.
I read some posts about this issue and applied all the advice.
I also raised the memory configuration to Xmx=3096m, but... nothing 😞
I would rather not convert my Excel file to CSV.
So, can anybody help me find a solution, please?
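The "commit every 1000 rows" pattern described above can be sketched outside Talend. This is a minimal Python/sqlite3 illustration of the idea only, not Talend's generated code; the table and column names are made up for the example:

```python
import sqlite3

def load_rows(rows, db_path=":memory:", batch_size=1000):
    """Insert rows, committing every batch_size rows so each
    transaction (and its memory/log footprint) stays bounded."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS data (a TEXT, b TEXT)")
    pending = 0
    for row in rows:
        conn.execute("INSERT INTO data VALUES (?, ?)", row)
        pending += 1
        if pending >= batch_size:
            conn.commit()  # flush the batch, start a fresh transaction
            pending = 0
    conn.commit()  # commit the final partial batch
    n = conn.execute("SELECT COUNT(*) FROM data").fetchone()[0]
    conn.close()
    return n
```

Note that batched commits bound the database-side transaction, but they do not help if the Excel reader itself holds the whole workbook in memory, which is the failure mode discussed below.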
19 Replies
Anonymous
Not applicable

Hi,
There are two things that can be related to a java.lang.OutOfMemoryError:
1. The memory settings
2. The design of the job
I can see you have already put a lot of effort into this. If the job contains components that consume large amounts of memory, such as tSortRow, tMap, or tUniqRow, their data can be stored on disk instead. Would you mind sending us a screenshot of your job so that we can check whether it can be optimized?
Best regards
Sabrina
Rosanero4Ever
Contributor
Author

Hi Sabrina,
as you can see, my job is very simple. The tFileList component currently picks up only one file, but in the future I would like it to handle more than one.
I have attached three pictures; I hope they are useful.
Thanks very much for your time.
R
Anonymous
Not applicable

Hi,
My memory configuration is the following:
-vmargs
-Xms64m
-Xmx1536m
-XX:MaxPermSize=512m
-Dfile.encoding=UTF-8

That memory configuration applies to Talend Studio as a whole, not to your current job, doesn't it?
You should set it on the Job itself in Talend Studio:
In the Run view, open the Advanced settings tab and check "Use specific JVM arguments".
Allocate more memory by modifying the JVM parameters as shown in the picture.
Please try it and let us know the result.
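(The screenshot referred to above is not reproduced here. Job-level JVM arguments of the kind described would look something like the following; the -Xmx value is illustrative and should fit within the machine's physical RAM:)
-Xms256M
-Xmx4096M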
Best regards
Sabrina
Rosanero4Ever
Contributor
Author

Hi,
first of all, I must correct my previous post: the Excel file is not made up of 25k rows, but 400k rows!
I then set up my job with 4096M, as you can see in the attached picture.
Despite this, I still get the out-of-memory error (after about 270k rows).
Do you have any other solution?
Anonymous
Not applicable

hi all,
what about the tMap?
Do you do a lot of work in it?
Try just reading the data and writing it to your table as-is, to check whether tMap is the bottleneck or not.
Hope it helps
regards
laurent
Rosanero4Ever
Contributor
Author

Hi,
the tMap only maps data into the DB; no transformations are done.
I'll try your advice, even though, as I wrote before, the tMap is very simple.
Rosanero4Ever
Contributor
Author

I removed the tMap component and set up the job as shown in the attached picture, but the problem still exists.
Should I give up on the tFileInputExcel component?
Anonymous
Not applicable

Hi,
I didn't find any issue in your job. How about restarting your Studio, creating a new job, and copying the old one into it? I have no better idea, sorry about that.
Best regards
Sabrina
Rosanero4Ever
Contributor
Author

Hi Sabrina,
I performed all the tasks you advised (restarting TOS, creating a new job, etc.), but the issue still exists.
So I believe tFileInputExcel can only be used with small Excel files (it works with an Excel file of 10k rows)
or with a pre-2007 Excel file (65,536 rows max).
I'll resign myself to using a CSV file.
Thanks all for helping me.
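(Editor's note on why the thread ends this way: in TOS 5.x, tFileInputExcel typically reads .xlsx files through a POI-style workbook model that loads the whole file into memory, so a 400k-row sheet can exhaust the heap regardless of commit batching. A streaming parse keeps memory flat instead. Below is a minimal stdlib-only Python sketch of the streaming idea: an .xlsx file is a zip of XML parts, and one worksheet part can be read with an incremental XML parser. It assumes inline strings and does not handle the shared-strings table, so it is an illustration of the technique, not a drop-in converter.)

```python
import zipfile
import xml.etree.ElementTree as ET

# SpreadsheetML namespace used by worksheet XML inside an .xlsx zip
NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def iter_rows(xlsx_path, sheet="xl/worksheets/sheet1.xml"):
    """Yield rows of cell values from one worksheet, one row at a
    time, without loading the whole workbook into memory."""
    with zipfile.ZipFile(xlsx_path) as zf:
        with zf.open(sheet) as f:
            for _event, elem in ET.iterparse(f):  # 'end' events only
                if elem.tag == NS + "row":
                    values = []
                    for cell in elem.findall(NS + "c"):
                        if cell.get("t") == "inlineStr":
                            t = cell.find(NS + "is/" + NS + "t")
                            values.append(t.text if t is not None else "")
                        else:
                            v = cell.find(NS + "v")
                            values.append(v.text if v is not None else "")
                    yield values
                    elem.clear()  # free the parsed row immediately
```

Because rows are yielded and then cleared, memory use stays roughly constant with sheet size; this is the same principle behind POI's event/streaming APIs that later Talend versions expose for large Excel inputs.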