Anonymous
Not applicable

OutOfMemoryError with a big lookup table

I work with a 6-million-row Oracle table in which I need to update only parts of one column (certain substrings must be replaced).
When I use this table as a lookup input to a tMap, the job quits after roughly 1 million entries have been read (the first step that is executed).
Does anybody have a workaround for that?
Cheers, Benjamin
(working on a 2 GB Windows XP machine)
Exception in thread "Thread-0" java.lang.OutOfMemoryError: Java heap space
at java.nio.CharBuffer.wrap(Unknown Source)
at sun.nio.cs.StreamEncoder.implWrite(Unknown Source)
at sun.nio.cs.StreamEncoder.write(Unknown Source)
at java.io.OutputStreamWriter.write(Unknown Source)
at java.io.BufferedWriter.flushBuffer(Unknown Source)
at java.io.BufferedWriter.flush(Unknown Source)
at java.io.PrintWriter.newLine(Unknown Source)
at java.io.PrintWriter.println(Unknown Source)
at java.io.PrintWriter.println(Unknown Source)
at routines.system.RunStat.sendMessages(RunStat.java:131)
at routines.system.RunStat.run(RunStat.java:104)
at java.lang.Thread.run(Unknown Source)
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOfRange(Unknown Source)
at java.lang.String.<init>(Unknown Source)
at ......
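Since only substrings of one column need replacing, one workaround is to push the replacement down to Oracle as a single UPDATE, so that no rows ever have to pass through the Java heap at all. The sketch below only builds the statement; table and column names ("BIG_TABLE", "DESCRIPTION") are hypothetical placeholders, and in a real job the statement would be run via a tOracleRow or plain JDBC.

```java
// Sketch: push the partial string replacement down to Oracle instead of
// streaming all 6 million rows through tMap. Table/column names are
// hypothetical placeholders; execute the result via JDBC or tOracleRow.
public class PushdownUpdate {

    // Build the UPDATE statement; REPLACE runs inside the database,
    // so the rows never cross into the Java heap.
    static String buildUpdateSql(String table, String column,
                                 String from, String to) {
        return "UPDATE " + table
             + " SET " + column + " = REPLACE(" + column + ", '" + from + "', '" + to + "')"
             + " WHERE " + column + " LIKE '%" + from + "%'";
    }

    public static void main(String[] args) {
        System.out.println(buildUpdateSql("BIG_TABLE", "DESCRIPTION", "old", "new"));
    }
}
```

Note the WHERE clause restricts the update to rows that actually contain the substring, which keeps redo/undo volume down on a 6-million-row table.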
20 Replies
Anonymous
Not applicable
Author

Hi amaumont,
Did you see my remark about a possible bug in 2.4:
"I got a compilation error when using the lookup table row in a Var expression: "rowX could not be found""
Benjamin
amaumont
Contributor III
Contributor III

Indeed, I forgot about the remark "rowX could not be found".
For now, temp files are not deleted if a fatal error occurs.
About the data test: I think I found the cause of the errors, although other cases may remain.
You can follow bug 4271; it is only sparsely commented, but I can't give more detail for now.
amaumont
Contributor III
Contributor III

Could you be more precise about your remark "rowX could not be found"?
Can you create a Bugtrack entry with a simple example job?
Maybe the problem is already fixed, but I would like to make sure.
Thank you, Benjamin.
Anonymous
Not applicable
Author

Hi amaumont,
Great! I saw you were able to fix the memory bug, and I also appreciated the performance study. Thank you!
Let me just come back to the problem above:
I attached a few pictures to explain it; in my case row3 gives me the Java error.
cheers
Benjamin
amaumont
Contributor III
Contributor III

Thank you very much for all these details.
I reproduced your problem easily on 2.4.0, then imported the built job into a current development version, where no compilation error occurs.
You can look at bug 4474, which I have resolved.
Anonymous
Not applicable
Author

Hi
TIS 2.4.1 produces:
Exception in component tMap_5_TMAP_IN
java.lang.RuntimeException: java.io.FileNotFoundException: /home/dob/WM_test/FSK-BHF/temp/WM_STAGE1_DISTRIBUTE_tMapData_row9_TEMP_792.bin (Too many open files)
with roughly 1300 files in the temp area when the lookup tables are big (10,000,000 rows and more).
cheers
Benjamin
Anonymous
Not applicable
Author

Hi
I would like to add some observations to the above:
The Java heap was set to 1024 MB, and I increased the buffer size from 100,000 to 1,000,000 to get bigger files.
Now the job fails with "Exception in thread "Thread-0" java.lang.OutOfMemoryError: Java heap space".
I have now set -Xmx2048M in the start script to try to avoid the heap space error. The result is still pending...
This forum entry is meant to show that the lookup concept in tMap urgently needs an extension to the existing concept, as I already mentioned in one of the contributions above:
for each row in the input stream, look up the record in the DB, fetch it, and allow some operation (a Java expression) to join in some data.
Is there any other component combination that allows the above scenario, without tMap?
I am grateful for any short-term workaround.
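The requested "one lookup per input row" pattern can be sketched as below. This is not Talend's actual tMap implementation, just an illustration of the concept: in a real job the resolver would wrap a JDBC PreparedStatement ("SELECT val FROM lookup_table WHERE key = ?", a hypothetical query), so that only one lookup row at a time is ever in memory; here a Map stands in for the database to keep the sketch self-contained.

```java
import java.util.*;
import java.util.function.Function;

// Sketch of a row-by-row lookup: instead of materialising the whole
// lookup table in memory, each incoming row triggers a single keyed
// fetch through the resolver. The resolver is a stand-in for a JDBC
// PreparedStatement lookup against the database.
public class RowByRowLookup {

    static List<String> process(List<String> inputRows,
                                Function<String, String> resolver) {
        List<String> out = new ArrayList<>();
        for (String row : inputRows) {
            String looked = resolver.apply(row);          // one fetch per row
            out.add(row + "|" + (looked == null ? "" : looked));
        }
        return out;
    }

    public static void main(String[] args) {
        // In-memory stub standing in for the lookup database.
        Map<String, String> db = Map.of("a", "1", "b", "2");
        System.out.println(process(List.of("a", "b", "c"), db::get));
    }
}
```

The trade-off is one database round trip per input row, so this pays off only when the lookup table is far larger than the input stream.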
Cheers
Benjamin
amaumont
Contributor III
Contributor III

Indeed, to prevent the "Too many open files" error you have to increase the "Max buffer size".
The "Thread-0" java.lang.OutOfMemoryError: Java heap space error may come from:
- problematic cases such as bug 4867, which describes a problem with the FIRST MATCH mode
- other components that need a lot of memory, such as aggregate or database components, which may accumulate a large amount of data
The best approach is to use the lowest "Max buffer size" that keeps memory use minimal while still keeping the number of open temp files within the limit allowed by the system.
It would be interesting for me to have more details on your "Java heap space" error.
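The trade-off described above can be put into back-of-envelope arithmetic (illustrative only; the exact spill behaviour of tMap's temp files may differ, and real jobs can have several lookups spilling at once): the lookup writes roughly one temp file per buffer flush, so the file count is about ceil(rows / maxBufferSize), while heap use grows with maxBufferSize.

```java
// Illustrative arithmetic for the buffer-size vs. open-files trade-off:
// file count ~ ceil(rows / maxBufferSize); heap use ~ maxBufferSize.
public class BufferTradeoff {

    static long tempFiles(long rows, long maxBufferSize) {
        return (rows + maxBufferSize - 1) / maxBufferSize; // ceiling division
    }

    public static void main(String[] args) {
        long rows = 10_000_000L;
        System.out.println(tempFiles(rows, 100_000));   // small buffer: many files, little heap
        System.out.println(tempFiles(rows, 1_000_000)); // big buffer: few files, much more heap
    }
}
```

So a 10x larger buffer cuts the open-file count by 10x but multiplies the per-buffer heap footprint accordingly, which matches the two failure modes seen in this thread.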
Anonymous
Not applicable
Author

Hi
Thank you for your reply.
My job was running on a Linux machine; I was not aware that my file system limits the number of open files to around 1300.
That is interesting information.
By the way, even 2048 MB of RAM runs into the heap space error. I think 1 million records in the buffer is too much for the tMap component.
Also, I ran into another strange behaviour: tDenormalizeSortedRow seems not to process the last row. My input has 20 rows (via tSampleRow), and after some normalisation, selection and denormalisation only 19 rows are left; the last one disappears somewhere.
Cheers
Benjamin
amaumont
Contributor III
Contributor III

I would like to see the complete error message, to determine whether this is an algorithm bug or another problem.
It would be useful to post a screenshot of your job, if possible.
About the problem with tDenormalizeSortedRow, please create a new topic.
Thank you, Benjamin.