Anonymous
Not applicable

java.lang.OutOfMemoryError: GC overhead limit exceeded

I first ran into the above Java memory problem in TOS 5.6.2 under Win32; regardless of increasing the limit up to a maximum of 1300, it didn't help at all.

So I followed what little I found on this forum about it and switched to the Win64 version (which is version 6.0.0) and 64-bit Java 7, and again I could see in Task Manager that Java was quickly running out of memory.

I used the new functionality in version 6.0.0 and activated "Monitor control" under "Memory Run" with the garbage collector pace set to 30 seconds. Still, it ran out of memory on the second step of my job (in version 5.6.2 it couldn't even read in the 13 million records, 3 columns, of the lookup table).
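For context, "GC overhead limit exceeded" means the JVM is spending nearly all its time in garbage collection while reclaiming almost no heap, so raising the heap via JVM arguments is the usual first step. A sketch of how such arguments typically look (the exact values are assumptions; in Talend Studio they are usually entered under the Run view's Advanced settings, and the size suffix matters, since a bare number is read as bytes):

```
-Xms256m
-Xmx3064m
# Disables only the GC-overhead check, not the underlying memory pressure:
-XX:-UseGCOverheadLimit
```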

This is the job with the last run information in attachment 1

Attachment 2 is the complete log file.

I'm out of options now; I've set -Xms to 256 and -Xmx to 3064. I have to use two tMap components, as unfortunately no one came up with an answer to my other question here (tMap - trying to catch matches, in A but not B and in B but not A).
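The matching logic that other question describes can be sketched outside Talend in plain Java, which may clarify what the two tMap components are each computing. This is only an illustration of the set logic; the class name and sample keys are invented for the example:

```java
import java.util.HashSet;
import java.util.Set;

// Illustrates "in A but not B, and in B but not A" as plain set operations.
public class SymmetricDiff {

    // Returns the keys present in `first` but absent from `second`.
    public static Set<String> onlyInFirst(Set<String> first, Set<String> second) {
        Set<String> result = new HashSet<>(first);
        result.removeAll(second); // drop every key that also appears in second
        return result;
    }

    public static void main(String[] args) {
        Set<String> a = Set.of("k1", "k2", "k3");
        Set<String> b = Set.of("k2", "k3", "k4");
        System.out.println(onlyInFirst(a, b)); // keys in A but not B
        System.out.println(onlyInFirst(b, a)); // keys in B but not A
    }
}
```

In a tMap, the same effect is usually achieved with an inner join plus the reject flow for each direction, run once with A as main and once with B as main.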

Any help is appreciated; to me it seems the software runs into serious memory problems very quickly. I haven't even started to try any ETL on the roughly 11-12 billion record monthly data set with 21 variables. I thought Talend could handle it, but this early failure on just 13 million records with 3 columns makes me wonder about its memory consumption.

I hope someone has an answer for me, otherwise it's either give up or shell out some serious $$$ for a proper ETL solution (Informatica, IBM DataStage or even Ab Initio).

2 Replies
Anonymous
Not applicable
Author

Hi a4xrbj1,


We don't see your attachment or screenshot on our side. Could you please check it?
In addition, did you get the "out of memory" exception on a specific job or all jobs in TOS 5.6.2?


Best regards
Sabrina
Anonymous
Not applicable
Author

xdshi wrote:
Hi a4xrbj1,


We don't see your attachment or screenshot on our side. Could you please check it?
In addition, did you get the "out of memory" exception on a specific job or all jobs in TOS 5.6.2?


Best regards
Sabrina

Hi Sabrina,
thanks for taking the time to look at my post. I've added the screenshots; no idea why they weren't uploaded.
Yes, the "out of memory" happened in the first job started in TOS 5.6.2; it's the tNetezzaInput in the lower left corner, now called "Get_CDRICC_data_from_next_day".
It occurred after reading in about 7 million records with 3 variables: the job slowed down a lot, and the "out of memory" error followed 1-2 minutes later.
Andreas