RA6
Creator

OutOfMemory Exception - Heap space + GC overload

Hello all,

 

I am currently working with large data and trying to produce an XML file as the final output.
The tests were successful with sample data; however, with the full data set I am encountering the exceptions below:

 

- java.lang.OutOfMemoryError: Java heap space
- java.lang.OutOfMemoryError: GC overhead limit exceeded

 

In fact, I have three CSV files in my job:

 

standard: 88,151 rows (main)
personal: 5,900,000 rows (lookup)
address: 230,000 rows (lookup)

 

One standard row is linked with approximately 75 personal rows and 15 address rows.
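For scale, here is a back-of-envelope estimate of the data volumes involved. The row counts come from the post; the ~200 bytes-per-row figure is a guess, not a measurement:

```java
// Rough sizing of the join described above. Row counts are from the post;
// the average in-memory row size of ~200 bytes is an assumption.
public class JoinEstimate {
    public static void main(String[] args) {
        long standardRows = 88_151;
        long personalRowsTotal = 5_900_000;   // personal lookup file
        long personalPerStandard = 75;
        long addressPerStandard = 15;

        long joinedPersonal = standardRows * personalPerStandard;
        long joinedAddress = standardRows * addressPerStandard;

        long assumedBytesPerRow = 200;        // hypothetical average row size
        double lookupCacheGb = personalRowsTotal * assumedBytesPerRow / 1e9;

        System.out.println("joined personal rows: " + joinedPersonal); // 6,611,325
        System.out.println("joined address rows:  " + joinedAddress);  // 1,322,265
        System.out.printf("personal lookup cache: ~%.1f GB raw%n", lookupCacheGb);
    }
}
```

Under that assumption, the personal lookup alone is over a gigabyte of raw data before any Java object overhead, which typically multiplies the in-heap footprint several times, so a 4 GB heap is easily exhausted.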

 

First of all, I tried using a tHashOutput to keep the lookup data in memory and see how it processes:

 

[screenshot: job using tHashOutput]

 

Secondly, I also tried generating the lookup files as delimited (CSV) files rather than keeping them in memory:

 

[screenshot: job using CSV lookup files]

 

Please note that I cannot use the temp directory storage option for the lookup since I am using a tXMLMap; that option is not available there.

 

Following some investigation, I have also tried increasing the JVM memory arguments:

 

PC RAM: 8 GB

 

-Xmx4096M

-Xms2048M
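If the job still fails at the same point after changing these arguments, it is worth verifying that the flags were actually applied to the job's JVM (in Talend they generally need to be set in the job's run configuration, not just for the Studio itself). A quick sanity check that can be run as a plain Java class:

```java
// Prints the heap limits the running JVM actually sees. If "max heap" is
// not ~4096 MB, the -Xmx argument was not applied to this process.
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap:  " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("total now: " + rt.totalMemory() / (1024 * 1024) + " MB");
        System.out.println("free now:  " + rt.freeMemory() / (1024 * 1024) + " MB");
    }
}
```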

 

As a result, only 9,500 of the 88,151 standard rows were processed before the job ended with the OutOfMemory exceptions mentioned above.

 

Can you advise or propose a solution? Thank you.


4 Replies
Anonymous
Not applicable

This isn't guaranteed to work, but it might help. You say that you cannot use temp directory storage because you are using a tXMLMap. Could you try the following:

1. Join your data in a tMap (using the temp directory storage).
2. Release the memory used by the tHash components by ticking "Clear cache after reading".
3. Filter your joined data set down to just the essential data, then output that to a new tHash.
4. In another subjob, build the XML with the tXMLMap, reading from the tHash.

 

EDIT: One more thing I just remembered (maybe try this first, before the other changes): set the "Custom the flush buffer size" option to something like 1000 rows (and experiment). Otherwise the whole data set ends up in memory before it is written to the file.
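To illustrate what that flush setting buys you, here is a plain-Java sketch (the file name and row contents are made up) of writing rows through a buffer and forcing it to disk every 1000 rows, instead of accumulating the whole output in memory first:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Write 2,500 demo rows, flushing the buffer every 1,000 rows so at most
// one batch of pending output is held in memory at a time.
public class FlushEveryN {
    public static void main(String[] args) throws IOException {
        int flushEvery = 1000;              // the value to experiment with
        Path out = Path.of("rows.txt");     // hypothetical output file
        try (BufferedWriter w = Files.newBufferedWriter(out)) {
            for (int row = 1; row <= 2500; row++) {
                w.write("row " + row);
                w.newLine();
                if (row % flushEvery == 0) {
                    w.flush();              // push this batch to disk now
                }
            }
        }                                   // try-with-resources flushes the tail
    }
}
```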

RA6
Creator
Author

Hello,

 

Thank you for your reply.

 

Can you please confirm whether working directly with CSV files is faster than using memory storage (tBuffer, tHash)?

 

If so, I will try doing the lookup using CSV files and storing them in the temp directory just before the tXMLMap.

 

Hope it works.

 

[screenshot: revised job design]

 

RA6
Creator
Author

I now get the following error while trying to convert the XML document to a String:

 

[screenshot: error message]

[screenshot: job design]

 

 

Can you advise or propose a solution, please?

Anonymous
Not applicable

Why are you converting the document to a String? Is this entirely necessary? This is going to be an expensive process (memory-wise), and if you can avoid it, it would be better.

In terms of CSV vs memory, memory is quicker. But you are struggling with memory at the moment, so solve that problem first then look at making it faster.
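One way to avoid ever holding the full document (or a String copy of it) in memory is to stream the XML out record by record. A minimal sketch using the JDK's built-in StAX writer; the element names and the three-row loop are placeholders for the real job's data:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamWriter;

// Stream each record straight to the output file as it is produced, instead
// of building the whole document in memory and converting it to a String.
public class StreamingXmlWriter {
    public static void main(String[] args) throws IOException, XMLStreamException {
        Path out = Path.of("standard.xml");   // hypothetical output path
        try (BufferedWriter bw = Files.newBufferedWriter(out)) {
            XMLStreamWriter xml =
                    XMLOutputFactory.newInstance().createXMLStreamWriter(bw);
            xml.writeStartDocument("UTF-8", "1.0");
            xml.writeStartElement("standards");
            for (int i = 1; i <= 3; i++) {    // stand-in for the real row loop
                xml.writeStartElement("standard");
                xml.writeAttribute("id", String.valueOf(i));
                xml.writeEndElement();
            }
            xml.writeEndElement();            // </standards>
            xml.writeEndDocument();
            xml.close();                      // flush remaining events
        }
    }
}
```

Memory use stays roughly constant no matter how many rows pass through, because each element is written and discarded as soon as it is emitted.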