Anonymous
Not applicable

OutOfMemoryError when using a huge XML file as source

Hi Shong,
I have an SDMX XML file as my source. Earlier the XML files were only a few KB in size and the jobs ran successfully. Now we have received a very large 1 GB XML file, and running the same job fails with the errors below.
Can you please suggest suitable workarounds and the steps involved?
Please help me, as this is very urgent.
Error 1:
===========
Starting job j500_ValidateLoad at 14:51 12/07/2012.
connecting to socket on port 3958
connected
Exception in thread "Thread-7" java.lang.OutOfMemoryError: Java heap space
at java.util.LinkedList.listIterator(LinkedList.java:667)
at java.util.AbstractList.listIterator(AbstractList.java:284)
at java.util.AbstractSequentialList.iterator(AbstractSequentialList.java:222)
at routines.system.RunStat.sendMessages(RunStat.java:248)
at routines.system.RunStat.run(RunStat.java:212)
at java.lang.Thread.run(Thread.java:619)
Exception in thread "main" java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:4042)
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.runJobInTOS(j500_ValidateLoad.java:5436)
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.main(j500_ValidateLoad.java:5113)
Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
disconnected
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2555)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJobInTOS(j100_Read_Files.java:2942)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJob(j100_Read_Files.java:2608)
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:3375)
... 2 more
Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:956)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2546)
... 5 more
Caused by: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:1030)
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJobInTOS(j102_Read_Dataset_XML.java:1498)
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJob(j102_Read_Dataset_XML.java:1282)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:895)
... 6 more
Caused by: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:2345)
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJobInTOS(j140_Import_Segments.java:2767)
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJob(j140_Import_Segments.java:2572)
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:762)
... 9 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at org.dom4j.DocumentFactory.createAttribute(DocumentFactory.java:156)
at org.dom4j.tree.AbstractElement.setAttributes(AbstractElement.java:549)
at org.dom4j.io.SAXContentHandler.addAttributes(SAXContentHandler.java:916)
at org.dom4j.io.SAXContentHandler.startElement(SAXContentHandler.java:249)
at org.apache.xerces.parsers.AbstractSAXParser.startElement(Unknown Source)
at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(Unknown Source)
at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanStartElement(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at org.dom4j.io.SAXReader.read(SAXReader.java:465)
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:1601)
... 12 more
Job j500_ValidateLoad ended at 14:58 12/07/2012.
Error 2:
==========
Starting job j500_ValidateLoad at 15:23 12/07/2012.
connecting to socket on port 3527
connected
Exception in thread "main" java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:4042)
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.runJobInTOS(j500_ValidateLoad.java:5436)
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.main(j500_ValidateLoad.java:5113)
Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
disconnected
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2555)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJobInTOS(j100_Read_Files.java:2942)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.runJob(j100_Read_Files.java:2608)
at datasetimportjobs.j500_validateload_6_0.j500_ValidateLoad.tOracleInput_1Process(j500_ValidateLoad.java:3375)
... 2 more
Caused by: java.lang.Error: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:956)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_1Process(j100_Read_Files.java:2546)
... 5 more
Caused by: java.lang.Error: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:1030)
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJobInTOS(j102_Read_Dataset_XML.java:1498)
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.runJob(j102_Read_Dataset_XML.java:1282)
at datasetimportjobs.j100_read_files_6_1.j100_Read_Files.tRunJob_2Process(j100_Read_Files.java:895)
... 6 more
Caused by: java.lang.Error: java.lang.OutOfMemoryError: Java heap space
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:2345)
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJobInTOS(j140_Import_Segments.java:2767)
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.runJob(j140_Import_Segments.java:2572)
at datasetimportjobs.j102_read_dataset_xml_0_6.j102_Read_Dataset_XML.tFileList_1Process(j102_Read_Dataset_XML.java:762)
... 9 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at org.dom4j.DocumentFactory.createAttribute(DocumentFactory.java:156)
at org.dom4j.tree.AbstractElement.setAttributes(AbstractElement.java:549)
at org.dom4j.io.SAXContentHandler.addAttributes(SAXContentHandler.java:916)
at org.dom4j.io.SAXContentHandler.startElement(SAXContentHandler.java:249)
at org.apache.xerces.parsers.AbstractSAXParser.startElement(Unknown Source)
at org.apache.xerces.parsers.AbstractXMLDocumentParser.emptyElement(Unknown Source)
at org.apache.xerces.impl.XMLNSDocumentScannerImpl.scanStartElement(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
at org.dom4j.io.SAXReader.read(SAXReader.java:465)
at datasetimportjobs.j140_import_segments_0_6.j140_Import_Segments.tFileInputXML_1Process(j140_Import_Segments.java:1601)
... 12 more
Exception in thread "Thread-7" java.lang.OutOfMemoryError: Java heap space
at java.util.LinkedList.listIterator(LinkedList.java:667)
at java.util.AbstractList.listIterator(AbstractList.java:284)
at java.util.AbstractSequentialList.iterator(AbstractSequentialList.java:222)
at routines.system.RunStat.sendMessages(RunStat.java:248)
at routines.system.RunStat.run(RunStat.java:212)
at java.lang.Thread.run(Thread.java:619)
Job j500_ValidateLoad ended at 15:25 12/07/2012.
Thanks,
KPK
15 Replies
Anonymous
Not applicable
Author

Hi KPK
This is a common problem.
How much memory does your machine have?
What you can do now is reduce the job's memory consumption and increase the JVM arguments.
If there is a tMap component in your job, you need to use its "Store on disk" feature.
You can increase the JVM arguments as shown in the following image.
Regards,
Pedro
Anonymous
Not applicable
Author

I have 10 child jobs called from one parent job, and all 10 jobs contain tMaps. Do I need to use "Store on disk" in all 10 jobs, or is it enough to use it in the parent job?
Anonymous
Not applicable
Author

Hi all,

It seems that you're using Dom4j. Try the SAX parser instead (Advanced settings). It could solve the problem for huge XML files, but it's slower.
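To give a feel for why SAX copes with huge files where Dom4j runs out of heap: SAX fires callbacks element by element and never builds the whole document tree in memory. A minimal standalone Java sketch (the `<Obs>` tag is just an illustrative SDMX-style name, not taken from the actual job):

```java
import java.io.StringReader;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class SaxCountDemo {
    // Counts <Obs> elements by reacting to start-element callbacks.
    // Memory use stays flat no matter how large the input is, because
    // no DOM tree is ever materialized.
    static int countObs(String xml) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        final int[] count = {0};
        parser.parse(new InputSource(new StringReader(xml)), new DefaultHandler() {
            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if ("Obs".equals(qName)) count[0]++;
            }
        });
        return count[0];
    }

    public static void main(String[] args) throws Exception {
        String sample = "<Series><Obs/><Obs/><Obs/></Series>";
        System.out.println(countObs(sample)); // prints 3
    }
}
```

In Talend itself you don't write this handler yourself; switching the component's generation mode to SAX makes the generated code stream the file in this style instead of loading it via Dom4j.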
regards
laurent
Anonymous
Not applicable
Author

Hi
I agree with Laurent.
Both of these features are meant to reduce JVM memory consumption.
Whether you use "Store on disk" in all child jobs depends on how you balance performance against memory consumption.
Regards,
Pedro
Anonymous
Not applicable
Author

Where can I find the SAX parser option? Can you give me detailed steps?
Anonymous
Not applicable
Author

It's under the Advanced settings of your XML input component.
It's just an option to choose from a drop-down list.
Anonymous
Not applicable
Author

I have made the suggested changes and run the job. Still no luck.
Can someone please help me?
Can we split the large XML into small chunks (multiple XML files)? How can we do that?
What would the impact be if we do that?
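Splitting is possible outside Talend with a streaming parser, so the whole file never sits in memory at once. A hedged sketch in plain Java using StAX; the flat `<Root>`/`<Record id="..."/>` structure is an illustrative assumption, not your SDMX schema, which would need namespace and nesting handling:

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class XmlChunker {
    // Streams through the input and emits one small XML document per
    // `chunkSize` <Record> elements. Only the current chunk is held in
    // memory; each chunk could be written to its own file and processed
    // by the existing job unchanged.
    static List<String> split(String xml, int chunkSize) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        List<String> chunks = new ArrayList<>();
        StringBuilder current = new StringBuilder("<Root>");
        int inChunk = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "Record".equals(r.getLocalName())) {
                current.append("<Record id=\"")
                       .append(r.getAttributeValue(null, "id"))
                       .append("\"/>");
                if (++inChunk == chunkSize) {
                    chunks.add(current.append("</Root>").toString());
                    current = new StringBuilder("<Root>");
                    inChunk = 0;
                }
            }
        }
        if (inChunk > 0) chunks.add(current.append("</Root>").toString());
        return chunks;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<Root><Record id=\"1\"/><Record id=\"2\"/>"
                   + "<Record id=\"3\"/></Root>";
        System.out.println(split(xml, 2).size()); // prints 2
    }
}
```

The main impact of splitting is operational rather than functional: the job must then iterate over many files (e.g. with tFileList), and any logic that assumes all records arrive in one run, such as cross-record validation, has to tolerate the data arriving in pieces.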
Anonymous
Not applicable
Author

As Pedro said, try changing the JVM parameters in the .bat or .sh launcher, depending on your OS:

java -Xms256M -Xmx1024M ... etc.

I can read some data from a 5 GB XML file with that configuration.
Hope this helps,
laurent
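One way to confirm the new -Xmx value actually reached the job's JVM (launchers sometimes override or ignore edited flags) is to print the heap ceiling from inside the job, e.g. from a tJava component. A standalone sketch:

```java
public class HeapCheck {
    // Runtime.maxMemory() reports the ceiling the JVM will grow the heap
    // to -- effectively the -Xmx value. If this prints ~256 when you set
    // -Xmx1024M, the edited launcher script is not the one being run.
    public static long maxHeapMb() {
        return Runtime.getRuntime().maxMemory() / (1024 * 1024);
    }

    public static void main(String[] args) {
        System.out.println("Max heap (MB): " + maxHeapMb());
    }
}
```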
Anonymous
Not applicable
Author

Can you please help me with the exact code?